Spark

Designing tech for the most vulnerable users leads to better products for all, says researcher

Consumer technology must be designed with the needs of its most vulnerable users in mind, says human rights researcher Afsaneh Rigot.

Marginalized users can offer designers insight into the main safety and security loopholes in their tech

Unintended design flaws can pose risks for certain communities of users, some experts say. (GaudiLab/Shutterstock)


For people living under political instability — protesters, activists, refugees — access to messaging apps and social media is a lifeline. And for the LGBTQ+ community in these contexts, where meeting in person may neither be safe nor possible, these applications play a vital role in forging connections.

But these same tools can also be used against them, said Rigot, who works for human rights group Article 19. For nearly a decade, she has been looking at how technology affects vulnerable communities in the Middle East and North Africa.

"We're seeing a trend throughout the years where policing actors and state actors are using the same technologies and weaponizing these hubs of congregational connection to identify people," Rigot told Spark host Nora Young.

Weaponizing tools for connection

She says dating apps like Hornet and Grindr have become traps in places like Egypt and Lebanon. "In these arbitrary eyes, where something like your identity is criminalized, or seen as a criminal act, solely being or having these apps is seen as a crime in itself."

During searches at security checkpoints, Rigot says "the device itself becomes the crime scene." When it comes to evidence, nothing is off limits — from conversations on WhatsApp, Telegram and social media, to chat logs of text messages, photo galleries, or even the names of contacts stored on a phone. And according to the research, the Grindr logo itself was creating risk for users.

Having certain apps on your phone in places like Egypt and Lebanon can create risk for users, according to researcher Afsaneh Rigot. (Leon Neal/Getty Images)

Rigot has been working with designers on solutions that include the discreet app icon feature on Grindr, which was rolled out in 2020. The security feature allows users to make the app look like a calendar, calculator or to-do list — things that won't create any kind of suspicion.

This is an example of what she refers to as "design from the margins," which is also the name of a report she authored in 2022 that outlines how to centre "the most marginalized and impacted in design processes."

"We need to start designing our tech, whether it's from features, changes or a whole new technology or platform, with a grounding point of those who are most impacted," said Rigot.

She says that while these instances are often called "edge cases" or outliers, they offer designers important insight into how their applications can go wrong or be compromised. 

Accessible assistive technologies

Prioritizing the needs of people "at the margins" is especially necessary when designing technologies meant to serve them, says Chancey Fleet, a library-based technology educator for blind, low-vision and print-disabled people in New York.

Assistive technologies have come a long way over the last decade. 

Fleet says that thanks to light detection and ranging (LIDAR), a remote sensing technology that uses light to measure distances, apps have moved beyond offering general descriptions of users' surroundings. Users can now touch different parts of their device's screen to find where specific objects are located.
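The principle LIDAR relies on can be shown in a few lines. This is an illustrative sketch of time-of-flight ranging only, not how any particular app or sensor is implemented: a light pulse is emitted, its round-trip time is measured, and distance follows from the speed of light.

```python
# Sketch of the time-of-flight principle behind LIDAR ranging.
# Illustrative only: real LIDAR hardware and the apps built on it
# involve scanning, noise filtering and sensor fusion.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Estimate distance to a surface from a pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half the total path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse returning after about 20 nanoseconds implies a surface
# roughly three metres away.
print(round(distance_from_round_trip(20e-9), 2))  # → 3.0
```

Because light covers about 30 centimetres per nanosecond, the timing must be extremely precise, which is part of why LIDAR sensors only recently became cheap enough for consumer phones.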

Chancey Fleet is a library-based technology educator for blind, low-vision and print-disabled people. (Data & Society)

There are numerous computer plugins and mobile apps in the consumer market that aim to help people with disabilities navigate the physical and digital world, but only some of them are well designed.

"Half the battle is learning to use the assistive technology and another half of the battle is accessing the technology that would actually help you the best," said Fleet, who identifies as blind.

Human in the loop technology

While tools that do automated accessibility checking on web pages are helpful, says Fleet, a lot of these technologies are missing a "human in the loop." 

"It takes a computer to notice that an image is missing alt text, it takes a human to know that an image of a car stopped in a road is not an adequate description for the aftermath of the Kennedy assassination." 
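The automated half of the check Fleet describes, noticing that an image is missing alt text, is mechanical enough to sketch in a few lines. This is a simplified illustration using Python's standard HTML parser, not the behaviour of any particular accessibility tool; judging whether existing alt text is actually adequate is the part that still needs a human.

```python
# Minimal sketch of an automated accessibility check: flag <img> tags
# whose alt text is missing or empty. Real checkers cover far more
# (contrast, labels, focus order), and none of them can judge whether
# the alt text meaningfully describes the image.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):  # absent or empty string
                self.missing_alt.append(attr_map.get("src", "(no src)"))

page = '<img src="car.jpg"><img src="dog.jpg" alt="A dog in a park">'
checker = AltTextChecker()
checker.feed(page)
print(checker.missing_alt)  # → ['car.jpg']
```

A script like this can tell you an image has no description at all; it cannot tell you that "a car stopped in a road" is the wrong description for a historically significant photograph.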

But having just any human in the loop doesn't solve the problem. Fleet coined the term "ghost-written" code in reference to the dark patterns baked into accessibility technologies, which create unintended barriers and friction for users.

She says this happens when the people who use the tool aren't among those who write the code.

When someone is hired to design an assistive technology, Fleet says, "they are there to be my voice in the room. They're there to be the voice of a blind or disabled or assistive technology-using person. But they don't have the lived experience. They've done some studying. They've done some research. They are the person in the room, they are the person designing or encoding the experience. And they're making a decision for me that often is not in my interest."

Fleet says rigorous user testing is "the absolute lowest bar that we need to set."

"If we want to benefit from the deep knowledge people with disabilities have about innovation, about the unexpected directions technology takes us in, we really need our folks positioned at every level within the development, design industry."
