The ACLU Is Suing For The Right To Uncover Online Discrimination

Many companies' terms of service forbid using the tools and techniques researchers need to find out if supposedly impartial platforms and algorithms actually have implicit biases.

[Illustration: cepera via Shutterstock]

In December, three Harvard researchers released a paper that showed the vacation rental site Airbnb can be a racist place, where people with black-sounding names like Lakisha or Rasheed have a harder time getting a booking than a Brent or Kristen.

How did they uncover this bias? They set up 20 fake profiles, sent housing requests to more than 6,000 hosts around the country, and simply tallied the replies.
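At its core, the analysis behind an audit like this is a simple tally: group the replies by the name group of the fake guest profile and compare acceptance rates. Here is a minimal sketch of that step in Python; the record structure and field names are hypothetical, not taken from the study itself.

```python
from collections import Counter

# Hypothetical audit records: each booking request logs which name
# group the fake guest profile belonged to and whether the host
# accepted. (Field names are illustrative, not from the Harvard study.)
requests = [
    {"name_group": "black-sounding", "accepted": False},
    {"name_group": "white-sounding", "accepted": True},
    # ... thousands more records in a real audit
]

sent = Counter(r["name_group"] for r in requests)
accepted = Counter(r["name_group"] for r in requests if r["accepted"])

# Compare acceptance rates across name groups.
for group in sent:
    rate = accepted[group] / sent[group]
    print(f"{group}: {accepted[group]}/{sent[group]} accepted ({rate:.1%})")
```

Because every profile is otherwise identical, a persistent gap in those rates points at the name itself, which is what makes the comparison direct.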

What the researchers found was something that Airbnb users of color may have had a hunch about before. But a hunch is hard to prove when you can only know your own experience; it took a planned research study that allowed direct comparisons to ferret out the pattern. Now Airbnb's CEO has vowed to fight racism on the platform.

As almost all of our important life transactions move online, studies like this have great value, especially as algorithms and software make more automated decisions that, unlike the Airbnb example, can be biased in ways that are harder to examine. That matters most in areas like housing and employment, which are governed by federal laws banning racial and other kinds of discrimination on the part of employers or developers.

But many studies in which outsiders set up fake profiles or automatically extract data from a site are technically against the law, and the ACLU is now suing on behalf of several academic researchers and journalists to make these "audits" legal.

The broad-ranging federal computer crimes law, the Computer Fraud and Abuse Act (CFAA), makes it a crime to access a computer "without authorization," language that has been interpreted to cover violating a website's "terms of service": all the fine print that you probably don't read but agree to when you sign up for a site and check the box. Many sites, including Airbnb and hiring sites like LinkedIn, Glassdoor, and Monster, ban the creation of multiple or fake accounts in their terms of service. Many others, including real estate sites like Zillow, Trulia, and Redfin, also ban the automated collection of data, a practice called "scraping." What's more, most sites reserve the right to change their terms of service at any time.
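For readers unfamiliar with the term, "scraping" just means fetching pages with a program and pulling out the fields of interest. A minimal sketch in Python, using the requests and BeautifulSoup libraries, might look like the following; the URL and CSS selector are placeholders, and on many real sites this kind of automated access is exactly what the terms of service forbid.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL and selector; real sites' markup varies, and many
# real estate sites' terms of service prohibit automated collection.
URL = "https://example.com/listings"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
prices = [tag.get_text(strip=True) for tag in soup.select(".listing-price")]
print(prices)
```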

The ACLU, representing four researchers and First Look Media (publisher of The Intercept), is suing to overturn this provision of the law, arguing that it violates the First Amendment by preventing people from gathering the publicly available information needed to root out bias and discrimination.

"A lot of that research is being chilled because people are fearful of violating the CFAA," says ACLU attorney Esha Bhandari.

There are many ways discrimination can occur online, and it's often not as obvious as an Airbnb host rejecting a guest with a black-sounding name. Cookies that track people online allow websites to target different advertisements, content, or prices based on the demographic "categories" they fall into, and there is some limited evidence that this does happen. Algorithms built on data that reflects past patterns of discrimination can perpetuate them: people who live in a certain zip code that houses a minority community, for example, might pay more for online car insurance. The ACLU notes one study that found Google searches for black-sounding names are more likely to pull up ads for criminal records, and another that found men were shown higher-paying job ads than women.
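To make the car insurance example concrete, here is a minimal sketch of the comparison an algorithm audit might run after collecting quotes for driver profiles that are identical except for zip code. All of the zip codes and dollar figures below are invented for illustration.

```python
from statistics import mean

# Hypothetical quotes gathered for otherwise-identical driver profiles
# that differ only by zip code. All numbers are made up.
quotes = {
    "60619": [1450, 1490, 1510],  # zip code in a minority community
    "60614": [1180, 1210, 1195],  # nearby zip code, similar driving risk
}

for zip_code, prices in quotes.items():
    print(f"{zip_code}: mean quote ${mean(prices):,.2f}")

# A consistent gap for otherwise-identical profiles is the kind of
# signal an audit would flag for closer statistical scrutiny.
```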

Bhandari says there haven't been actual prosecutions under the CFAA of researchers investigating algorithmic bias, though the law has been used to prosecute terms-of-service violations committed for other reasons. But the plaintiffs say the mere threat is enough.

"I can't speak for other researchers, but we certainly feel that the threat exists, and it does impact what we choose to research, and how we choose to conduct our studies," says Northeastern University computer researcher Christo Wilson, one of the plaintiffs who conducts "algorithm audits" of sites like Amazon and Uber as part of his research.

"I have heard of papers and grant proposals getting rejected in the past due to CFAA concerns. Reviewers sometimes feel that any research methodology that violates ToS [terms of service] must be rejected due to CFAA issues."
