
The FTC Reminds Businesses That Big Data Can Be Full Of Bad Biases

The data companies gather doesn't necessarily reflect reality, and they have to be careful about the decisions they make using it.

[Illustrations: Pi-Lens via Shutterstock]

Imagine you want to increase your credit limit, but you’re denied by your credit card company even though you pay your bills on time. In this age of predictive data, there might be any number of hidden reasons at play. Maybe you charged marriage counseling services or even got a tire repair—and your credit company has noticed that people who make similar purchases pose a greater credit risk.

Sorry, it’s a world of big data, and we all are just living in it.

This is one scenario that really happened, as the U.S. Federal Trade Commission outlines in a new report that seeks to advise companies on how to use big data ethically. The report, Big Data: A Tool For Inclusion or Exclusion?, specifically focuses on both the risks and benefits of the commercial use of big data about low-income and underserved consumers.

On the one hand, big data can actually help extend services that benefit the poor and underserved, whether in health, education, finances, or job opportunities. For example, a big data analysis might find someone who deserves a mortgage, even if he has no credit history, or a job, even if she didn’t complete college.

But uses of big data can also run afoul of existing laws meant to prevent, say, employment, credit, or housing discrimination. For example, if a credit company makes decisions based on its analysis of consumers’ zip codes, that can violate the Equal Credit Opportunity Act if it results in discrimination against a particular ethnic group (and can’t be justified by a "legitimate business necessity"). Big data can also help unscrupulous companies target vulnerable people, such as those more likely to be taken in by a marketing scheme. And some companies may be charging different prices based on your browsing patterns.

The FTC’s report seeks simply to raise awareness about these issues. It also provides helpful questions businesses should ask themselves to avoid breaking the law or simply being unfair or unethical:

1. How representative is the data set?

An initial data set may exclude certain populations or information—for example, if a survey or marketing offer is made on social media, that excludes many people who aren’t as tech-savvy or who don’t have frequent Internet access.
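
To make the idea concrete, here is a minimal sketch of the kind of representativeness check a company could run in Python with pandas. The column name, age groups, and population benchmarks are entirely hypothetical, not from the FTC report.

```python
import pandas as pd

# Hypothetical survey responses; the column, groups, and benchmark
# shares below are illustrative, not from the FTC report.
responses = pd.DataFrame({
    "age_group": ["18-29", "18-29", "30-49", "18-29", "50+", "30-49"],
})

# Assumed population shares (e.g., taken from census data).
population_share = {"18-29": 0.20, "30-49": 0.35, "50+": 0.45}

sample_share = responses["age_group"].value_counts(normalize=True)
for group, expected in population_share.items():
    observed = sample_share.get(group, 0.0)
    # Flag any group whose sample share falls well below its
    # share of the population (threshold is illustrative).
    flag = "underrepresented" if observed < 0.5 * expected else "ok"
    print(f"{group}: sample {observed:.0%} vs. population {expected:.0%} -> {flag}")
```

A social-media-only survey would fail exactly this kind of check for older or less-connected groups.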

2. Does the data model account for biases?

Companies should try to overcome biases that they identify in their data or analysis. Perhaps a company hiring for a job only considers graduates from certain colleges—that may incorporate previous biases in the college admissions process. (To be fair, employers do this without big data all the time.)
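
One rough way to surface that kind of bias is to compare outcome rates across groups—for instance against the EEOC’s "four-fifths" rule of thumb. This sketch uses pandas and made-up hiring data; the threshold and group labels are illustrative only.

```python
import pandas as pd

# Hypothetical hiring-model outcomes; the data and the 80% threshold
# (the EEOC four-fifths rule of thumb) are used purely for illustration.
outcomes = pd.DataFrame({
    "group": ["A", "A", "A", "B", "B", "B", "B", "B"],
    "hired": [1,   1,   0,   1,   0,   0,   0,   1],
})

# Selection rate for each group, then the ratio of lowest to highest.
rates = outcomes.groupby("group")["hired"].mean()
ratio = rates.min() / rates.max()

print(rates)
print(f"selection-rate ratio: {ratio:.2f}"
      + (" -> possible disparate impact" if ratio < 0.8 else ""))
```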

3. How accurate are predictions based on big data?

According to the report, "Companies should remember that while big data is very good at detecting correlations, it does not explain which correlations are meaningful." An example is Google Flu Trends—it was initially accurate at predicting flu outbreaks based on web searches for "flu." But it grew less accurate over time, because it left out certain considerations, like the fact that flu searches in an area can also spike based on news stories about faraway outbreaks.
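
The Google Flu Trends failure is, at bottom, a lesson about shared trends masquerading as meaningful correlation. This small NumPy sketch, using synthetic data, shows how two otherwise unrelated series can correlate strongly just because both drift upward—and how looking at week-to-week changes exposes that.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic series that merely share an upward trend -- say, weekly
# "flu" searches and weekly news coverage of a distant outbreak.
weeks = np.arange(52)
flu_searches = weeks * 2.0 + rng.normal(0, 5, 52)
news_stories = weeks * 1.5 + rng.normal(0, 5, 52)

# The shared trend inflates the raw correlation toward 1.0.
print(f"raw correlation: {np.corrcoef(flu_searches, news_stories)[0, 1]:.2f}")

# Correlating week-to-week changes removes the common trend and
# reveals that the series are otherwise unrelated (near 0).
detrended = np.corrcoef(np.diff(flu_searches), np.diff(news_stories))[0, 1]
print(f"detrended correlation: {detrended:.2f}")
```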

4. Does a company’s reliance on big data raise ethical or fairness concerns?

After a big data analysis is made, companies should try to balance the results with fairness considerations. "For example, one company determined that employees who live closer to their jobs stay at these jobs longer than those who live farther away. However, another company decided to exclude this factor from its hiring algorithm because of concerns about racial discrimination," the report notes.
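
The report’s example amounts to a proxy check: a feature that predicts the outcome but also encodes a protected attribute. A minimal pandas sketch of such a check, with made-up numbers and an illustrative threshold, might look like this.

```python
import pandas as pd

# Hypothetical applicant data echoing the report's example: distance to
# work predicts tenure, but in this synthetic sample it also tracks a
# protected attribute, making it a potential proxy.
applicants = pd.DataFrame({
    "commute_miles": [2, 3, 15, 18, 4, 20],
    "years_tenure":  [5, 6, 2,  1,  4, 2],
    "group":         ["A", "A", "B", "B", "A", "B"],
})

# Proxy check: how strongly does the feature encode group membership?
is_group_b = (applicants["group"] == "B").astype(float)
proxy_corr = applicants["commute_miles"].corr(is_group_b)
print(f"correlation with protected group: {proxy_corr:.2f}")

# If the feature is a strong proxy, drop it despite its predictive
# value -- as the second company in the report's example chose to do.
features = ["commute_miles", "years_tenure"]
if abs(proxy_corr) > 0.5:  # illustrative threshold
    features.remove("commute_miles")
print("features kept for the model:", features)
```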

The report is worth a read. All of the FTC Commissioners voted in support of the text, though Commissioner Maureen K. Ohlhausen wrote a separate statement warning against giving "undue credence to hypothetical harms": "We risk distracting ourselves from genuine harms and discouraging the development of the very tools that promise new benefits to low-income, disadvantaged, and vulnerable people."