Facebook Is Making Us Polarized And Predictable

We only read and share things we agree with—with other people who are exactly like us.

In the Internet's early years, the hope was that "the Net" would inform citizens as never before and lead to a better democracy. These days, that dream looks quaint. We're more polarized than ever, and the Internet isn't making us wiser; it's making us more predictable and extreme.

Consider new research on how people use Facebook. Led by Michela Del Vicario of the Laboratory of Computational Social Science in Italy, it shows how "echo chambers" of like-minded people reinforce existing beliefs and make Internet users more impervious to information that might challenge those beliefs. That, in turn, allows conspiracy theories to spread, like the recent Jade Helm 15 rumor, which had thousands of people believing that a military training exercise was actually the beginning of a government plot to round up and imprison its own citizens.

The research focused on two types of pages: a set of 32 peddling conspiracy theories (or "controversial information, often lacking supporting evidence") and a set of 35 offering science news (factual information, you might call it). Looking at activity between 2010 and 2014, the researchers analyzed whether 1.2 million Facebook users linked to the pages and how they came across the information in the first place.

In short, people are much more likely to share something that accords with what they already think, and they prefer stories that come from within their own peer group. "Our findings show that users mostly tend to select and share content related to a specific narrative and to ignore the rest," the paper says. "In particular, we show that social homogeneity is the primary driver of content diffusion, and one frequent result is the formation of homogeneous, polarized clusters."

The paper continues:

Users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarization. This comes at the expense of the quality of the information and leads to proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia.

Most of the time, this phenomenon isn't that important. Who cares, really, if a few cranks think America is being invaded? There have always been cranks. But you can also see how, over time, this narrowing of perspective could be corrosive, and how confirmation bias could become a serious threat to society (the World Economic Forum certainly counts it among its global risks).

Google is reportedly developing a "trustworthiness score" to rank search results not just by their popularity but by their veracity. And Facebook has proposed a community approach in which users can flag content that's clearly erroneous. But it's also up to us, as users, to tread more carefully online and to be more open to opposing views.
