How Does the Facebook Algorithm Help Shape Our Opinions?

How diverse are the viewpoints you see in your Facebook feed? When was the last time you read something there that seriously clashed with your worldview and made you feel uncomfortable, even upset?

If it’s been a while, that’s by design.

Facebook makes its billions of dollars by advertising to its nearly 2 billion users as they scroll through the site, said Claire Woodcock, a digital strategist for Razorfish London who gave a talk on social media algorithms at the South by Southwest Interactive conference in Austin, Texas.

Give the people what they want

At South by Southwest in Austin, Texas, Claire Woodcock of Razorfish London explains how Facebook’s ethos, and its platform, has changed over time. Photo by Katie Moritz.

Because its financial success is tied to advertising, Facebook has an incentive to keep you from closing the tab, Woodcock said. The longer you stay on the site, the more advertisements you see and the more valuable the site is as an advertising platform.

That seems like common sense, right? But to keep us from clicking away, Facebook learns what we like and what we don’t like.

Woodcock gave this example: She’s a dog person, she said, and she definitely prefers dogs to cats. When she sees a photo of a dog—specifically a pug dressed in a cute costume—she clicks “Like.” Then she keeps scrolling. Seeing that photo has released happiness chemicals in her brain and she wants more.

“That’s why we’re so addicted to social media,” Woodcock said. “We just keep scrolling for another hit of feeling good.”

She scrolls and scrolls, liking more posts along the way. But then comes a photo of a hissing cat. Yikes. She’s snapped out of her social media happy place and suddenly remembers all the work she needs to get done that day. She closes the browser window. And Facebook takes note: Pugs yes, cats no.
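To make the pugs-versus-cats example concrete, here is a minimal, purely hypothetical sketch in Python of how engagement signals could feed a topic-preference score. Every function name and weight is invented for illustration; Facebook’s real feed ranking is proprietary and far more sophisticated than this.

```python
from collections import defaultdict

# Illustrative only: a "like" nudges a topic's score up, while a post that
# appears to end the session nudges it down. The weights are made up.
topic_scores = defaultdict(float)

def record_like(topics):
    """User liked a post tagged with these topics."""
    for topic in topics:
        topic_scores[topic] += 1.0

def record_session_end(topics):
    """User closed the app right after seeing a post with these topics."""
    for topic in topics:
        topic_scores[topic] -= 2.0  # treat this as a stronger negative signal

# Woodcock's example: pugs in costumes get a like, a hissing cat ends the session.
record_like(["dogs", "pugs", "costumes"])
record_session_end(["cats"])

print(sorted(topic_scores.items(), key=lambda kv: -kv[1]))
# [('dogs', 1.0), ('pugs', 1.0), ('costumes', 1.0), ('cats', -2.0)]
```

Run enough of these signals through a scorer like this and the feed quickly learns which topics keep a user scrolling and which ones send them away.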

When put in the context of pugs in costumes, it seems harmless enough. What’s wrong with Facebook showing us the stuff we want to see? Unfortunately, this algorithm goes beyond pet photos—“most importantly (it) includes our views and values,” Woodcock said. It’s responsible for an “echo chamber effect” that’s keeping us insulated from other political viewpoints, causing a growing chasm between political ideologies.

It’s been studied: when our opinions are contradicted—whether in a real-life argument or in our news feeds—the logical part of our brain turns off and fight-or-flight kicks in, Woodcock said. We go into argument mode or we run away. On Facebook, that often means logging off or unfollowing the offending friend.

“So in social media, we exist in a world as we want to experience it,” she said. “Everybody agrees with us, and if someone disagrees with them, we simply un-friend them, right? … Over time we start to feel really self-important.”

This echo chamber can also lead to big political surprises. She gave the example of the U.K.’s vote to leave the European Union.

“I’m from London; I really wanted to stay (in the European Union),” Woodcock said. “All of my friends in London were voting to stay. Everyone I could see in social media were voting to stay, and when we woke up to the results it was really hard, because we had no idea that that sentiment of opinion was out there. And I really wish that I’d have known before the vote happened what the other side’s point of view was, because it’s been a long, hard process to come to terms with what we’re going to do, the fact that we’re leaving Europe.”

How can we escape the echo chamber?

Facebook CEO Mark Zuckerberg wrote in an open letter he published last month that he wants to fix the platform’s political bubble problem and get users on common ground. So, how can Facebook get people engaged with things they don’t agree with to expand their understanding of the world and end the troubling echo chamber effect?

“If (Zuckerberg) were here today, I would tell him this,” Woodcock said. “Echo chambers are a horribly difficult problem and they can’t be solved by one entity with one solution. There is no magic bullet. We need to use a combination of solutions to realize his dream of a global community.”

She suggested a three-pronged approach, one that borrows from a solution Netflix once explored for a similar echo-chamber issue in its recommendation feature. (Netflix offered a $1 million prize for the best fix but never ended up implementing it, partly because its business shifted from primarily DVD rental to primarily online streaming.)

Netflix saw a problem in its users’ behavior: they largely watched the same popular movies and TV shows (like “Friends”) over and over and rarely tried anything new. The solution: an algorithm that would identify attributes of the shows and movies a viewer likes (humor, strong female lead, horror, iconic soundtrack, etc.) and find those same attributes in shows and movies the viewer has never seen. Those unwatched titles would then be recommended. So if you love the “James Bond” franchise (strong male lead, themes of good vs. evil, action) and you haven’t seen “Star Wars,” Netflix would make sure you heard about it.
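Here is a toy sketch of that attribute-matching idea, with invented titles and tags. It is not how Netflix’s recommender actually works; it only illustrates the “shared attributes of things you already like” logic Woodcock described.

```python
# Invented catalog: each title maps to a set of descriptive attributes.
CATALOG = {
    "Friends":    {"humor", "ensemble cast"},
    "James Bond": {"strong male lead", "good vs. evil", "action"},
    "Star Wars":  {"strong male lead", "good vs. evil", "action", "iconic soundtrack"},
}

def recommend(watched, catalog):
    """Rank unwatched titles by how many attributes they share with watched ones."""
    liked_attributes = set()
    for title in watched:
        liked_attributes |= catalog[title]

    unwatched = [t for t in catalog if t not in watched]
    return sorted(unwatched,
                  key=lambda t: len(catalog[t] & liked_attributes),
                  reverse=True)

print(recommend({"James Bond"}, CATALOG))
# ['Star Wars', 'Friends'] -- Star Wars shares three attributes, Friends none
```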

Woodcock said the articles Facebook shows us should be curated in a similar way. She gave the Brexit vote as an example. As we know, she saw almost exclusively anti-Brexit articles in her feed. But what if Facebook used the information it already has about what she’s interested in (pugs, cars, anti-Brexit content, the Wall Street Journal) to identify something it could add to her newsfeed that would match those interests but also expand her worldview? For example, offering up a Wall Street Journal article (which she’s already into) written by a pro-Brexit economist.

A pro-Brexit voter could be shown the same article because of his interests, even though he doesn’t typically read the Wall Street Journal. So they’re both engaging with the same content, albeit for different reasons.
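As a rough sketch of that “bubble burster” idea, assume two users with invented interest profiles and a couple of made-up articles; the goal is simply to surface a piece that overlaps both profiles, so each person has their own reason to read it.

```python
# All profiles, titles and tags are invented for illustration.
anti_brexit_reader = {"pugs", "wall street journal", "anti-brexit"}
pro_brexit_reader  = {"economics", "pro-brexit", "football"}

articles = {
    "WSJ: a pro-Brexit economist's case": {"wall street journal", "economics", "pro-brexit"},
    "Pug costume roundup": {"pugs", "costumes"},
}

def bubble_bursters(user_a, user_b, articles):
    """Return articles that match at least one interest of *both* users."""
    return [title for title, tags in articles.items()
            if tags & user_a and tags & user_b]

print(bubble_bursters(anti_brexit_reader, pro_brexit_reader, articles))
# ["WSJ: a pro-Brexit economist's case"]
```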

“It’s expanding our opinions wider than our own echo chambers but we’re both going to interact with it, and that is why it is a bubble burster,” Woodcock said.

When we’re eased into an alternative opinion in this way, it’s less likely to trigger our illogical fight-or-flight mode, Woodcock said.

“We need to keep people on the social platform in order to resolve the echo chamber effect,” she said.

The second of the three prongs Woodcock suggests is Facebook’s own fake-news fact-checking initiative. In December, the social media site began rolling out a partnership with third-party media companies enlisted to fact-check articles posted to Facebook that users had tagged as fake news. If the fact-checker decides the news is fake, it’s labeled as “disputed by third parties.” If you try to share the article, you’re reminded again that it’s disputed. The articles are also deprioritized in Facebook’s algorithm.
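That workflow boils down to a few steps: users flag an article, a third-party checker reviews it, and a “disputed” verdict triggers a label, a warning on share and a lower ranking weight. The Python below is an invented illustration of those steps, not Facebook’s actual implementation; the class, field names and weighting factor are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    flagged_by_users: bool = False
    disputed: bool = False
    ranking_weight: float = 1.0

def fact_check(article: Article, checker_says_false: bool) -> None:
    """Apply a third-party fact-check verdict to a user-flagged article."""
    if article.flagged_by_users and checker_says_false:
        article.disputed = True        # label: "disputed by third parties"
        article.ranking_weight *= 0.2  # deprioritize in the feed (made-up factor)

def share(article: Article) -> str:
    """Warn the user before they share a disputed article."""
    if article.disputed:
        return f"Warning: '{article.title}' is disputed by third-party fact-checkers. Share anyway?"
    return f"Shared '{article.title}'."

story = Article("Moon made of cheese", flagged_by_users=True)
fact_check(story, checker_says_false=True)
print(share(story))
```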

The third prong is a set of strong ethical guidelines that include users in the solution, Woodcock said.

“We should be working with Facebook as a society to solve this problem,” she said. “It is too important for a single entity to be making these kinds of decisions for us. Right now Facebook is trying to tackle this problem alone, without working with us, without deciding as a collective what is best for us.”

Katie Moritz

Katie Moritz is Rewire’s web editor and a Pisces who enjoys thrift stores, rock concerts and pho. She covered politics for a newspaper in Juneau, Alaska, before driving down to balmy Minnesota to help produce long-standing public affairs show “Almanac” at Twin Cities PBS. Now she works on this here website. Reach her via email at kmoritz@tpt.org. Follow her on Twitter @katecmoritz and on Instagram @yepilikeit.
