In a talk at SXSW Sunday, two researchers discussed the influence of content-curation algorithms and their inherent biases. Peter Macdiarmid/Getty Images

AUSTIN, Texas -- You may never see this article because a computer didn’t think it was relevant to you. That prospect is a matter of great concern to two researchers who led a discussion here Sunday scrutinizing the growing influence of algorithms and the cultural biases -- both subtle and overt -- they sometimes reflect.

“It’s a scary prospect out there,” Karrie Karahalios, an associate professor at the University of Illinois at Urbana-Champaign, told an engaged crowd here at the South by Southwest Interactive festival.

Karahalios was joined on stage by Christian Sandvig, a research professor at the University of Michigan, for a panel discussion called “When Algorithms Attack,” which looked at the effect of content-curation algorithms used by tech giants like Facebook Inc., Google Inc. and Amazon.com Inc., and explored ways to better inform the public about their often-invisible influence.

The researchers cited a number of examples of algorithmic snafus, including Coca-Cola’s short-lived automated tweet campaign #MakeItHappy, which was suspended after a writer from Gawker tricked it into tweeting passages from Adolf Hitler’s “Mein Kampf.” They also cited research showing how Google’s search engine served ads for arrest records when fed African-American-sounding names like “Keisha,” while more traditionally white-sounding names did not trigger the same ads.

“It turns out, for certain kinds of queries, certain ads get clicked the most and so they’re shown the most,” said Sandvig. “And so the algorithm becomes kind of a racism accelerator.”
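Sandvig’s point can be illustrated with a toy model. The sketch below is hypothetical -- it is not Google’s ad system, and the ad names, click rate and random seed are invented -- but it shows how ranking purely by observed click-through rate lets a single early, biased click lock an ad into the top slot, even when both ads have identical underlying appeal.

import random

# Toy ad server (hypothetical, not Google's actual system): always serve the
# ad with the highest observed click-through rate (CTR). One early biased
# click gives the "arrest-record ad" a head start it never loses, because
# being shown more is the only way to earn more clicks.
random.seed(42)

ads = {
    "arrest-record ad": {"shows": 1, "clicks": 1},  # one early, biased click
    "neutral ad":       {"shows": 1, "clicks": 0},
}

def ctr(stats):
    return stats["clicks"] / stats["shows"]

for _ in range(10_000):
    # Serve whichever ad currently has the higher observed CTR.
    shown = max(ads, key=lambda name: ctr(ads[name]))
    ads[shown]["shows"] += 1
    # Both ads have the same true 5 percent appeal; clicks are random.
    if random.random() < 0.05:
        ads[shown]["clicks"] += 1

for name, stats in ads.items():
    print(f"{name}: shown {stats['shows']:,} times, CTR {ctr(stats):.3f}")

Running the simulation shows the neutral ad is never served again after the first round: its observed CTR stays at zero, so the feedback loop -- clicked the most, therefore shown the most -- preserves and amplifies the initial bias.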

Karrie Karahalios and Christian Sandvig lead a discussion at South by Southwest Interactive Sunday, March 15, 2015, on the influence of algorithms used by websites like Facebook, Google and Amazon. Christopher Zara/International Business Times

But it was Facebook’s notoriously cryptic news feed -- a platform often criticized for its heavy curation -- that figured most prominently in the discussion. Sandvig and Karahalios noted their recently published research on Facebook users’ habits and how those habits changed once users became aware that an algorithm was determining which posts they saw.

For a study led by Sandvig and Karahalios in 2013, participants got a glimpse into something most Facebook users will never see -- an unfiltered news feed. Using a tool called FeedVis, which the researchers built themselves, participants saw every piece of content posted by all of their Facebook friends and were able to compare the unfiltered feed with an algorithmically curated feed.
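FeedVis itself is the researchers’ own code and isn’t reproduced here, but the comparison at its core is straightforward to sketch. In the hypothetical snippet below, the Post type, field names and sample data are all invented for illustration -- this is not Facebook’s API -- and the function simply reports which friends’ posts the curated feed never displayed.

from dataclasses import dataclass

@dataclass(frozen=True)
class Post:
    post_id: str
    author: str

def hidden_posts(all_posts: list[Post], curated: list[Post]) -> list[Post]:
    """Return friends' posts that never appeared in the curated feed."""
    shown_ids = {p.post_id for p in curated}
    return [p for p in all_posts if p.post_id not in shown_ids]

# Example: friends made three posts, but the curated feed showed only one.
everything = [Post("1", "alice"), Post("2", "bob"), Post("3", "carol")]
feed       = [Post("2", "bob")]

for post in hidden_posts(everything, feed):
    print(f"Filtered out: post {post.post_id} by {post.author}")

The point of such a side-by-side view is exactly what the study exploited: making the algorithm’s otherwise invisible omissions concrete enough for participants to react to.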

The results were mixed, Karahalios said, depending on whether the participants were aware of the algorithm before the study. In fact, 62.5 percent of the participants had no idea a news feed algorithm even existed. Among those, many reacted with anger and frustration when they realized that Facebook was curating posts from friends or family. One participant said learning about the algorithm felt like being in “The Matrix,” a reference to the popular sci-fi film in which reality turns out to be a computer-generated fabrication.

Visceral Reactions

“Their initial reactions were quite visceral,” Karahalios said.

But when the researchers followed up with the subjects months later, Karahalios said, some had changed certain Facebook behaviors. “People started being more careful about what they liked and what they commented on,” Karahalios said. “People rightly figured out that comments had more impact than likes.”

So how did Facebook react when it caught wind of the researchers’ FeedVis tool?

“They asked for the code,” Karahalios said. “Maybe they wanted to make sure we followed their terms of service.” She added that Facebook “toward the end had a very open dialogue with us.”

Sunday’s crowd was a diverse mix of computer programmers, advertising executives, journalists and marketers -- a sign that few of us these days are untouched by algorithmic tentacles. Everyone seemed to agree that algorithms are necessary in a digital world, but an undercurrent of the discussion was a call for more accountability from the companies that program them.

Sandvig spoke of the possible need for a third-party “watchdog” -- not a government entity, necessarily, but perhaps an oversight committee, trade group or nonprofit that would help to ensure proprietary algorithms meet certain standards. “In a number of domains we have complicated systems that are technical and social, and we use experts to manage them,” Sandvig said. “I think that’s a model for it.”

Christopher Zara is a senior writer who covers media and culture. Follow me on Twitter @christopherzara.