Ashkan Soltani, who became chief technologist of the Federal Trade Commission in October, has made algorithmic transparency one of his stated goals. The agency is hiring researchers to test for biases in automated online services. ashkansoltani.org

In 2012, the Wall Street Journal published the results of an exhaustive investigation into the online pricing practices of Staples Inc. As it turned out, what customers paid for office equipment depended, in part, on where they lived. An algorithm built into the Staples website, the investigation revealed, was inadvertently giving discounted prices to customers who lived closer to rival stores -- and, in effect, charging more to those who lived farther away.

For many who read the Journal’s much-cited story, it was a flashbulb moment: Here was a crystal-clear example of how an automated computer system can have a direct effect on our lives -- and wallets. One of the reporters who worked on the investigation was Ashkan Soltani, a privacy and security researcher who went on to work for the Washington Post, where he was part of the Pulitzer Prize-winning team that reported on the National Security Agency’s surveillance activities.

These days Soltani is still concerned with algorithms and the power they wield, but he’s tackling the issue in a more public capacity. In October he was named chief technologist of the Federal Trade Commission, where he has made “algorithmic transparency” one of his stated goals. He admits it’s a concept that makes some companies nervous -- proprietary algorithms are the lifeblood of Silicon Valley -- but he says he’s not out to pull back the tech industry’s magic curtain. He just wants to look under the hood.

Soltani spoke with International Business Times from Washington this week, expounding on the need to learn more about, and correct, the hidden biases in computer algorithms.

International Business Times: What is “algorithmic transparency” and why is it important?

Ashkan Soltani: Whether we know it or not, we’re interacting with algorithms every day -- as we browse the Web and see ads pop up, as we search for things, as we communicate with our friends on a social network. When we use dating sites to find matches, some programmer decided how to code the algorithm -- which information to show and which to withhold. And there are two issues with the fact that we’re interacting with these algorithms: one, people might not know it, and two, it’s possible for the algorithms to be biased in some way.

IBTimes: Tech companies generally say their algorithms are proprietary and they can’t reveal how they work, because that would be giving away trade secrets. How do you respond to that argument?

Soltani: It’s not a case of having to hand over your source code or your secret sauce. I think a lot of it can be done by making assertions and testing for biases. In the past, my work has highlighted companies charging different prices for things based on where users were located. That research focused on price because everyone can relate to money. Companies could self-test, or researchers like myself could test independently, and assert whether an algorithm exhibits any biases. That’s the idea: create a framework by which the designers of algorithms -- at least those working in sensitive areas like credit, housing, jobs and healthcare -- are mindful of, and testing for, biases that might be inherent in their systems.
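
To make that concrete, here is a minimal sketch, with invented numbers, of the kind of test Soltani describes -- not the Journal’s or the FTC’s actual methodology. Given the same product’s quoted price collected from many locations, it asks whether shoppers near a rival store see systematically different prices than shoppers farther away:

```python
# Hypothetical bias test: does distance to a rival store predict the quoted price?
# The observations are synthetic; in practice they would be gathered by requesting
# the same product page from many geographic vantage points.
from scipy import stats

# (distance to nearest rival store in miles, quoted price in dollars) -- invented data
observations = [
    (1.2, 14.29), (2.5, 14.29), (3.1, 14.49), (4.8, 14.29), (5.5, 14.29),
    (9.0, 14.49), (11.0, 15.79), (14.6, 15.79), (18.2, 15.49), (25.4, 15.79),
]

NEAR_MILES = 10  # hypothetical cutoff separating "near a rival" from "far from one"

near = [price for miles, price in observations if miles <= NEAR_MILES]
far = [price for miles, price in observations if miles > NEAR_MILES]

# Welch's t-test: is the gap between the two groups larger than chance would explain?
t_stat, p_value = stats.ttest_ind(near, far, equal_var=False)

print(f"mean price near a rival:  ${sum(near) / len(near):.2f}")
print(f"mean price far from one:  ${sum(far) / len(far):.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # a small p flags a location-based disparity
```

The same pattern -- collect outcomes, group them by a sensitive attribute, test the difference -- works whether the attribute is location, gender or anything else, which is what makes this kind of auditing possible without access to source code.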

IBTimes: When you talk about this issue with tech companies, what kind of response do you get?

Soltani: I think it varies. Some companies are doing this already. Look at Google, where there was a big debate around glass ceilings and gender bias. They ran big-data studies on their own hiring practices and found that they did, in fact, have gender biases in their hiring, and so they could essentially tune or correct for that. Some companies are proactive; other companies are not mindful of this at all. It’s a very new issue. It’s only in the last 10 years that we’ve become accustomed to all of our interactions being algorithmically governed, and only more recently have companies caught on to the fact that their algorithms wield some power, whether intentionally or not.
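
As an illustration of the simplest form such a self-audit might take -- a hedged sketch with invented counts, not Google’s actual data or methods -- one could compare offer rates across two groups of applicants with a two-proportion z-test:

```python
# Hypothetical hiring self-audit: are offer rates for two applicant groups
# further apart than random variation would explain? All counts are invented.
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF, computed via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented numbers: offers extended per applications received, by gender.
offers_men, applicants_men = 180, 1000
offers_women, applicants_women = 120, 1000

z, p = two_proportion_ztest(offers_men, applicants_men, offers_women, applicants_women)
print(f"offer rate (men):   {offers_men / applicants_men:.1%}")
print(f"offer rate (women): {offers_women / applicants_women:.1%}")
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests a disparity worth investigating
```

A result like this doesn’t by itself prove discrimination -- the groups may differ in other ways -- but it is the kind of signal that lets a company “tune or correct” its process, as Soltani puts it.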

IBTimes: You mention intent. What role would you see the FTC having if you discovered that a company was doing something intentional, either for its own gain or even for more nefarious reasons?

Soltani: The FTC has a number of tools it could use to look at this under its Section 5 authority -- anything from competition to deception to unfairness. I think it would depend on the specific facts. My role is to help raise awareness about, and develop tools for, determining the biases that exist.

IBTimes: What have you done to advance the cause since you took over?

Soltani: One of the big things is to first build more of a research capacity at the agency. The study we did at the Journal took months and months and months, and a number of people. One of the things I want to do here is hire teams of people to do that kind of work on an ongoing basis. As part of the most recent announcement, we launched this research initiative and are hiring a technology research coordinator, technology research fellows and a number of interns who will be coming through.

IBTimes: Is there a component of this that would help raise awareness among the general public? I’ve heard debates about, “How much can the public really be expected to know about these things?” It’s sort of a literacy debate.

Soltani: It is and it isn’t. The value that I and others like me have, both to policymakers and to reporters in the newsroom, is to dig in and understand -- to look under the hood and communicate the most significant factors for the issue you’re concerned with. A lot of the challenge is that people are somewhat intimidated by technology.

IBTimes: They just want it to work, for the most part. It’s like your car.

Soltani: Yes. You don’t need to work on your car yourself, but you need to know whether it’s the engine or a flat tire. You need to be confident enough to know, when you have an issue, whether to take it to the tire shop or the mechanic.

IBTimes: Do you think there’s a tendency to do a little too much trial and error with some of these companies? I think of Facebook’s famous saying, “Move fast and break things.” It’s almost like they embrace the chaos of getting it wrong the first time?

Soltani: I don’t know if there’s too much or too little. There is a tendency to move really quickly. So their values are to move fast at all costs, but you might say, you can move fast and still make sure X or Y is also maintained ... I’m going to be writing a post about this in the coming months to explain more about what I mean, because a few people have asked us, “Do you mean companies handing over their source code?” But that’s not necessarily what we need to do.

IBTimes: I think that’s how it comes off if you just hear the word “transparency.”

Soltani: Maybe “accountability” would have been a better term to use. At the end of the day, companies have a huge interest in making sure that people understand, and that they’re not biased, because what you risk is consumer trust.

This interview was edited for length and clarity. Christopher Zara is a senior writer who covers media and culture. Follow him on Twitter @christopherzara