The Tech World’s Blind Spots Are Windows into Its Larger Failures


Interview by Sarah Stankorb


Next week, top executives from Facebook, Google, and Twitter will testify before Congress about how Russian agents may have used the Silicon Valley giants to sway the 2016 U.S. elections. Whether you consider the ad buys, designed to stoke partisan anger, “laughably small” or an alarming amount of foreign money spent on what were essentially political advertisements, the debacle highlights just how unprepared we are for the ways massive tech companies can influence society at large.

For Sara Wachter-Boettcher, a content strategist and user experience expert, the dilemma in which Facebook currently finds itself is unsurprising. “The company has a long track record of treating ethical failures like bugs to be fixed: say sorry, squash them down, and keep moving forward,” she recently wrote for Quartz, “…every failure gets treated like an isolated incident, rather than part of a systemic pattern that needs systemic action.”

In her new book Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech, Wachter-Boettcher describes how a lack of diversity among engineers and designers, combined with an insistence on growth as the only meaningful metric, can lead to real-life human distress and profound failures of oversight. How something as simple as an online form can exclude whole populations through limited options for race and gender. How algorithms and a wealth of intimate personal data can dredge up painful memories, like a cheerful Facebook “Year in Review” slideshow showcasing a user’s recently deceased daughter. How biases can unintentionally insult users, like Etsy serving a woman the push notification “Valentine’s Day gifts for him!” (Her partner, of course, was a woman.) How Twitter can suspend users for angrily responding to bigoted or sexist trolls, yet prove virtually useless when it comes to grappling with those trolls in the first place.

These omissions, presumptions, and blind spots might seem like small bugs to be fixed, but they make a big impression, both on the users they affect and in what they tell us about who really calls the shots in Silicon Valley.

Wachter-Boettcher spoke with Make Change recently about how unchecked biases pop up in an increasingly “smart” world. —Callie Enlow

Sara Wachter-Boettcher. Photograph by Carina Romano.


The assumptions made during product development increasingly have real ramifications, especially with so many people now integrating tech into how they take care of themselves and their health needs. You gave an example of a smart bathroom scale suggesting that a toddler diet in response to perfectly natural growth-related weight gain. That same presumption could easily cause harm to someone with an eating disorder. It’s a whole other realm of potential injury that goes unforeseen.

Dan Hon had that experience with his smart scale: it told his toddler to lose weight, but also congratulated his wife on her weight loss after a pregnancy. He sent the company a letter about it, and they realized they were messing this up. But every time I see something like this, I think: How did you not think about this before now? How did you go all the way through the process of designing and making this product and not realize this was going to be a problem?

Right. Most companies going to market with a new product do some testing, but do you have any sense of whether they’re limiting that testing to a narrow target market and leaving out other groups who could flag these problems?

Yeah, you know, working in tech, one of the things I’ve realized is that things do get tested, but oftentimes a lot less than you might think. It varies, but when that research happens, a lot of companies will do testing with the “friends and family” approach; this is particularly true at smaller startups. They’ll test it with themselves internally, and then with the people they know, and odds are pretty high that the people you know are going to be relatively similar to you. It’s a blind spot, because the people who work in a lot of tech companies come from such narrow backgrounds. If you take people who are mostly straight, white, cis men, and you put them all together in a room, they’re going to miss some things that a more diverse group wouldn’t miss. That’s not to say that they are necessarily bad people or have bad intentions; I think that’s rarer.

Another problem is that even when people do go out and test with real humans, they’re sometimes overly prescriptive about who they test with, so they’ll try to find these “ideal users.” As a result, they focus on only that one use case, and they don’t see what might be outside of it.

I'm really interested in the assumptions behind the internet’s ethos of delight, the forced cheerfulness in so much tech nowadays. It’s in the language of Facebook’s On This Day feature, which can as easily surface a puppy picture as a photo from chemo treatment. It’s Siri cracking jokes when she doesn’t know an answer. It’s the chirpy copy on so many other websites we use every day. How is this affecting us?

I think a lot of that cutesiness is often designed to kind of mask power.

But I think those delightful interfaces—where we’re going to cover your profile in balloons even though you’re posting on Facebook about an upcoming divorce, that kind of thing—exist because what they’re really designing for is more engagement. They know that those little features, even if they’re going to hurt some percentage of users, are going to result in a slight uptick in engagement. And when your only metric is growth, when people are still buying Facebook ads and Facebook is still profitable, and that’s the only thing you’re thinking about, you actually don’t care that much about the people who might be hurt.

And that creates a sort of race to the bottom. It creates an industry that’s really built around scrambling toward growth. Particularly this year, we’ve kind of reached the logical conclusion of pushing and pushing and pushing toward growth at the expense of people. You end up in the situation Facebook is in now, where they’re being investigated for the role they played in last year’s election, and people are starting to talk very seriously at the governmental level about how to regulate a company like Facebook. That’s what you get when you play fast and loose with people for so long and never really think about the ramifications.

Throughout the book, you show how an online form can carry inherent bias, and how that form then gives the company a false view of the people it’s really advertising to. That initial bias and the decisions it affects keep stacking up until you have these broken systems; an entire company is built upon flawed thinking. So that’s one side. On the other side, there’s a very human element that ruins these things: the trolls on Twitter, the jerks on Reddit, who use these systems to manipulate and terrorize people. Is it a human problem? Is it a technical problem? Is it even fixable?

Kind of all of the above. Fundamentally, though, I think it’s much more a human problem than a mechanical one. One of the problems has been that people in tech have worried about whether something is technically possible instead of what the limits of technology are. They haven’t been focused enough on the actual people they’re designing for. There was a post on Twitter recently by Zeynep Tufekci that I thought really summed it up nicely. She said, “Silicon Valley is run by people [who] want to be in the tech business, but are in the people business. They are way, way in over their heads.” And I think that’s absolutely true. They’re very much focused on this idea of being technologists, and that has led them astray when it comes to serving people. And they’re only now coming to terms with that. It’s a mess. It’s gone on way too long.
