Too Many Tech Products Leave Women Out — Here's How to Change That

Remember Bic For Her, the ridiculous line of pens made just for our dainty lady hands? "Shrink it and pink it" has long been how lazy companies try to target women (see also: power tools, electronics, sports gear). It's frustratingly reductive, assuming that women only care about "cute" . . . rather than real features that serve the complex, busy, brilliant people we are.

Now it's happening in our tech, too. While researching a new book about how bias is designed into technology, I found tons of products that paid lip service to women by plastering their apps with pink flowers and peppy "you-go-girl" attitude but actually failed them: apps that assumed all women were straight, photo filters that promised "beauty" by making users thinner and whiter, and software that forced women to identify their marital status without even offering "Ms." as an option.

Sometimes I wish I could throw my phone in the river, but the truth is I love technology. It's my career, my social calendar, my connection to loved ones. So what's a feminist to do? It starts with all of us recognizing when technology alienates or harms women, and then demanding that the companies behind it do better. Here are just a few ways they need to change, stat:

Stop equating "average users" with men.

You know how a movie about a bunch of dudes is just a movie, but a film starring several women is a "chick flick"? Or how "unisex" sizes are actually just men's sizes? The world is full of this kind of bias: men are considered default humans, women are considered special edge cases, and nonbinary folks are ignored altogether. Now the tech that's supposedly carrying us into the future often reinforces those same worn-out tropes.

Did you know that almost every game you can buy from your phone's app store comes standard with a male character you can play, but hardly any include a female character? Or that when Apple Health launched, it promised to track "your whole health picture," but then didn't include a period tracker for a whole year (whose whole health picture, exactly)?

The problem even extends to technology like facial recognition, which is often far better at identifying white men than anyone else. Even Sen. Al Franken has taken note: when Apple released the iPhone X, with its new Face ID technology, he wrote a letter to Tim Cook asking, "How is Apple protecting against racial, gender, or age bias?"

Make safety central, not special.

Heard about Snap Map? The new Snapchat feature, which launched this Summer, shares your location with your friends whenever you open the app, right down to the street address. That rightly freaked out plenty of women, raising concerns about stalking, assault, and other types of abuse.

This is what happens when tech products aren't designed with safety, and particularly women's safety, in mind: "fun" ideas make it to market with serious flaws. Take Twitter: the company is still struggling to stop the rampant abuse and harassment women experience on the platform. One reason the problem is so hard to fix is that it was baked into the platform from the start. Unlike other networks, where you ask to be someone's friend, Twitter is nonreciprocal by default: you can simply follow people. That's because Twitter's central organizing principle isn't relationships. It's updates. "It started with a fascination with cities and how they work and what's going on in them right now," cofounder Jack Dorsey explained back in 2009. But updates are made by people, some of whom will game the system to harm others (remember what happened to Leslie Jones last year?).

It's 2017. If tech products aren't thinking about safety from day one, they're putting women, and everyone else, at risk.

Let everyone be beautiful.

The first time I used the beauty filter in Snapchat, I was both enthralled and horrified — why did I look like some kind of sexy alien? Well, because the app slimmed my face, popped my cheekbones, enlarged my eyes, and . . . whitened my skin. It's not just Snapchat. Instagram filters are well known for "whitewashing" people of color, and earlier this year, users revolted against FaceApp's "hotness" filter, which lightened skin, slimmed noses, and replaced Asian eyes with Western ones.

Photo filters might seem trivial, but they're everywhere, and they're reinforcing the toxic idea that thinner and whiter is always better. Those are messages we've all heard way too many times already, and I'll be damned if I'm gonna take them from some app, too. Plus, the ramifications of this kind of design go far beyond just making you feel bad about your body. Artificial intelligence is already being trained to consider white people more beautiful than people of color: when an AI judged an international beauty contest in 2016, nearly all of its winners were white. The more we let tech companies reinforce a limited definition of beauty, the more deeply our machines will learn that bias, too.

That's the thing about tech: the problem isn't just one sexist app. We're now embedding technology in every corner of our lives, and that tech is becoming less and less visible. Algorithms work behind the scenes to decide what you'll see, how much you pay, and a million other things. If we're not demanding that tech companies design for us now, we'll wake up to find that all that invisible tech is, increasingly, designed against us.

It doesn't have to be that way. Women aren't some niche group. We're a make-or-break market. So let's use that power — and demand tech that works for all of us.

Sara Wachter-Boettcher's new book, Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech, is available on Amazon and in stores now.