Some interesting concepts:
- Normalizing
- Edge vs Stress
- The 'select one' being the problematic idea
- The default settings of our lives (and everything else)
- Q: metrics are only as good as the goals and intentions that underlie them (c)
- Q: inappropriate, trying-too-hard, chatty tech products. (c)
- “marketing negging”
- The unlikely delights of Q: '1-800-Flowers purchase particularly relevant to a Scorpio' (c)
- DAUs/MAUs/CAUs
Quite a lot of problematic points. Precisely the kind that leads diversity intentions to ruin.
Some ludicrous ideas: like figuring out how to connect with a made-up persona that was created specifically to be connected with. I'm calling this one a BS job!
Also, a lot of genuinely good material. Here are handpicked examples of both:
Q:
There, there, dear. Don’t worry about what we’re doing with your account. Have a balloon. (c)
Q:
Back in 2011, if you told Siri you were thinking about shooting yourself, it would give you directions to a gun store. (c) Now I'm tempted to use Siri. Attabotgirl.
Q:
far too many people in tech have started to believe that they’re truly saving the world. Even when they’re just making another ride-hailing app or restaurant algorithm. (c) I'm pretty sure that goes way beyond that. And even beyond the tech.
Q:
I’m writing this in the wake of the 2016 presidential election—an election that gave us an American president who is infamous for allegations of sexual assault, racism, conflicts of interest, collusion, and angry Tweetstorms, and who rode to power on a wave of misinformation. (c) The problem was that there were two very problematic candidates, not just one. Hah. Another problem is that people actually expect Facebook or Twitter or some other shit to tell them how to vote. Problem numero trece is that people seem to actually believe that the flow of trash called 'news' on FB is news. So… how very comfy to blame the tech.
Q:
You don’t need a computer science degree or a venture capital fund. You don’t need to be able to program an algorithm. All you need to do is slough away the layers of self-aggrandizement and jargon, and get at the heart of how people in technology work—and why their decisions so often don’t serve you. (c) That's actually not true. Self-aggrandizement and jargon: all of that is just perception, which may or may not be skewed. Understanding is the key.
Q:
we’ll take a closer look at how the tech industry operates, and see how its hiring practices and work culture create teams that don’t represent most of us—no matter how many “diversity” events these companies put on. (c) Why should they hire someone who represents anything instead of someone who's able to do the job? Diversity is about not refusing to hire a capable young mother or someone of another race. Hiring representatives is a totally different story.
Q:
Designers and technologists don’t head into the office planning to launch a racist photo filter or build a sexist assumption into a database. (c) LOL
Q:
… she spent the next hour listening to older men tell her about the “female market,”…
The men in the room insisted that most women really care about leisure-time activities. (c) Now, this must have been fun :)
Q:
Even though the company had forty-odd employees and had been in business more than a decade, no staff member had ever been pregnant. … “We have three other women of childbearing age on our team, and we don’t want to set a precedent,” the owner told her, as if pregnancy were some sort of new trend. (c) Wowser. These guys must have grown on trees. Some rotten fruits.
Q:
… the two teams with lots of women on staff, were sent an email by a board member asking them to “put together some kind of dance routine to perform at the company presentation.
…
The heads of each department, all men, stood up and talked about their successes over the course of the year. The only women who graced the stage were a group of her peers in crop tops and hot pants. The men in the audience wolf-whistled while the women danced. (c) That's some company.
Q:
Amélie Lamont, whose manager once claimed she hadn’t seen her in a meeting. “You’re so black, you blend into the chair,” she told her. (c) Damn. I actually once had a very similar conversation. I've never, before or after, wanted so badly to suggest that the reviewer buy the effing glasses and spare me the bullshit!
Q:
Tech is also known for its obsession with youth—an obsession so absurd that I now regularly hear rumors about early-thirties male startup founders getting cosmetic surgery so that investors will think they’re still in their twenties. (c) Yep, that's a fact.
Q:
Other companies start their workdays with all-staff meetings held while everyone does planks—the fitness activity where you get on the ground, prop yourself up by your feet and elbows, and hold the position until your abs can’t handle it anymore. If you’re physically able to plank, that is. And you’re not wearing a dress. Or feeling modest. Or embarrassed. Or uncomfortable getting on your hands and knees at work. (c) Ridiculous… Riddikulus!
Q:
I’m not interested in ping-pong, beer, or whatever other gimmick used to attract new grads. The fact that I don’t like those things shouldn’t mean I’m not a “culture fit.” I don’t want to work in tech to fool around, I want to create amazing things and learn from other smart people. That is the culture fit you should be looking for. (c) Golden words!
Q:
The good news is there’s actually no magic to tech. As opaque as it might seem from the outside, it’s just a skill set—one that all kinds of people can, and do, learn. There’s no reason to allow tech companies to obfuscate their work, to call it special and exempt it from our pesky ethics. Except that we’ve never demanded they do better. (c) And except that many of us don't really bother learning how stuff works. Had these companies disclosed all their proprietary code today, not many of us would be able to make head or tail of it.
Q:
Are you a “Kelly,” the thirty-seven-year-old minivan mom from the Minneapolis suburbs? Or do you see yourself as a “Matt,” the millennial urban dweller who loves CrossFit and cold-brew coffee? Maybe you’re more of a “Maria,” the low-income community college student striving to stay in school while supporting her parents.
No? Well, this is how many companies think about you. (c) Now, that's a great point.
Q:
…
she test-drove some menstrual cycle apps, looking for one that would help her get the information she needed.
What she found wasn’t so rosy.
Most of the apps she saw were splayed with pink and floral motifs, and Delano immediately hated the gender stereotyping. But even more, she hated how often the products assumed that fertility was her primary concern—rather than, you know, asking her. (c) LOL. It wasn't rosy: it was pink and florid.
Q:
Glow works well for women who are trying to get pregnant with a partner. But for everyone else, both services stop making sense—and can be so alienating that would-be users feel frustrated and delete them. (c) Well, frankly, I don't think the right problem is being highlighted here. Glow might be cheesy. It also was initially rolled out specifically for women trying to get pregnant. So, IMO, women who aren't might do better choosing some other app. No shit, Sherlock. Not every single app has to be a multitool capable of Python coding, getting one pregnant, and building spaceships.
The more likely problem is that the market either doesn't clearly map alternative needs to the apps that cover them, or simply has gaps in some niches. That's actually both a problem and a business opportunity.
Q:
What happens when those someones are the people we met in Chapter 2: designers and developers who’ve been told that they’re rock stars, gurus, and geniuses, and that the world is made for people like them? (c) The Big Flip Flop?
Q:
But when default settings present one group as standard and another as “special”—such as men portrayed as more normal than women, or white people as more normal than people of color—the people who are already marginalized end up having the most difficult time finding technology that works for them. (c) Amen.
Q:
If you’ve designed a cockpit to fit the average pilot, you’ve actually designed it to fit no one. …
So, what did the air force do? Instead of designing for the middle, it demanded that airplane manufacturers design for the extremes instead—mandating planes that fit both those at the smallest and the largest sizes along each dimension. Pretty soon, engineers found solutions to designing for these ranges, including adjustable seats, foot pedals, and helmet straps—the kinds of inexpensive features we now take for granted. (c)
Q:
When designers call someone an edge case, they imply that they’re not important enough to care about—that they’re outside the bounds of concern. In contrast, a stress case shows designers how strong their work is—and where it breaks down. (c) Edge vs. Stress makes an interesting dichotomy.
Q:
I saw race and ethnicity menus that couldn’t accommodate people of multiple races. I saw simple sign-up forms that demanded to know users’ gender, and then offered only male and female options. I saw college application forms that assumed an applicant’s parents lived together at a single address. (c) Fucked up design. And not just design.
Q:
Take Shane Creepingbear, a member of the Kiowa tribe of Oklahoma. In 2014 he tried to log into Facebook. But rather than being greeted by his friend’s posts like usual, he was locked out of his account and shown this message:
Your Name Wasn’t Approved. …
Adding to the insult, the site gave him only one option: a button that said “Try Again.” There was nowhere to click for “This is my real name” or “I need help.”
…
Facebook also rejected the names of a number of other Native Americans: Robin Kills the Enemy, Dana Lone Hill, Lance Brown Eyes. (In fact, even after Brown Eyes sent in a copy of his identification, Facebook changed his name to Lance Brown.) (c) Oh, this is top.
Q:
… there’s still the fact that Facebook has placed itself in the position of deciding what’s authentic and what isn’t—of determining whose identity deserves an exception and whose does not. (c) Which is quite obviously bonkers.
Q:
People who identify as more than one race end up having to select “multiracial.” As a result, people who are multiracial end up flattened: either they get lumped into a generic category, stripped of meaning, or they have to pick one racial identity to prioritize and effectively hide any others. They can’t identify the way they would in real life, and the result is just one more example of the ways people who are already marginalized feel even more invisible or unwelcome. (c)
Q:
When you remember how few people change the default settings in the software they use, Facebook’s motivations become a lot clearer: Facebook needs advertisers. Advertisers want to target by gender. Most users will never go back to futz with custom settings. So, Facebook effectively designs its onboarding process to gather the data it wants, in the format advertisers expect. Then it creates its customizable settings and ensures it gets glowing reviews from the tech press, appeasing groups that feel marginalized—all the while knowing that very few people, statistically, will actually bother to adjust anything. Thus, it gets a feel-good story about inclusivity, while maintaining as large an audience as possible for advertisers. It’s a win-win . . . if you’re Facebook or an advertiser, that is. (c)
Q:
It was cute, unless you wanted to react to a serious post and all you had was a sad Frankenstein (c) Quite the company.
Q:
“Hi Tyler,” one man’s video starts, using title cards. “Here are your friends.” He’s then shown five copies of the same photo. The result is equal parts funny and sad—like he has just that one friend. It only gets better (or worse, depending on your sense of humor) from there. Another title card comes up: “You’ve done a lot together,” followed by a series of photos of wrecked vehicles, culminating in a photo of an injured man giving the thumbs up from a hospital bed. I suppose Facebook isn’t wrong, exactly: getting in a car accident is one definition of “doing a lot together.” (c) This is both hilarious and horrifying.
Q:
You can probably guess what went wrong: in one, Facebook created a montage of a man’s near-fatal car crash, set to an acoustic-jazz ditty. Just imagine your photos of a totaled car and scraped-up arms, taken on a day you thought you might die, set to a soft scat vocal track. Doo-be-doo-duh-duh, indeed. (c)
Q:
… Tumblr: “Beep beep! #neo-nazis is here!” it read. …
a Tumblr employee told Rooney that it was probably a “what you missed” notification. Rooney had previously read posts about the rise in fascism, and the notification system had used her past behavior to predict that she might be interested in more neo-Nazi content. …
… another Tumblr user shared a version of the notification he received: “Beep beep! #mental-illness is here!”) (c) Well, this is what happens when apps treat people like kids.
Q:
Maybe I’m the only one who’s just not interested in snotty comebacks from my phone, though I doubt it.
…
Why would anyone want their credit card offers to be dependent on the weather?
What, precisely, would we do to make a 1-800-Flowers purchase particularly relevant to a Scorpio?
…
“How the hell did I end up here?” (c)
Q:
Delight is a concept that’s been tossed around endlessly in the tech industry these past few years, and I’ve always hated it. (c)
Q:
… What Facebook Thinks You Like. The extension trawls Facebook’s ad-serving settings, and spits out a list of keywords the site thinks you’re interested in, and why. There’s the expected stuff... Then there’s a host of just plain bizarre connections: “Neighbors (1981 Film),” a film I’ve never seen and don’t know anything about. A host of no-context nouns: “Girlfriend,” “Brand,” “Wall,” “Extended essay,” “Eternity.” I have no idea where any of this comes from—or what sort of advertising it would make me a target for. Then it gets creepy: “returned from trip 1 week ago,” “frequent international travelers.” I rarely post anything to Facebook, but it knows where I go, and how often. (c)
Q:
1,500 individual tidbits of information about you, all stored in a database somewhere and handed out to whoever will pay the price (c)
Q:
The technology is based on deep neural networks: massive systems of information that enable machines to “see,” much in the same way the human brain does. (c) That's not precisely correct: neural networks are only loosely inspired by the brain, and they don't 'see' the way it does.
Q:
… a future where Facebook AI listens in on conversations to identify potential terrorists, where elected officials hold meetings on Facebook, and where a “global safety infrastructure” responds to emergencies ranging from disease outbreaks to natural disasters to refugee crises. (c) Welcome to the fish bowl!