Quotes by Meredith Broussard
Showing 1-30 of 62
“I heard people repeat the same promises about the bright technological future, but I saw the digital world replicate the inequalities of the “real” world”
― Artificial Unintelligence: How Computers Misunderstand the World
“(To think like a computer programmer, it helps to be lazy.)”
― Artificial Unintelligence: How Computers Misunderstand the World
“no statistical prediction can or will ever be 100 percent accurate -- because human beings are not and never will be statistics.”
― Artificial Unintelligence
“One approach involves looking at three different kinds of bias: physical bias, computational bias, and interpretation bias. This approach was proposed by UCLA engineering professor Achuta Kadambi in a 2021 Science paper.26 Physical bias manifests in the mechanics of the device, as when a pulse oximeter works better on light skin than darker skin. Computational bias might come from the software or the dataset used to develop a diagnostic, as when only light skin is used to train a skin cancer detection algorithm. Interpretation bias might occur when a doctor applies unequal, race-based standards to the output of a test or device, as when doctors give a different GFR threshold to Black patients. “Bias is multidimensional,” Kadambi told Scientific American. “By understanding where it originates, we can better correct it.”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“In CS, it can be hard to explain the difference between the easy and the virtually impossible,” reads the caption.”
― Artificial Unintelligence: How Computers Misunderstand the World
“The problems are hidden inside code and data, which makes them harder to see and easier to ignore.”
― Artificial Unintelligence: How Computers Misunderstand the World
“standardized tests are not based on general knowledge. As I learned during my investigation, they are based on specific knowledge contained in specific sets of books: the textbooks created by the test makers.”
― Artificial Unintelligence: How Computers Misunderstand the World
“We tend to assume that when people are experts at one thing, their expertise extends to other areas as well.”
― Artificial Unintelligence: How Computers Misunderstand the World
“Disciplines like math, engineering, and computer science pardon a whole host of antisocial behaviors because the perpetrators are geniuses.”
― Artificial Unintelligence: How Computers Misunderstand the World
“Reasonable, smart people disagree about what will happen in the future—in part because nobody can see the future.”
― Artificial Unintelligence: How Computers Misunderstand the World
“code is prioritized above human interactions.”
― Artificial Unintelligence: How Computers Misunderstand the World
“All these assistants are given female names and default identities by tech executives and developers—no accident. “I think that probably reflects what some men think about women—that they’re not fully human beings,”
― Artificial Unintelligence: How Computers Misunderstand the World
“We understand the quantitative through the qualitative.”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“The NYC algorithm task force disaster emphasizes that it's important to have the ability to say "no" to the tech if it is not working well or as expected. Few people are prepared to let software projects go; it's hard to kill your darlings after you've written them, and it's even harder to kill your darlings after you've invested thousands or millions into developing them. Everyone making an algorithmic system needs to be prepared to confront the shortcomings of the computational system and of the larger sociocultural context.”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“Kaggle.16 Kaggle, which is owned by Google’s parent company, Alphabet,”
― Artificial Unintelligence: How Computers Misunderstand the World
“We can learn to make better decisions about the downstream effects of technology so that we don’t cause unintentional harm inside complex social systems.”
― Artificial Unintelligence: How Computers Misunderstand the World
“Here’s an open secret of the big data world: all data is dirty. All of it.”
― Artificial Unintelligence: How Computers Misunderstand the World
“After the shoplifting incident, the Shinola store gave a copy of its surveillance video to the Detroit police. Five months later, a digital image examiner for the Michigan State Police looked at the grainy, poorly lit surveillance video on her computer and took a screen shot.2 She uploaded it to the facial recognition software the police used: a $5.5 million program supplied by DataWorks Plus, a South Carolina firm founded in 2000 that began selling facial recognition software developed by outside vendors in 2005. The system accepted the photo; scanned the image for shapes, indicating eyes, nose, and mouth; and set markers at the edges of each shape. Then, it measured the distance between the markers and stored that information. Next, it checked the measurements against the State Network of Agency Photos (SNAP) database, which includes mug shots, sex offender registry photographs, driver’s license photos, and state ID photos. To give an idea of the scale, in 2017, this database had 8 million criminal photos and 32 million DMV photos. Almost every Michigan adult was represented in the database.”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“Joy Buolamwini’s organization, the Algorithmic Justice League (AJL), is doing a project called CRASH, which stands for Community Reporting of Algorithmic System Harms. Part of the project involves establishing bug bounties. The concept of a bug bounty comes from cybersecurity, where people can be rewarded by tech companies for finding and reporting system bugs.”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“Sasha Costanza-Chock’s book Design Justice outlines the principles and explores how “universalist design principles and practices erase certain groups of people—specifically, those who are intersectionally disadvantaged or multiply burdened under the matrix of domination (white supremacist heteropatriarchy, ableism, capitalism, and settler colonialism).”12 Design can be a tool for collective liberation. The disability justice movement has taken leadership on this issue.13 The #disabilityjustice hashtag is one place to start learning more; other hashtags for learning about disability include #deaftwitter, #blindtwitter, #a11y, #blindtiktok, #disabilityawareness, and #instainclusion.”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“In April 2021, the FTC published guidance on corporate use of AI. “If a data set is missing information from particular populations, using that data to build an AI model may yield results that are unfair or inequitable to legally protected groups,” reads the FTC guidance. “From the start, think about ways to improve your data set, design your model to account for data gaps, and—in light of any shortcomings—limit where or how you use the model.”3 Other tips include watching out for discriminatory outcomes, embracing transparency, telling the truth about where data comes from and how it is used, and not exaggerating an algorithm’s capabilities. If a model causes more harm than good, FTC can challenge the model as unfair. This guidance put corporate America on alert. Companies need to hold themselves accountable for ensuring their algorithmic systems are not unfair, in order to avoid FTC enforcement penalties.”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“Data journalism is the practice of finding stories in numbers and using numbers to tell stories.”
― Artificial Unintelligence: How Computers Misunderstand the World
“Technochauvinism is a kind of bias that considers computational solutions to be superior to all other solutions. Embedded”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“Microsoft researchers developed what they call a “community jury,” a process for soliciting input from stakeholders during the software development process. Community input is not a new idea, but embedding it in business processes as a step to stave off algorithmic harms is a new development.”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“I recommend trusting that the reason for an algorithmic decision is some kind of preexisting social bias manifesting in subtle (or obvious) ways. It feels unsatisfying to lack a reason, and that dissatisfaction can fuel a drive to achieve social justice. Tech is racist and sexist and ableist because the world is so. Computers just reflect the existing reality and suggest that things will stay the same—they predict the status quo. By adopting a more critical view of technology, and by being choosier about the tech we allow into our lives and our society, we can employ technology to stop reproducing the world as it is, and get us closer to a world that is truly more just.”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“Biologist Carl T. Bergstrom and information scientist Jevin West teach a class called “Calling Bullshit” at the University of Washington, and published a book with the same name. “Bullshit involves language, statistical figures, data graphics, and other forms of presentation intended to persuade by impressing and overwhelming a reader or listener, with a blatant disregard for truth and logical coherence,” in their definition. They offer a simple, three-step method for bullshit detection, which involves these questions: Who is telling me this? How do they know it? What are they trying to sell me?”
― More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
“data is socially constructed.”
― Artificial Unintelligence: How Computers Misunderstand the World
“Because social decisions are about more than just calculations, problems will always ensue if we use data alone to make decisions that involve social and value judgments.”
― Artificial Unintelligence: How Computers Misunderstand the World
“Using a journalist’s skepticism about what can go wrong can help us move from blind technological optimism to a more reasonable, balanced perspective”
― Artificial Unintelligence: How Computers Misunderstand the World
“The first data-driven investigative story appeared in 1967, when Philip Meyer used social science methods and a mainframe computer to analyze data on race riots in Detroit for the Detroit Free Press. “One theory, popular with editorial writers, was that the rioters”
― Artificial Unintelligence: How Computers Misunderstand the World