
The Black Box Society: The Secret Algorithms That Control Money and Information

Every day, corporations are connecting the dots about our personal behavior—silently scrutinizing clues left behind by our work habits and Internet use. The data compiled and portraits created are incredibly detailed, to the point of being invasive. But who connects the dots about what firms are doing with this information? The Black Box Society argues that we all need to be able to do so—and to set limits on how big data affects our lives.

Hidden algorithms can make (or ruin) reputations, decide the destiny of entrepreneurs, or even devastate an entire economy. Shrouded in secrecy and complexity, decisions at major Silicon Valley and Wall Street firms were long assumed to be neutral and technical. But leaks, whistleblowers, and legal disputes have shed new light on automated judgment. Self-serving and reckless behavior is surprisingly common, and easy to hide in code protected by legal and real secrecy. Even after billions of dollars of fines have been levied, underfunded regulators may have only scratched the surface of this troubling behavior.

Frank Pasquale exposes how powerful interests abuse secrecy for profit and explains ways to rein them in. Demanding transparency is only the first step. An intelligible society would assure that key decisions of its most important firms are fair, nondiscriminatory, and open to criticism. Silicon Valley and Wall Street need to accept as much accountability as they impose on others.

320 pages, Kindle Edition

First published November 17, 2014


About the author

Frank Pasquale

14 books · 19 followers

Ratings & Reviews



Community Reviews

5 stars: 65 (20%)
4 stars: 134 (41%)
3 stars: 93 (28%)
2 stars: 26 (8%)
1 star: 3 (<1%)
Displaying 1 - 30 of 35 reviews
☘Misericordia☘ ⚡ϟ⚡⛈⚡☁ ❇️❤❣
2,531 reviews · 19.2k followers
December 7, 2019
The Fair, the Foul, and the Creepy. Algorithmic secrecy and negligence, collateral consequences.

I'm pretty sure this is one of those books we return to again and again and again. This is a very timely wake-up call to our robotized, socially inept world, befuddled and dragged along by social networks. We are going to be living in a dystopia soon. Or maybe we already are, have been for some time, and it just hasn't registered.

Some ideas in here are really naive but even so, it's refreshing to see someone giving a damn about any and all of it and not just basking under the dubious glow of some personalized discount coupons that bomb one's mailbox after they are done surfing some stuff online.

Q:
Consider just a few of the issues raised by the new technologies of ranking and evaluation:
• Should a credit card company be entitled to raise a couple’s interest rate if they seek marriage counseling? If so, should cardholders know this?
• Should Google, Apple, Twitter, or Facebook be able to shut out websites or books entirely, even when their content is completely legal? And if they do, should they tell us?
• Should the Federal Reserve be allowed to print unknown sums of money to save banks from their own scandalous behavior? If so, how and when should citizens get to learn what’s going on?
• Should the hundreds of thousands of American citizens placed on secret “watch lists” be so informed, and should they be given the chance to clear their names? (c)

... or ... Welcome to the fishbowl and lots of fun living dystopia!

Q:
Amateur epistemologists have many names for this problem. “Unknown unknowns,” “black swans,” and “deep secrets” are popular catchphrases for our many areas of social blankness. There is even an emerging field of “agnotology” that studies the “structural production of ignorance, its diverse causes and conformations, whether brought about by neglect, forgetfulness, myopia, extinction, secrecy, or suppression.” (c)
Q:
Alan Greenspan, once the most powerful central banker in the world, claimed that today’s markets are driven by an “unredeemably opaque” version of Adam Smith’s “invisible hand,” and that no one (including regulators) can ever get “more than a glimpse at the internal workings of the simplest of modern financial systems.” (c)
Q:
But what if the “knowledge problem” is not an intrinsic aspect of the market, but rather is deliberately encouraged by certain businesses? What if financiers keep their doings opaque on purpose, precisely to avoid or to confound regulation? That would imply something very different about the merits of deregulation. (c)
Q:
What we do and don’t know about the social (as opposed to the natural) world is not inherent in its nature, but is itself a function of social constructs. (c)
Q:
But while powerful businesses, financial institutions, and government agencies hide their actions behind nondisclosure agreements, “proprietary methods,” and gag rules, our own lives are increasingly open books. (c)
Q:
Credit raters, search engines, major banks, and the TSA take in data about us and convert it into scores, rankings, risk calculations, and watch lists with vitally important consequences. But the proprietary algorithms by which they do so are immune from scrutiny, except on the rare occasions when a whistleblower litigates or leaks. (c)
Q:
In his book Turing’s Cathedral, George Dyson quipped that “Facebook defines who we are, Amazon defines what we want, and Google defines what we think.” We can extend that epigram to include finance, which defines what we have (materially, at least), and reputation, which increasingly defines our opportunities. (c)
Q:
Transactions that are too complex to explain to outsiders may well be too complex to be allowed to exist. (c) Okay, that's much too much. It's people who should be working to better their judgement and understanding of the world, not the world dumbing down to their level. Though, of course, today's transparency isn't precisely transparent. Just look at the mess that is IFRS9.
Q:
Perched on Teo’s shoulder, as seen through the glasses, was a small dragon. ... It looked as though it couldn’t decide whether to nuzzle Teo or tear out his jugular. I sympathized. (c)
Q:
“Dr. Davis says BPD has something to do with sensitive people being raised in ‘invalidating environments.’ Whatever that means. So I guess, you know, don’t invalidate me.”(c)
Q:
A homeowner who followed the instructions on “Where’s the Note” reported that he took a 40-point hit on his credit score after his inquiry. In the Heisenberg-meets-Kafka world of credit scoring, merely trying to figure out possible effects on one’s score can reduce it. (c)
Q:
(One casino tracks how often its card dealers and waitstaff smile.) Analysts mine our e-mails for “insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.” (c)
Q:
Automated systems claim to rate all individuals the same way, thus averting discrimination. They may ensure some bosses no longer base hiring and firing decisions on hunches, impressions, or prejudices. But software engineers construct the datasets mined by scoring systems; they define the parameters of data-mining analyses; they create the clusters, links, and decision trees applied; they generate the predictive models applied. Human biases and values are embedded into each and every step of development. Computerization may simply drive discrimination upstream. (c)
Q:
Bewitched by matching and sorting programs, a company may treat ever more hires as “purple squirrels”—an HR term of art denoting the exact perfect fit for a given position. For example, consider a health lawyer qualified to work on matters involving Zone Program Integrity Contractors, but who does not use the specific acronym “ZPIC” on her resume. If automated software is set to search only for resumes that contain “ZPIC,” she’s probably not going to get an interview. She may never find out that this small omission was the main, or only, reason she never got a callback. (c)
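The failure mode in that quote is easy to reproduce. A minimal Python sketch of naive keyword screening (the resume texts and applicant names are invented for illustration):

```python
# Naive resume screener: keeps only resumes containing an exact keyword match.
def screen(resumes, required_keyword):
    """Return the names of applicants whose resume text contains the keyword."""
    return [name for name, text in resumes.items() if required_keyword in text]

resumes = {
    # Fully qualified, but spells out the acronym instead of writing "ZPIC".
    "qualified_lawyer": "Health law matters involving Zone Program Integrity Contractors",
    # Thinner experience, but uses the magic acronym.
    "keyword_stuffer": "Familiar with ZPIC audits",
}

# The qualified lawyer is silently filtered out; she never learns why.
print(screen(resumes, "ZPIC"))  # ['keyword_stuffer']
```

No semantic matching, no synonym list: the filter's output depends entirely on whether an applicant guessed the exact token the recruiter typed.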
Q:
Then there’s the growing use of personality tests by retailers...
Writer Barbara Ehrenreich encountered one of those tests when she applied for a job at Walmart, and she was penalized for agreeing “strongly” rather than “totally” with this statement: “All rules must be followed to the letter at all times.” (c)
Q:
For example, a company might find that every applicant who answered “strongly agree” to all the questions above turned out to be a model employee, and those who answered “strongly disagree” ended up quitting or being fired within a month or two. The HR department would be sorely tempted to hire future applicants who “strongly agreed,” even without knowing how such professed attitudes related to the job at hand. (c)
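The temptation described in that quote reduces to a one-line scoring rule. A hedged sketch (the answer scale and weights are invented, loosely modeled on the Walmart test Ehrenreich describes):

```python
# Score applicants purely by how closely their answers match past "model
# employees" -- with no theory of why the answers should matter for the job.
AGREEMENT = {"totally": 2, "strongly": 1, "disagree": 0}

def conformity_score(answers):
    """Sum of agreement levels across all personality-test questions."""
    return sum(AGREEMENT[a] for a in answers)

# An applicant who hedges even once scores below a uniform "totally" answerer,
# mirroring the penalty Ehrenreich reports for answering "strongly" not "totally".
print(conformity_score(["totally", "totally", "strongly"]))  # 5
print(conformity_score(["totally", "totally", "totally"]))   # 6
```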
Q:
In 2012, Latanya Sweeney, former director of the Data Privacy Lab at Harvard and now a senior technologist at the Federal Trade Commission, suspected that African Americans were being unfairly targeted by an online service. When Sweeney searched her own name on Google, she saw an ad saying, “Latanya Sweeney: Arrested?” In contrast, a search for “Tanya Smith” produced an ad saying, “Located: Tanya Smith.” The discrepancy provoked Sweeney to conduct a study of how names affected the ads served. She suspected that “ads suggesting arrest tend to appear with names associated with blacks, and neutral ads or no [such] ads tend to appear with names associated with whites, regardless of whether the company [purchasing the ad] has an arrest record associated with the name.” She concluded that “Google searches for typically African-American names lead to negative ads posted by [the background check site] InstantCheckmate.com, while typically Caucasian names draw neutral ads.”
After Sweeney released her findings, several explanations for her results were proposed. ... let us suppose that (for whatever reasons) web searchers tended to click on Instant Checkmate ads more often when names associated with blacks had “arrest” associations, rather than more neutral ones. In that case, the programmer behind the ad-matching engine could say that all it is doing is optimizing for clicks—it is agnostic about people’s reasons for clicking. It presents itself as a cultural voting machine, merely registering, rather than creating, perceptions. (c)
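The “cultural voting machine” defense can be made concrete: a click-optimizer needs no racial category at all to keep reproducing a biased click pattern. A minimal sketch (the click counts and template names are invented; this is not Google's actual ad system):

```python
# Serve, per search name, whichever ad template has historically drawn the
# most clicks. The code never mentions race -- it just amplifies whatever
# pattern searchers' past clicks feed into it.
from collections import defaultdict

clicks = defaultdict(lambda: defaultdict(int))

def record_click(name, ad_template):
    clicks[name][ad_template] += 1

def serve_ad(name):
    """Return the template with the highest click count for this name."""
    history = clicks[name]
    if not history:
        return "neutral"  # no data yet: fall back to a neutral ad
    return max(history, key=history.get)

# If past searchers clicked the "arrest"-framed ad more often for one name...
for _ in range(7):
    record_click("Latanya", "arrested?")
for _ in range(3):
    record_click("Latanya", "located")

# ...the optimizer keeps serving the stigmatizing template for that name.
print(serve_ad("Latanya"))  # arrested?
print(serve_ad("Tanya"))    # neutral
```

The feedback loop is the point: every biased click makes the biased template more likely to be shown, while the code remains formally "agnostic."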
Q:
Anyone may be labeled in a database as “unreliable,” “high medical cost,” “declining income,” or some other derogatory term. Reputation systems are creating new (and largely invisible) minorities, disfavored due to error or unfairness. Algorithms are not immune from the fundamental problem of discrimination, in which negative and baseless assumptions congeal into prejudice. They are programmed by human beings, whose values are embedded into their software. (c)
Q:
In Maryland, fifty-three antiwar activists, including two nuns and a Democratic candidate for local office, were placed on terrorist watch lists. (c)
Q:
An unaccountable surveillance state may pose a greater threat to liberty than any particular terror threat. It is not a spectacular danger, but rather an erosion of a range of freedoms. ... Mass surveillance may be doing less to deter destructive acts than it is slowly narrowing the range of tolerable thought and behavior. (c)
Q:
We should not have to worry that the fates of individuals, businesses, and even our financial systems are at the mercy of hidden databases, dubious scores, and shadowy bets. (c)
Clare O'Beara
Author · 25 books · 372 followers
February 3, 2017
"Corporate secrecy expands as the privacy of human beings contracts."
Looking at the amount of data being amassed about each of us, the transparency or lack of it around use of this data, and the fact that it can be sold again and again, including to governments, the author warns us about the black box - the data recorder, the unknown workings. He tells us that giant corporations, banks and dodgy dealers benefit.

Well, just the other day Western Union was fined mightily for enabling money laundering activities for many individuals and groups. They had a threshold in place above which money transfers abroad needed authorisation, so of course such transfers were split in two. It's not hard to think up, folks. Nobody in Western Union cared, because they were doing business. We're told that the money laundering activity went unchecked for years, but the fine equates to nine months of profits. Well, nobody will do that again then, will they. I believe it is time that individuals were held personally responsible, from top down.

I run serious ad blockers on my computer. Any time I use an unsecured computer or phone I am astonished by the absolute rubbish ads that proliferate, and if you accidentally hover a cursor over them, on some computers, that is enough to unleash a torrent of screens about get rich quick schemes, gambling, fast weight loss, millionaire homes; nothing I would ever want to see. The ads follow the computer around through the browser. The ads alone would be enough to drive anyone to use the net incognito. Then there's the creepy way that search engines follow your tracks. Marketers want to know what they can sell to whom. We're told here that joining the wrong facebook group can raise a red flag for employers checking applicants. Personally I stay away from facebook. They keep changing the privacy settings every few months, specifically so they can sell customers' data, and have sold everything, your kids' photos and all, to a Russian based search engine. You can't get it back. And they put zombie cookies on your computer, that reappear after you disinfect it.

The author warns that even searching on health apps can ruin your chance of getting health insurance. Because the firms don't want to insure sick people, do they? The author says one percent of patients account for a fifth of health care costs. My recommendation is to use a search engine just to go to Wikipedia. You may find an article on your health issue via in-site searches, with links underneath to actual medical research papers. All Google knows, we hope, is that you went to Wikipedia. There's also a search engine that does not record your searches: DuckDuckGo.

Target, a US retailer, went after tracking pregnant customers. They compared purchases made by those who had signed up for a baby club with the general run of customers, and where correlations existed they sent pregnancy and baby related ads. Some customers found this creepy. I would. The author says the process was revealed by a journalist, Charles Duhigg, in 2012. Target stopped talking to Duhigg. A couple of years later, hackers stole the stored details of 110 million customers. Other firms just sell all your data and someone else can easily combine two lists, the author points out, say of cancer patients or marriage counselling sessions and credit ratings.

To help us keep terms clear, we're given a useful table early on: first-party tracking is a person using a fitness app or home finance software. Second-party tracking is Amazon noticing you buy a diet book, or a supermarket logging your purchases in a database. Third-party tracking is when an intermediary logs data about what you have done, like a search engine or a credit card company. Fourth-party tracking is a broker buying up lists of data about someone else's customers, and typically combining lists.

The unknown data being stored and used can, we are told, lead to one luxury purchase adding to the prices of the next goods we search for online. My thought; shop with cash in a real shop and don't use a store card. We don't even have RFIDs mentioned. Media scholar Joseph Turow says firms decide on data whether we are targets to them, or 'waste'. Nice. Reminds me of the 'dead peasants' life assurance stealthily taken out by a retailer on its workers and revealed by Michael Moore in the film 'Bowling For Columbine'.

Workers are routinely under surveillance and sign away all rights in this matter. And the most-watched jobs can then be done by automation. Hiring is a hugely automated process to weed out applicants; I like that Barbara Ehrenreich is referred to when she went undercover at Walmart in 'Nickel And Dimed'. Algorithms are not explained to users and may just be tracking information from past users, not programmed to discriminate in any way.

I note that so far there is not a mention of terrorism. Isn't that what surveillance is supposed to be about?
We next get government (US) involvement in surveillance, but on its own citizens at protest marches and Occupy Wall Street movements. "$150 million of taxpayer money going to equip a government facility in lower Manhattan where Wall Street firms, serially charged with corruption, get to sit alongside the NYPD and spy on law abiding citizens." To this end, the government may have limits upon its own data gathering powers, so it just buys extra data from data brokers to build complete portraits.
Then we get a look at anti-terrorism. Odd that the men involved in the largest terrorist attacks on the USA came from and were funded by countries like Saudi Arabia and Pakistan, neither of which was among the seven countries just blocked from travel to the USA by President Trump. Nor was Afghanistan. Wouldn't you think his computers would tell him which countries to block?

For crime investigation, we're told Reuters discovered in 2013 that tips were being passed from the NSA - which could have got them anywhere, facebook or Google - to the DEA which gave them to the IRS. The Special Operations Division of the DEA retrospectively fabricated reasons for investigating their targets. This is real black box work. Police and security are increasingly sharing information. And "many security officials go on to lucrative private sector employment." Edward Snowden leaked extensive files in 2013 proving that the security services worked with or otherwise used, telecoms services and big internet sites like facebook and Google.

For those concerned with privacy, we are warned that hackers can steal files from big stores, while your careful privacy settings can be changed by the companies you use; and going to the extreme of using Tor as a browser may flag you up as someone to watch. My thought; not to mention you may be revealing something of yourself to bad company on the dark net. While the very rich routinely hide their money in places like Panama, even the recent Panama Papers breach proved that nothing is safe. "James Henry, a senior adviser to the Tax Justice Network, calculated the total amount of money hidden away from tax authorities as between $21 and $32 trillion."

All this before Chapter Three. The author tells us that he's in favour of terrorist plots and super-rich tax evasion being uncovered, he just dislikes all the marketing. Read on and decide what you agree or disagree with; what scares or delights you or creeps you out about the issues.

Apple and apps, Google and SEO rankings, Twitter trends, customised web searches (which means you get different results if the search engine knows who you are, rather than in an internet cafe), the sheer scale of enterprises like Google or Amazon. And don't forget facial recognition, barely mentioned. All this and more. Like content providers charging more in future if you access someone else's content, or making the access slower. But we read that like old pirate radio, the originally illegal download sites such as Youtube have been cleaning up their acts and protecting copyright. Legislation is trying to keep up with tech. Financial dealings and failures are looked at in another chapter, ever more murky. Including derivatives; bets on whether a share price will rise or fall, without investing in the shares. Gambling. "In 1994 Orange County, California.... (went) bankrupt after its treasurer lost $1.7 billion of its $7.4 billion investment portfolio in derivative bets." You what?

To be fair, the author doesn't just complain. He suggests ways of regulating, watching and fining the watchers. Of forcing them to share with you what they know about you and where they have sold it. Of checking how much influence any giant company has on the marketplace and the government.

Some issues don't seem to apply to me in Ireland, concerning the First Amendment and so on. But where are these companies headquartered? Nowhere in the world, if you ask Apple, which was recently judged by the EU to have fraudulently claimed a non-existent head office in order not to pay tax. Most tech giants though, seem to live in America. American health care fraud, which included patients being given treatment they didn't need, inflated time charges and imaginary patients, could really apply anywhere. Computer systems are flagging inconsistencies. We are told that the US "fell to 47th in Reporters Without Borders' Press Freedom Index," largely due to harassment of investigations of factory farming and the Wall Street protests. This matters to the world. High finance companies siphoning off money from every transaction you make, or fund you invest in, and interest as well, mean that "the New Economics Foundation calculated that leading London bankers 'destroy £7 worth of value for every pound in value they generate.'" I'm sure it's not just in London.

The author proposes that surveillance should be used, not against the law abiding, but against corporate greed and waste. I support that view. This is a fascinating book which will reward a read. You can find some of the same content elsewhere but this is comparatively easy to read and doesn't get bogged down on any one topic for too long. I noticed no mention of ANNs or artificial neural networks, also called AI or artificial intelligence (which is mentioned in a later note). I don't think they are outside the scope of a book on black boxes and indeed are the next step.

Notes occupy pages 211 - 306, index to 311. I saw 14 names I could be sure were female in the index. I counted 127 names I could be sure were female in the body of the detailed notes and references. Pasquale cites many journalists but is himself a Professor of Law at the University of Maryland and a member of the Council for Big Data, Ethics and Society.
Leif
1,968 reviews · 104 followers
December 31, 2017
I wasn't thrilled.

Let's get this out of the way first: Pasquale's got his head squared on his shoulders the right way. The topics are engaging and genuinely pressing areas of public policy, and he uses examples from history that are illustrative and helpful. Issues like algorithmic prejudice, speculative data financialization, and shadowy technocratic governance are super important and super complex, and I value any intelligent confrontation with them. So that's all here, and it's all good.

For some reason, however, I found Pasquale's writing both accessible and impenetrable. Walls of text would slip past me and I wouldn't be able to piece together a coherent narrative or story; after reading entire chapters on a good unified topic – "search", for instance – I couldn't tell you Pasquale's key contention. I highlighted, took notes, and thought about the text, but organizationally and stylistically it couldn't hold me and I couldn't hold onto it.

So that wasn't too great.

At the end of the day, read about this, yes, please. There are other introductions to the subject, however, and you might be better off looking up those.
Lars K Jensen
170 reviews · 51 followers
March 16, 2015
This book gets its title from the double meaning of the term 'black box':

It can refer to a recording device, like the data-monitoring systems in planes, trains and cars. Or it can mean a system whose workings are mysterious; we can observe its inputs and outputs, but we cannot tell how one becomes the other.


Pasquale has his sights set on the worlds of technology and finance, since it's here we find the most rampant use of algorithms. Here in Europe I imagine we are most familiar with the tech algorithms, brought to us especially by Google and Apple. Certainly the level of 'scoring' of US citizens surprised me. You guys are rated all of the time.

If you are really into the issues of algorithms and privacy, you might not get a lot of new knowledge from this book, but I feel I did. For instance, I hadn't heard of 'fusion centers' (read about them on Wikipedia) before I opened this book. The close connections between the regulators and the regulated in the world of finance were also news to me. And did you know that you might be labeled a terrorist for doing undercover reporting from some of the factory farms in the US?

I enjoyed huge parts of this book, but I have to admit that the chapter on finance/Wall Street was quite a test to get through. There are so many acronyms, terms and so on - and I found my focus slightly drifting. If you're looking for a crash course on what went wrong in 2008, this ain't it. Fortunately I had read a bit about CDOs (again, Wikipedia) so I wasn't completely blown away - but it is the weakest link in the chain of the book. I guess it's the way it has to be, since the world of finance is the way it is - and Pasquale says so himself: he won't go into who did what when the market crashed.

The finance chapter makes good sense, though, when we get to the last two chapters, where Pasquale comes up with suggestions instead of merely pointing the finger. Pasquale believes that much can be learned from the ways in which health care fraud is being fought, and that the Human Genome Project can act as a kind of role model for how many, and which, resources an oversight effort might require.

I'm not sure if you can fully enjoy/understand these chapters without the Wall Street chapter - but read it and hang in there, you'll make it, I'm sure :-)

As with all books with notes, it's a 'notes on the same page' vs. 'notes in the back of the book' discussion. Here the notes (and there are a ton of them!) are at the back of the book, which makes me... not read them. Since there are so many, I would have preferred to have them on the same page where they are referenced. That way my take-aways from this book might have been even better.

Don't let that fool you, it's still an interesting read, and if you ever touch or affect an algorithm (and we all do) you should consider reading it.


Some other reviews/mentions of 'The Black Box Society' you should definitely read:

* Steven Aftergood / Nature
* David Auerbach / Slate.com
* 'The Scoreboards Where You Can’t See Your Score' (Natasha Singer / The New York Times)
* 'Insure People Against Genetic Data Breaches' (Frank Pasquale / The New York Times, opinion piece)
Philip Mlonyeni
62 reviews · 9 followers
February 19, 2020
Absolutely worth reading. So many people point out what's wrong with the world and then just come up with half-baked kumbaya solutions; Frank, on the other hand, has two chapters detailing practical and workable solutions for how the mess we're in can be fixed. Chill, thanks for the tips.
Ian Vance
58 reviews · 7 followers
May 10, 2015
"...when powerful actors are profiting from failure, we can probably expect a good deal more of it in the future." (p. 191)

"...The grand illusion of contemporary finance is that endlessly processing claims to future wealth will somehow lead to a more productive economy... there is a good reason that these entities strive so hard to keep their methods secret: pull the curtain, and the economy's wizards look like little more than organizers of contests they'd never be able to compete in." (201)
27 reviews
June 12, 2019
Easy to read, lots of interesting facts and thoughts about the Internet, finance and the US health sector. The only thing missing is the point the author was supposed to make. The author wanders freely from finance to the internet to the US health sector and back. Invasion of our privacy is bad, but on the next page it is secrecy which is bad, then it's lobbying, and then something else. Interesting, but no coherent message.
George
27 reviews · 23 followers
August 18, 2020
Derivative work that is alarmist in how it identifies a very real problem, and either one-sided or vacant in its attempts at a prescription for the challenges.
72 reviews · 11 followers
March 31, 2020
This is a very thoughtful and scholarly book about problems caused by the lack of transparency in the tech and finance sectors and how to fix those problems. The Black Box Society addresses the problem of secrecy in the domains of finance, search engines, and “reputation” which author Frank Pasquale uses to refer to the ways that algorithms infer details about your life and your thinking based on other information gathered about you. The “Black Box” refers to the vault of information that is used to run search engines, finance, and reputation algorithms that is unavailable to the public or even to the government. The problem is how this information is used to achieve considerable financial returns for a small number of people even while manipulating and harming individuals and small businesses.

The problem of intrusive surveillance by private tech firms is now an issue with much greater visibility. We are all aware of how corporations like Facebook surreptitiously gather data about people so that the data can be used by advertisers to manipulate users into buying something or feeling a particular way about a political cause. Unfortunately, much of the writing on this problem of "surveillance capitalism" is alarmist yet resigned to the idea that society can and will acclimate to corporate surveillance without public oversight and regulation. What makes Frank Pasquale's book so outstanding is that he suggests legal and political solutions to the problems caused by secrecy in tech and finance. Moreover, he makes a compelling argument that the U.S. Government already has the capacity to limit the damage of "reputation" algorithms, search engine monopolization, and high-stakes financial arbitrage. This is a huge contrast to the many, many writers who believe that we must acclimate ourselves to the power of private and unaccountable tech and finance barons who own the "black boxes." When you know there are solutions, and you know we have the technical capacity to implement them, all you need is the political will to make it happen.

Pasquale is realistic that the political will is lacking in U.S. politics. His diagnosis of the lack of political will is brilliant. He believes that our politics is mired in a stale debate about whether the government should grow at the expense of private firms. In fact, as Pasquale states, private firms and the government are increasingly intertwined, particularly because the people who run these institutions are often the same. It is therefore not a question of "more government" versus "less business," but of whether public policy should direct both to act more in the public interest, since the public interest is not served merely by these tech and finance giants making as much money as possible.

I appreciated that Pasquale included finance along with search and reputation algorithms even though finance is not part of the tech sector to the same degree as the other two. A critic of this book could argue that including finance makes this more of a progressive activist argument for more government oversight of the economy rather than a book about the problems with secretive information technology. Ultimately, though, the book makes a compelling case that the problem of secrecy spans all three domains. Like search and reputation algorithms, finance obtains and creates complex, secretive information that it uses to make lots of money with little oversight.

I thought The Black Box Society was an insightful and hopeful book about an issue that is crucial to the maintenance of a free and open capitalistic society. I hope people read this instead of some of the more bleak and dystopian books about the topic.
Profile Image for Morgan.
868 reviews25 followers
November 18, 2017
This book, particularly the first three chapters, really made me think critically about society, technology, information, and freedom. The idea of a "black box" didn't originate with Pasquale, but he uses it in two ways: first, a keeper of information (e.g. a flight data recorder), and second, a mysterious box with inputs and outputs: computers, the human brain, cars, and so on. The problem is that these boxes, in the latter sense, are increasingly opaque and obscure what's really happening inside them. These technologies are also scary because, unlike mechanical objects, which can be stripped to their barest parts, newer technologies have no obvious parts. We are also largely unaware that we live in a black box society.

Pasquale points out the many ways our data is stored, shared, and cataloged without our knowledge or consent. Even when we do agree to certain practices, there is literally nothing preventing Big Data (as he calls it) from breaking those agreements. Facebook doesn't actually honor its privacy policies, as demonstrated by court cases he cites. Employers buy data to determine if job candidates have higher inclinations towards diabetes or other diseases that will cost the employer more in health care (and reject those applicants). Credit card companies use internet search algorithms to infer that a couple is in marital counseling, and then raise that couple's interest rates. And so on. Nearly everything we do is tracked. AND WE HAVE NO WAY OF KNOWING THIS.

There are 86 pages of notes. Pasquale uses numerous lawsuits, court cases (at many levels), peer reviewed articles, interviews, government hearings, and a variety of other sources to support his claims, so this is far from a Fox Mulder-like conspiracy theory.

This isn't all terrible--he isn't a Luddite arguing for the end of technologies. In fact, he highlights many positive aspects of improved technology in our lives. Yet there is this entire sector of the business world that is dedicated to buying, selling, collecting, harvesting, and controlling our data. Our social media usage, who we like, and what we say. Our health and medical records. Our phone calls. Our credit rating. Our shopping preferences. Target even figured out when shoppers were pregnant and began targeting (haha) their ads with pregnancy-related items.

The government is in many ways helpless, because it lacks the power to do real damage or enact any serious sanctions on Google, Facebook (primarily those), and others. Google was fined $25 million in one case, which sounds like an astronomical fine, but when the company makes more than that in a day, what does it matter? Facebook might change its policies, but that doesn't mean those policies won't change again. And how are we as consumers to know?

Pasquale's writing style is highly accessible, free of theory and academic jargon (though there is plenty of tech jargon and there are plenty of government acronyms), so it's easy to follow his points. This book offers some solutions, but overall it highlights many, many issues that we are unaware of, no matter how closely we watch our credit reports, pay attention to the news, or attempt to keep our information secure. Our lives are impacted in ways we are completely unaware of, and it's frightening to realize that. But, as the saying goes, the first step towards fixing a problem is admitting there's a problem, and in many ways this book is that first step.
Profile Image for Simon Crowe.
6 reviews6 followers
November 26, 2017
In cybernetics, a black box is a potentially complex system that is inscrutable and opaque to us but for the information that is fed into it and that which emerges from it. The book contains a considerable array of information on opaque systems and practices within credit ratings, search engines, and algorithmic trading, and sets out the flaws in these practices. Instead of proposing radical solutions, it suggests a raft of sensible policy changes that may ameliorate some of the damage done by opacity within finance and big data. In this sense, it is written in the vein of James H. Moor's influential 1980s theorising of computer ethics, in that it concerns itself with filling the policy vacuums that arise from the introduction of new technologies.

There is no doubt that there is good material here, and much-needed critical thinking on the vast and powerful institutions of big data and finance. What costs Pasquale a four-star rating is that, in my view, the introduction fails to see the wood for the trees and the body of the text fails to do the opposite. The introduction doesn't adequately set out an argument, let alone posit a compelling one, and the following chapters lose themselves in a litany of facts and accounts without much of an overarching structure.
Profile Image for Nestor.
463 reviews
December 1, 2023
This might seem like a conspiracy book to some people, though it is actually true; it describes what really happens (and has happened), and it has a lot of information and very accurate sources. We should not stop thinking about how our information is used, and we should keep our eyes and minds open.

If governments don't curb or limit the finance or IT sectors, it is not because they can't. It is simply because government employees, whether elected (politicians) or not, need the stream of money to live or to finance their campaigns, so they choose to look the other way, play ball, or lie to the public.

It's not that medicine has failed to advance since the '70s because the low-hanging fruit has already been picked; it's because Wall Street, greedy for a quick buck in the next quarter, punishes R&D investment as an easy cost-cutting strategy. R&D cost per drug rose from a few million dollars to a billion for lack of proper investment in key resources, in favor of cheaper automated, brainless methods that end up costing more, since no one is really thinking, just low-cost people behind machines pushing buttons brainlessly.

It is interesting that the same companies that collect data are themselves targets of hackers who want to learn what they know about us.
Profile Image for Paul.
43 reviews7 followers
May 15, 2021
This book collects together arguments about the challenges brought by industries that function as a black box: reputation, search, finance.

My main comment on this book relates to how it is written. It reads essentially as a series of blog posts sewn together rather than a more rigorous tour of the domain. The title gave me an expectation that the content would be more technical and focused than it actually is; instead, the writing skims across ideas quite quickly, shares some one-liners, and moves on to the next topic. I also thought the book lacked any abstraction that would (i) give the reader a framework for thinking about the topic more clearly, and (ii) appeal to readers outside the USA. On the latter point, the content is largely embedded in US news and culture, which means that as a European you really have to persevere with this book to make it to the end.

A fair introduction to some of the central problems in this area.
Profile Image for Uffe Jon Ploug.
17 reviews
August 4, 2020
Frank Pasquale raises some very important points that are worth further investigation. Sadly, he decided to add a large number of suggestions for how to solve the problems; however, they are based on his political leanings and some bias. He knows the criticism is coming: "I know, the tired rhetorical dichotomy between good old fashioned American capitalism and the evils of socialism will be wheeled out(...)", and a few pages later he assumes that those critics are paid off to hold their opinions. It is sad that an interesting topic with good basic research drowns in political rhetoric (for the critics are not offering rhetoric but good science, for the most part).
Profile Image for Jacob Naur.
61 reviews12 followers
August 11, 2018
Gives a great overview of the problems in tech and finance.

The book could have been a bit longer with more in depth explanations, but as a starting point if you are interested in the effects of digital black boxes on politics, the economy and society I do not think anyone has done a better job.

The book basically asks questions that we fail to respond to adequately at our peril. In the EU we hope the GDPR will cure some of the diseases in tech. We will see about that.
Profile Image for William Crosby.
1,394 reviews11 followers
January 16, 2019
Fascinating investigation of the algorithms used in many aspects of current life (e.g. for credit, social media, searches, personnel hiring and evaluations, the financial industry). The code behind them is usually kept secret, which means there is very rarely any means of appeal, or of pointing out that a decision or ranking made by a computer was wrong or based on inadequate measures.
Profile Image for Sarang Shaikh.
Author 3 books4 followers
April 21, 2018
“Facebook defines who we are, Amazon defines what we want, and Google defines what we think.” A complete book for understanding how machine learning, big data, and artificial intelligence will revolutionize our lives in the near future, raising data privacy, cybersecurity, and bias issues.
6 reviews2 followers
October 20, 2020
Somehow I couldn't get into this book. The subject is very interesting and is even close to what I study on a day to day basis. I especially liked learning about black boxes used in the financial sector. But, it was a struggle for me to get through this one.
Profile Image for Ryan Maynard.
28 reviews3 followers
April 3, 2023
Working in software and infosec, the topic isn't a new one to me. That said, it was an excellent resource for crafting the questions we need to be asking ethically in our society and political landscape.
Profile Image for Nathaniel Hendrix.
16 reviews5 followers
June 3, 2017
Disappointing. Proceeds as a series of anecdotes with commentary, rather than having any sort of overarching story to it.
Profile Image for Andy Oram.
622 reviews30 followers
September 17, 2016
This book is both a litany of problems caused by the unethical use of data and a call for reform; both are useful. The first 140 pages of the book boil down published news and technical reports about credit ratings, privacy violations, biased search engine results, rogue spies, and similar issues in data mining. These chapters will furnish a review for people who have kept up with such trends for legal, policy, or technical reasons, but will probably provide all readers with new information as well.

The book really unveils its unique value in Chapter 5, which suggests practical ways to fix (or at least ameliorate) the problems. As a lawyer, Pasquale concentrates on social and policy issues, so he doesn't explore the technical aspects of deploying algorithms, such as the use of genetic algorithms that take them even further from human values. Among Pasquale's ideas worth discussion are Institutional Review Boards for data brokers (on the model of IRBs for academic research), a requirement that reputation brokers reveal some of their criteria (which would rule out crude and biased criteria), making immutable logs of the database queries scoured by government spies in search of terror suspects, and requiring a regulator to approve new financial instruments.

I like one of his fundamental principles: if an algorithm is too complex to be understood by reasonably intelligent people, it's too complex to use. More fundamentally, he persuasively challenges the notion that the processes used by online firms and brokers to rate and classify people should be opaque and protected from scrutiny by trade secrets or other intellectual property rights. I also like his calls for directing our resources away from tracking citizens and toward tracking financial and Internet firms, and his suggestion that computing power could help direct society's resources toward beneficial use.
11 reviews
February 19, 2017
While I can't say that I agree with everything in the book, it is most definitely eye-opening and suspicion confirming at the same time. I feel that folks who want to be informed of things in our rapidly changing society ought to engage with the ideas presented in this book.

In particular, I think some focus on the issues here rather than some of the other circuses we've focused on in the past couple of years would be beneficial in many ways for our nation.
85 reviews2 followers
June 20, 2020
The Black Box Society is such a great book! I happen to agree with most of the concerns raised by the author. Many people overlook the dangers of misuse of technology, but you don't need to go very far back in history to see that we as a society do not have a good track record of using powerful technologies responsibly.
Lots of the information presented was new to me, and I now appreciate books which meticulously list their references at the end!

We do not have an easy way to find and follow up on how our data is filed, stored, and sold. Take the example of a person whose credit report erroneously stated that she was convicted on drug charges: she ended up being denied not only jobs and apartments but even credit for buying a dishwasher. What this highlights is that for every person who is actually aware of the errors in their "credit files," there are thousands who remain oblivious to the notion that such a thing could go wrong. And it might be costing them hundreds of dollars and worse interest rates.

Another thing that comes up: we as users/customers completely open up our lives, what we search, what we buy, where we go, what our likings are, and these data points help the giant companies build a profile of us, which is then sold. The irony is that these companies are completely opaque to us about how this data is used and to whom it is sold.

Lastly, if you are inclined to believe that regulators can help: so far they have been disappointing. They don't have enough resources, first of all. An annual million-dollar budget to oversee a bunch of hedge funds is not going to help stave off another financial crisis by keeping financial derivatives in check. Secondly, given the rate at which things are becoming complex and sophisticated, regulators will often be left behind, and companies cannot be relied upon to self-regulate, can they?
674 reviews18 followers
September 14, 2016
Have you ever wondered why your credit limit differs from that of a friend who earns around the same? Why do creepy ads follow your social media and email (Facebook/Gmail)? What are the drawbacks of taking your activity online, whether it is buying medicines or other shopping? Why do technology companies insist on full data/rights/cookies from you but protect their own trade secrets? Why is your resume getting no interview calls (hint: this could come down to a simple tweak)? In this book covering multiple domains, Frank makes a passionate case for why we should care about the impersonalization of decisions and the need for more human judgement and regulation in some areas to avoid bias. While we may not presently be able to 'code for neutrality', this is where we may need to increase human intervention or localization, and perhaps reduce jobless growth. A book which left me thinking long after putting it down.
Profile Image for Keith Dodds.
6 reviews4 followers
July 22, 2020
A must-read for those looking into the present and future of Artificial Intelligence and the many dystopian results that await us (and are already upon us). With a particular focus on the financial services industry but going way beyond it, Frank Pasquale outlines the mad schemes driven by the same profit motive that brought us the 2008-2009 financial catastrophe, from which the world's banks have learned nothing... except to double down on profits for the next one.

Profile Image for Jen Watkins.
Author 3 books23 followers
July 17, 2015
I got rather tired of the google-bashing, but overall this is an important book to internalize. Here is a question. On page 114, Pasquale writes, "Just as an unduly high credit score could help a consumer get a loan he had no chance of paying back...". Don't credit scores determine your interest rate, not the amount of the loan you qualify for?

Profile Image for Hannah.
112 reviews5 followers
February 20, 2015
Pasquale walks the line between alarmist and wolf-crying. While he makes many good points about the rise of algorithms, his blatant leftist political views may make them harder to swallow for some. The selection of topics (and at times their treatment) can feel scattershot, but the writing is mostly peppy and well-paced.
Displaying 1 - 30 of 35 reviews
