
System Error: Where Big Tech Went Wrong and How We Can Reboot

A forward-thinking manifesto from three Stanford professors—experts who have worked at ground zero of the tech revolution for decades—which reveals how big tech’s obsession with optimization and efficiency has sacrificed fundamental human values and outlines steps we can take to change course, renew our democracy, and save ourselves.

352 pages, Hardcover

First published September 7, 2021


About the author

Rob Reich

8 books · 14 followers

Ratings & Reviews



Community Reviews

5 stars: 144 (28%)
4 stars: 191 (38%)
3 stars: 126 (25%)
2 stars: 28 (5%)
1 star: 8 (1%)
Displaying 1 - 30 of 68 reviews
Profile Image for Will Byrnes.
1,327 reviews · 121k followers
September 8, 2022
Technologists have no unique skill in governing, weighing competing values, or assessing evidence. Their expertise is in designing and building technology. What they bring to expert rule is actually a set of values masquerading as expertise—values that emerge from the marriage of the optimization mindset and the profit motive.
--------------------------------------
Like a famine, the effects of technology on society are a man-made disaster: we create the technologies, we set the rules, and what happens is ultimately the result of our collective choices.
Yeah, but what if the choices are not being made collectively?

What’s the bottom line on the bottom line? The digital revolution has made many things in our lives better, but the changes have come at considerable cost. There have been plenty of winners from the digitization of content, the spread of the internet, the growth of wireless communication, and the rise of AI. But there have been battlefields full of casualties as well. Unlike actual battlefields, like those at Gettysburg, many of the casualties in the battles of the digital revolution did not enlist and did not get a chance to vote for or against those waging the war, a war that has been going on for decades. We, the citizens, do not get a say in how that war is waged, what goals are targeted, or how the spoils or the costs of that war are distributed.

Reich, Sahami, and Weinstein - image from Stanford University

In 2018, the authors of System Error, all professors at Stanford, developed a course on Technology, Policy, and Ethics. Many technical and engineering programs require that ethics be taught in order to gain accreditation, but usually those are stand-alone classes, taught by non-techies. Reich, Sahami, and Weinstein wanted something more meaningful, more a part of the education of budding computer scientists than a tick-the-box required course. They wanted the teaching of the ethics of programming to become a full part of their students’ experience at Stanford. That was the source for what became this book.

They look at the unintended consequences of technological innovation, focusing on the notions of optimization and agency. It is almost a religion in Silicon Valley, the worship of optimization uber alles. Faster, cleaner, more efficient, cheaper, lighter. But what is it that is being optimized? To what purpose? At what cost, to whom? Decided on by whom?
…there are times when inefficiency is preferable: putting speed bumps or speed limits onto roads near schools in order to protect children; encouraging juries to take ample time to deliberate before rendering a verdict; having the media hold off on calling an election until all the polls have closed…Everything depends on the goal or end result. The real worry is that giving priority to optimization can lead to focusing more on the methods than on the goals in question.
Often blind allegiance to the golden calf of optimization yields predictable results. One genius decided to optimize eating, so that people could spend more time at work, I guess. He came up with a product that delivered a range of needed nutrients in a quickly digestible form, and expected to conquer the world. This laser focus managed to ignore vast swaths of human experience. Eating is not just about consuming needed nutrients; there are social aspects to eating that somehow escaped the guy’s notice. We do not all prefer to consume product at our desks, alone. Also, eating should be pleasurable. This clueless individual used soy beans and lentils as the core ingredients of his concoction. You can guess what he named it. Needless to say, it was not exactly a marketing triumph, given the cultural associations with the name. And yes, they knew, and did it anyway.

There are many less entertaining examples to be found in the world. How about a social media giant programming its app to encourage the spread of the most controversial opinions, regardless of their basis in fact? The outcome is actual physical damage in the world, people dead as a result, democracy itself in jeopardy. And yet there is no meaningful requirement that programmers adhere to a code of ethics. Optimization, in corporate America, means optimizing profits. Everything else is secondary, and if this singular focus produces negative results in the world, that is not their problem.

How about optimization that relies on faulty (and self-serving) definitions? Do the things we measure actually measure the information we want? For example, there were some who measured happiness with their product by counting the number of minutes users spent on it. Was that really happiness being measured, or maybe addictiveness?
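To make the proxy problem concrete, here is a purely hypothetical sketch (not code from the book or from any company) in which a metric named for "happiness" only ever counts minutes of use, so it rewards addictiveness as readily as satisfaction:

from dataclasses import dataclass
from typing import List

@dataclass
class Session:
    user_id: str
    minutes: float

def happiness_score(sessions: List[Session]) -> float:
    # The name promises "happiness"; the computation only sums time-on-app.
    return sum(s.minutes for s in sessions)

sessions = [Session("alice", 95.0), Session("alice", 120.0)]
print(happiness_score(sessions))  # 215.0 -- more minutes, "happier" user?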

Algorithms are notorious for picking up the biases of their designers. In one example of a business using testing smartly, a major company sought to develop an algorithm it could use to evaluate employment candidates. They gave it a pretty good shot, too, making revision after revision, but no matter how they massaged the model, the results were still hugely sexist. Thankfully they scrapped it and returned to a less automated system. One wonders, though, how many algorithmic projects were implemented anyway when those in charge opted to ignore the downside results.
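Here is an illustrative toy model of how that happens, assumed rather than taken from the company in question: a screener "trained" on historical hiring decisions learns to penalize any resume keyword correlated with gender, with no sexism ever written into the code.

from collections import Counter

# Hypothetical historical data: (resume keywords, was_hired). Past hires skewed male.
history = [
    ({"chess_club"}, True), ({"chess_club"}, True),
    ({"womens_chess_club"}, False), ({"womens_chess_club"}, False),
    ({"debate_team"}, True), ({"womens_debate_team"}, False),
]

def keyword_scores(history):
    # Score each keyword by the hire rate of past resumes containing it.
    hires, totals = Counter(), Counter()
    for keywords, hired in history:
        for kw in keywords:
            totals[kw] += 1
            hires[kw] += hired
    return {kw: hires[kw] / totals[kw] for kw in totals}

print(keyword_scores(history))  # the "womens_*" keywords score 0.0: historical bias reproduced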

So, what is to be done? There are a few layers here. Certainly, a professional code of ethics is called for. Other professions have them and have not collapsed into non-existence: doctors, lawyers, and engineers, for example. Why not programmers? At present there is no single, recognized organization, like the AMA, that could win universal adherence to such a requirement. Organizations that accredit university computer science programs could demand more robust inclusion of ethical material across course-work.

But the only real way we as a society have to hold companies accountable for the harm already inflicted, and the potential harm new products might cause, is via regulation. As individuals, we have virtually no power to influence major corporations. It is only when we join our voices together through democratic processes that there is any hope of reining in the worst excesses of the tech world, or working with technology companies to come to workable solutions to real-world problems. It is one thing for Facebook to set up a panel to review the ethics of this or that element of its offerings. But if the CEO can simply ignore the group’s findings, such panels are meaningless. I think we have all seen how effective review boards controlled by police departments have been. Self-regulation rarely works.

There need not be an oppositional relationship between tech corporations and government, despite the howling by CEOs that they will melt into puddles should the wet of regulation ever touch their precious selves. What a world: what a world! A model the authors cite is transportation. There needs to be some entity responsible for roads, for standardizing them, taking care of them, seeing that rules of the road are established and enforced. It is the role of government to make sure the space is safe for everyone. As our annual death rate on the roads attests, one can only aim for perfection without ever really expecting to achieve it. But, overall, it is a system in which the government has seen to the creation and maintenance of a relatively safe communal space. We should not leave to the CEOs of Facebook and Twitter decisions about how much human and civic roadkill is acceptable on the Information Highway.

The authors offer some suggestions about what might be done. One I liked was the resurrection of the Congressional Office of Technology Assessment. We do not expect our elected representatives to be techies. But we should not put them into a position of having to rely on lobbyists for technical expertise on subjects under legislative consideration. The OTA provided that objective expertise for many years before Republicans killed it. This is doable and desirable. Another interesting notion:
“Right now, the human worker who does, say $50,000 worth of work in a factory, that income is taxed and you get an income tax, social security tax, all those things.
If a robot comes in to do the same thing, you’d think we’d tax the robot at a similar level.”
Some of their advice, while not necessarily wrong, seems either bromidic or unlikely to have any chance of happening. This is typical of books on social policy.
…democracies, which welcome a clash of competing interests and permit the revisiting and revising of questions of policy, will respond by updating rules when it is obvious that current conditions produce harm…
Have the authors ever actually visited America outside the walls of Stanford? In America, those being harmed are blamed for the damage, not the evil-doers who are actually foisting it on them.

What System Error will give you is a pretty good scan of the issues pertaining to tech vs the rest of us, and how to think about them. It offers a look at some of the ways in which the problems identified here might be addressed. Some entail government regulation. Many do not. You can find some guidance as to what questions to ask when algorithmic systems are being proposed, challenged, or implemented. And you can also get some historical context re how the major tech changes of the past impacted the wider society, and how they were wrangled.

The book does an excellent job of pointing out many of the ethical problems with the impact of high tech, on our individual agency and on our democracy. It correctly points out that decisions with global import are currently in the hands of CEOs of large corporations, and are not subject to limitation by democratic nations. Consider the single issue of allowing lies to be spread across social media, whether by enemies foreign or domestic, dark-minded individuals, profit-seekers, or lunatics. That needs to change. If reasonable limitations can be devised and implemented, then there may be hope for a brighter day ahead, else all may be lost, and our nation will descend into a Babel of screaming hatreds and kinetic carnage.
For Facebook, with more than 2.8 billion active users, Mark Zuckerberg is the effective governor of the informational environment of a population nearly double the size of China, the largest country in the world.

Review first posted – January 28, 2022

Publication dates
----------Hardcover - September 21, 2021
----------Trade paperback - September 6, 2022




This review has been cross-posted on my site, Coot’s Reviews. Stop by and say Hi!

=============================EXTRA STUFF

Links to Rob Reich’s (pronounced Reesh) Stanford profile and Twitter pages
Reich is a professor of political science at Stanford, co-director of Stanford’s McCoy Center for Ethics, and associate director of Stanford’s Institute for Human-Centered Artificial Intelligence

Links to Mehran Sahami’s Stanford profile and Twitter pages
Sahami is a professor in Stanford’s School of Engineering and Associate Chair for Education in the Computer Science Department. Prior to Stanford, he was a senior research scientist at Google. He conducts research in computer science education, AI, and ethics.

Jeremy M. Weinstein’s Stanford profile

Jeremy M. Weinstein went to Washington with President Obama in 2009. A key staffer in the White House, he foresaw how new technologies might remake the relationship between governments and citizens, and launched Obama’s Open Government Partnership. When Samantha Power was appointed US Ambassador to the United Nations, she brought Jeremy to New York, first as her chief of staff and then as her deputy. He returned to Stanford in 2015 as a professor of political science, where he now leads Stanford Impact Labs.

Interviews
-----Computer History Museum - CHM Live | System Error: Rebooting Our Tech Future - with Marietje Schaake – 1:30:22
This one is outstanding and in-depth
-----Politics and Prose - Rob Reich, Mehran Sahami & Jeremy Weinstein SYSTEM ERROR with Julián Castro and Bradley Graham – video - 1:02:51

Items of Interest
-----Washington Post - Former Google scientist says the computers that run our lives exploit us — and he has a way to stop them
-----The Nation - Fixing Tech’s Ethics Problem Starts in the Classroom By Stephanie Wykstra
-----NY Times - Tech’s Ethical ‘Dark Side’: Harvard, Stanford and Others Want to Address It
-----Brookings Institution - It Is Time to Restore the US Office of Technology Assessment by Darrell M. West
-----New York Times - Feb. 3, 2022 - A Change by Apple Is Tormenting Internet Companies, Especially Meta By Kate Conger and Brian X. Chen - Apple allowing people to opt out of advertiser tracking having an impact

Makes Me Think Of
-----Automating Inequality by Virginia Eubanks
-----Chaos Monkeys by Antonio Garcia Martinez
-----Machines of Loving Grace by John Markoff
Profile Image for BlackOxford.
1,095 reviews · 69k followers
October 21, 2021
Name Yer Poison: Corporate Greed or Political Incompetence?

According to the three authors from Stanford University, America in particular and the world in general faces a stark choice. Either we allow Silicon Valley entrepreneurs and venture capitalists to create a technocratic dystopia of mass delusion and surveillance; or we create an alternative, equally dystopian, bureaucratic regime of corporate regulation to stop these greedy people producing the economic and social externalities that are now becoming overwhelmingly apparent. Not much of a choice really. But the authors want us to remain optimistic.

The three agree that the issue is about values and the sacrifices in some values that have to be made in order to promote others. They clearly don’t like the way those trade-offs are being made today. The technocrats only care about making money. Meanwhile the externalities caused by their success hurt enormous numbers of people, particularly those already on the margins of society. They suggest that the techies and tech-supporters in charge must develop a conscience about these externalities and act to mitigate them. They also would like the rest of us who are not involved in the industry to get back to thinking about what the common good really means. But mostly they insist that the government take action to rein in the capitalist excesses and to improve governmental technological skills. Really they’re writing about a reboot of an entire society not just an industry.

At times, the authors sound like they want to blow up capitalism. At other points, they appear to believe in the potential of a divinely omniscient democratic government to rationally sift through the intricacies of arcane technology to identify and address potential issues. And they surely want all of us to become au fait with the things that they think are most important about the technologies. But when it comes to explicit actions that might be beneficial, they get really vague, not to say puerile. In their hand-waving and flag-waving about values, trade-offs, and “harnessing technological progress to serve rather than subvert the interests of individuals and societies,” they bring little of significance to light.

They want debate, for example. I’m not at all clear what this debate would be about, who would organise and participate in it, or where it would take place. They want some sort of governing body for coders and engineers, something like a high-tech American Medical Association that would grant licenses to practice, or at least create and enforce codes of ethics. They want regulators and prosecuting attorneys who aren’t intimidated by the political power of big tech companies. They want to stop self-regulation, increase data-protection, promote stakeholder capitalism, severely restrict insider share-dealing, pursue anti-monopoly suits relentlessly. But they provide few details about the who, what, and where of any of this.

In the final chapter, the authors flip from concerns about grasping capitalism to concerns about inept democratic politicians, agencies, and institutions. They seem to implicitly recognise that the externalities, the unintended consequences, of government intervention in the industry are also real. “Despite our enthusiasm for the role of democracy in governing technology, our democratic institutions do not always inspire much hope,” they say. This is where they get a bit more specific. They would like to see the Office of Technology Assessment recreated at the presidential level. But can they or anyone else really believe that such a governmental body would be able to anticipate, much less direct or positively influence, the work of tens of thousands of tech entrepreneurs and their backers, to say nothing of enormous established corporations? Among other difficulties, the revolving door would have to have an enormous capacity!

In short, the book has nothing new to say and nothing old that is worth saying again, aside from a few self-justifying war stories. Joint efforts like this often seem to sink to a level of prosaic mediocrity; this could become a classic of the genre. Or perhaps, as members of the Stanford faculty, they feel hesitant about biting down too hard on the hand that feeds them. Their employer is not only physically at the heart of the problems they want us to know about, it also receives a great deal of funding from the folk creating those problems. And by the way, didn’t these guys, along with their colleagues and students, help to create these problems in the first place? So perhaps a certain ambiguity and frivolity is prudent. The faculty lounge will remain calm. The Stanford legacy committee will continue to pull in (and earn) big bucks. And no doubt the students will continue to sign up for their classes without fear of being typecast as intelligent social parasites. So as a result of this little bit of lightweight virtue-signalling, nothing will change.
Profile Image for Marks54.
1,433 reviews · 1,180 followers
October 18, 2021
Did you hear the one about the three Stanford professors who taught a popular course on business and technology ethics at the Stanford Business School? Mehran Sahami, Jeremy Weinstein, and Rob Reich did just that, and the resulting book is “System Error: Where Big Tech Went Wrong and How We Can Reboot”. This is apparently a popular course in the business school. I was left wondering why. It is doubtful to me that many MBA students will suddenly change their tunes, adopt the more principled, regulation-tolerant, and less self-enriching tone of the class (at least if the book is indicative), and go apply their VERY EXPENSIVE education to serve the public good or work for government, at reduced pay of course.

I am sure there is much more to the class than revealed in the book, which would likely serve as a principal reading for an earlier session of the class. There would have to be more. The book is a series of topical chapters, each concerned with some central topic related to the role of technology and tech businesses in the economy and broader society - oh yes, and with the vast sums to be earned from all of the new technology products. Chapters also concern the reactions of society to both the oversized compensation in the industry and the threats to privacy, freedom, and societal values posed by the growth of “big tech”. But in a short book, only the surface is skimmed on these issues and how to think about them. Meanwhile, students in the class would have other classes to attend, along with the fall interview season, and the need for continuing networking. What’s a student to do? At least they can be aware of ethical and business-society issues as they look for a new job.

So why be so grumpy? Isn’t it worthwhile to discuss these issues? Sure, but during the course of the book, I strained to keep track of who was the target in each chapter. Who is at fault for the societal threats that big tech is posing? Is it the business students, who will take the highest paying job, especially if there are good stock options? Is it the venture capitalists, who want to build up their “unicorns” quickly and sell for a big gain? Is it the managers of tech firms, whose compensation is tied to short term countable metrics? Is it the regulators who have not done their job well and left the tech sector free to do what it wants? The answer seems to be a bit of “all of the above”. These are difficult problems and it will take time and effort to build up an effective management and regulatory regime that balances the interests of multiple stakeholders.

Sure, this is a reasonable place to end up. But it also seems a bit convenient too. It is OK if we continue to participate in the system, as long as we have thought about the deeper issues - right? It seems that way, unfortunately. There is a larger literature on the ethical issues and business - society concerns of modern business. The story is much the same in many of these works. The one that comes to mind is “From Higher Aims to Hired Hands” (2010) which is a history of efforts to provide a principled ethical education at the Harvard Business School. The author is the Dean of Harvard College and the message of his history is very much consistent with what the authors of “System Error” end up with. It is frustrating but understandable.

Overall, I enjoyed “System Error” even if it was a bit frustrating. It reads quickly, and there are lots of cites for those who wish to read more.
Profile Image for Leah Hazard.
3 reviews · 6 followers
August 15, 2021
“System Error” is a remarkable opportunity for the reader to access an incredible teaching team, frankly and entertainingly walking you through some of the biggest challenges and opportunities in the big tech space. Each author (computer scientist, government practitioner, philosopher) brings a unique and complementary perspective to this work – a true testament to the power of writing in teams when done well. From the unique vantage point of Silicon Valley, the authors share firsthand perspective and anecdotes on some of the most pressing subject matters to American democracy today: algorithms, data privacy, free speech, and market regulation of our information ecosystem. Throughout, they offer us probing questions for reflection as both voters and consumers of social media, and a variety of practical, real-world solutions for action. This is a must read for all American citizens interested in the health of democracy (to clarify, that’s everyone!).
Profile Image for Stephen.
94 reviews
October 16, 2021
Underwhelmed. Lots of pointing at obvious issues and making weak recommendations.

I wished it had placed a heavier sense of social responsibility on developers, but it largely seems like the authors see the role of developers as being to simply acquiesce to the regulatory schemes that are always going to be playing catch-up with technological development, rather than proactively avoiding negative externalities.

I see measures that they gloss over, like requiring technologists to be licensed to practice by the ACM, as being important in shifting the responsibility back towards the proverbial Victor Frankensteins creating AI in the first place, akin to the professional certification requirements for medicine or law. Society is harmed far more by a doctor losing their license than a developer, and the developer almost certainly invests less in education getting to that point, so the rationale for the bar for hackers being lower fades as technology gains prominence in our lives.

I think this might be a good pedagogical tool to get computer science students engaged in digital ethics and philosophy. As direction for policymakers, it falls short.
Profile Image for Mark.
121 reviews · 10 followers
October 1, 2021
I should have known better. It really is no wonder that this book is based on the authors' entry-level US university class. It shows.

Every book on this subject coming out of US universities, and I have read so many of them by now, claims to present a unique take on contemporary technology, big tech, and the future of the Internet. However, they are all the same book. This is all you have seen before but even more oversimplified, generalized, and rooted in the certainty that lack of diversity is the biggest issue the planet is facing, in spite of the avalanche of complications we find ourselves steeped in.

The authors make a thinly veiled attempt at looking as if they are concerned with the global repercussions of the subject they are discussing, but that fades away even faster than in the average work of this genre, and from then on, it’s all about pretending that the US’s situation is the global situation, and promising the reader that their superficial and unsubstantiated answers for these direly complex systemic problems we face are what will set the world free.
Profile Image for Steve.
641 reviews · 28 followers
June 3, 2021
I loved this book. The authors explain clearly and without technical jargon how social media and artificial intelligence are impinging on our lives. They also offer potential ways to mitigate these impingements, without coming across as zealots. I had recently read and reviewed an advance reader copy of “Evil Robots” which covered similar material, but I enjoyed “System Error” much more. I recommend “System Error” for anyone who is curious or concerned about where social media and artificial intelligence are bringing us. Thank you to Edelweiss and HarperCollins Canada for the advance reader copy.
1 review · 2 followers
September 7, 2021
You know when you're reading a book that you're sad to put down and excited to have time to keep reading? That's what System Error was for me! It's full of accessible and fascinating anecdotes about the tech-driven world we live in, the rise of tech superstars, unsung hackers fighting for public good, and where we fit in as users and consumers and how we can think about the technologies we use everyday.

The authors, all Stanford professors, are accessible, empathic, and funny. They break down how we got here, how start-ups are driven, how computer scientists are taught to think, and how we (regular citizens and consumers, who care about our own privacy, money, democracy, and just general well-being) can think about the power we have in shaping where we go from here. It touches on education, data privacy, free speech, venture capital, equity, access, and much more.

Obviously we're living through the rise of the digital economy, where I assumed Big Tech and Venture Capital have all the power, but this book helped me understand that's not the case. While I'm not a computer scientist, academic, or technologist, EVERYONE should read this book about who and how we are building "solutions" to problems and start questioning whether they are actually problems that need to be solved.
Profile Image for Adam.
23 reviews · 1 follower
September 26, 2021
Caveat: I know one of the authors personally.

As a 25-year veteran of the tech industry, I can say with confidence that the authors of “System Error” have accurately captured the fundamental mindset of Silicon Valley. I agree wholeheartedly with their diagnoses of the vast impact - much but certainly not all negative - that that mindset has had on our society over the past couple of decades.

The fact is, engineering, particularly computer engineering, is all about optimization - the removal of friction to decrease cost and improve the speed, scale, and precision of any task in the world, from communications between people to delivering goods and services to making critical decisions from hiring to prison sentencing. The exponential improvements in the speed and power of computing technology have enabled us to optimize these tasks to an almost unimaginable degree.

The problem is, all too often we fail to stop and think about what we’re optimizing *for* and why. Optimization is not value-neutral - the choice of a metric or metrics can have vast impacts on society as a whole. Even the best-intended technology solutions can have massive negative impact if the wrong metrics are chosen.

Reich, Sahami, and Weinstein don’t offer simple answers to these challenges, because those answers don’t exist. They do, however, paint a clear, accessible picture of how we’ve gotten where we are, and provide some ideas as to how we can improve our situation. Strongly recommended reading for anyone in technology, policy, or anyone else who is concerned with how our society is developing.
10 reviews
December 21, 2023
Thoughtful…

Where we are in terms of social networking technologies in particular and their impact on democracy is frightening. The multiplicative effect on misinformation, irrational thinking, and misplaced anger has been tremendous and traumatic.

The book System Error thoughtfully points out the importance of democracy and legal oversight. Getting our politicians to stop focusing on lobbyists and billionaire-funded PACs, and establishing apolitical but government-funded policy advisory groups on technology and science, could be a good start to fixing the problems.

And maybe we could also get big money out of politics and start taxing the super-rich like we did in the 1950s and 1960s…
Profile Image for Kent Winward.
1,757 reviews · 58 followers
September 17, 2021
Serviceable overview of how we need to look at Big Tech. One huge gaping hole: what would happen if laws were passed that gave individuals ownership of their own data? The implications of being able to own and charge for your personal data would really disrupt big tech and represent a huge power shift, and it wasn't discussed.

Remember that property ownership from land to intellectual property is a government creation. Nothing is stopping a government somewhere from bestowing property rights for personal data on its citizenry. Nationalize and privatize to the individual the data . . . now there is a disruption.
Profile Image for Amy Cui.
2 reviews · 1 follower
January 30, 2022
A refreshing and accessible read on the big tech landscape as we currently know it. The authors provide enough history, context, and perspective to reveal what shouldn’t surprise us about where the industry has arrived, and what does give us cause for alarm. System Error invites readers to inspect our own relationships with the services and products we consume and consent to, and our roles as individuals, communities, and voters. I’m currently in the process of making all of my software engineer friends read my copy (several have already incorporated self-awareness of their own optimization mindsets into regular conversation).
60 reviews · 1 follower
December 24, 2021
Loved the first 80% of this book, which provides an excellent overview of the ways that social media and big tech are problematic, but the last 20% outlining solutions to these problems left me a bit sad and itching for more from the authors (who are prominent experts on the topic).
45 reviews
April 9, 2023
Hmm. It took on a lot. Did a good job of introducing a lot of cool topics in technology. Idk it didn’t feel very novel but maybe it was 2 years ago. It also jumped around so much that it wasn’t a very enjoyable reading experience. I guess this is my book review. Gotta go write 750 more words!
Profile Image for Lil pisso.
106 reviews · 1 follower
March 30, 2024
I liked this book, but it is ironic how the authors make good arguments for diversity in tech while simultaneously doing a poor job of including women's and POC perspectives and stories (except as victims).
44 reviews
November 24, 2023
i now want to go to stanford. sue me! but bring back and/or normalize non-fiction reading because at the very least, it’ll make you feel like an intellectual.
Profile Image for Justin Murphy.
84 reviews · 7 followers
January 30, 2023
A gift from my co-worker Adelae Esposito, because we often debate the role of technology in society.

System Error made me realize the difference between working in tech and being a technologist. It also reinforced the speed at which technology is evolving, and how far ahead the cutting edge is from the long tail of humanity.

The relationship between technology, government, and humans, and how those things manifest together into the society of the future, is something I ponder. System Error provided a point of view and evidence-backed examples that helped make that philosophical thought exercise more real.

I would recommend it to anyone who grapples with how humanity and society are going to evolve over the next few decades. A lot of the context will become outdated in the next 5-10 years. However, the questions the book poses will be relevant for much longer.
Profile Image for Richard Thompson.
2,249 reviews · 118 followers
November 29, 2021
This is a book by Stanford professors about a very important topic that should concern all of us. They discuss algorithmic decision making, privacy, surveillance capitalism, machine learning, content moderation, the social and ethical issues that arise in these areas, the failure of private enterprise to address them, and the possibilities for dealing with them through a combination of informed government oversight, reinvigorated antitrust law, and improved professional ethics among technologists. You would think that this would be a seminal book that I would be urging everyone to read, but it's not. It's a big whiff. A great opportunity lost. There's not a single new idea in this entire book. If this is the best that we can come up with from the great minds of Stanford, then we really may be going to hell in a handbasket. C'mon guys. You can do better than this.
Profile Image for Kate.
58 reviews · 2 followers
January 22, 2023
Stopped at page 101. Didn’t read anything new. The chapters are organized in less than logical ways. It basically repeats many of the known problems, yet whenever a solution is proposed it’s very vague. “The algorithm needs to be fair and look at the many biases.” — I could’ve written that too.
Profile Image for Ken Dowell.
217 reviews
December 9, 2021
Three Stanford professors who teach a course in ethics, technology and policy, offer to tell us what went wrong with big tech and how we can fix it.

Perhaps the state of the science is best expressed in this sentence: “We shifted from a wide-eyed optimism about technology’s liberating potential to a dystopian obsession with biased algorithms, surveillance capitalism, and job-displacing robots.”

Big tech is managed with an engineer’s perspective and the authors emphasize how that leads to a perpetual drive to optimize. That might mean optimizing the number of users or user engagement or, when that mentality is taken to the boardroom, optimizing profit and shareholder return. In neither case are consequences much taken into consideration. As an example, the authors point to algorithms used by social media companies that, having been developed for optimization, end up promoting the most outrageous and heinous content, because that very content snatches the most clicks.
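A minimal sketch of that dynamic, assumed rather than drawn from any platform's actual system: rank the feed purely by predicted engagement and the most outrageous item rises to the top regardless of whether it is true.

# Hypothetical posts with made-up engagement predictions.
posts = [
    {"text": "Local library extends weekend hours", "predicted_engagement": 0.04},
    {"text": "Outrageous (and false) claim about a rival group", "predicted_engagement": 0.31},
    {"text": "City council publishes budget report", "predicted_engagement": 0.02},
]

def rank_feed(posts):
    # Optimize the only thing that is measured: engagement.
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in rank_feed(posts):
    print(f'{post["predicted_engagement"]:.2f}  {post["text"]}')  # the false, inflammatory post ranks first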

One of my pet peeves when it comes to technology is the quest for quantitative measurement, a focus that can lead to the setting of goals that are geared toward the measurable, as opposed to what might really be important. “A relentless focus on what’s quantifiable within the narrow view of any organization fixated on growth doesn’t necessarily provide insight on what’s good for individuals, society and the world.”

Some solutions are offered. Privacy issues could quite simply be resolved if data sharing by the likes of Facebook and Google and Amazon would require an opt-in by users rather than the current tedious and sometimes not very transparent opt-out system.
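The mechanics of that fix are as small as they sound. In the hypothetical sketch below (names invented for illustration), the only difference between the opt-out status quo and the proposed opt-in regime is the default value of a single consent flag, which decides the outcome for every user who never touches the setting.

from typing import Optional

def may_share_data(user_choice: Optional[bool], *, default_opt_in: bool) -> bool:
    # If the user never touched the setting (None), the default decides.
    return user_choice if user_choice is not None else default_opt_in

# Opt-out regime (today's norm): silence counts as consent.
print(may_share_data(None, default_opt_in=True))   # True
# Opt-in regime (the proposed fix): silence counts as refusal.
print(may_share_data(None, default_opt_in=False))  # False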

The authors see no alternative to government regulation, certainly not self regulation. One of the reasons is that they believe a competitive marketplace is key to a so-called reboot.

I enjoyed this comment: “Automation might be threatening the jobs of blue-collar and white-collar workers alike, but the advance of AI is proving to be a boon for at least one tiny employment category: philosophers.” You have to read the book to understand how that makes sense.

If you’ve followed, even casually, the news about the giant tech organizations, this book probably doesn’t offer many revelations. And having seen two of the authors at a local literary festival, I suspect they’re better lecturers than writers. But what System Error does provide is a smart and thorough account of where the tech giants have taken us and what the alternatives are for the future.
25 reviews
February 14, 2024
This book tries to answer the question: how can we save liberal democracy from the power and influence of Big Tech? Chapter by chapter the authors explore some of the biggest social and political issues surrounding big tech companies, social media, and technological advancement, including bias in algorithmic decision making, artificial intelligence, privacy, and freedom of speech. The authors argue that the values of big tech companies (their optimisation mindset, libertarian ideals, and relentless pursuit of profit), combined with their monopoly command of technology that is essential to our everyday life, are threatening the foundation of liberal democracy. To counter this, the authors offer sensible policy prescriptions, some narrow, such as strengthened GDPR-style regulation that gives users more control over their personal data, and some designed to target systemic issues around capitalism, such as arguing for workers to get a seat on company boards.

It is hard to disagree with most of the suggestions and arguments laid out by this book, although most of it sounds familiar and has been talked about in some form or another in hundreds of liberal think pieces before. Where this book really falls down, however, is in failing to capture how different this current moment feels from, say, the era of the nineteenth-century robber barons. Few people were questioning whether the railroads were good for our wellbeing. No one worried whether oil drilling was a threat to free speech.

What frustrated me was when this book dismissed the popular documentary The Social Dilemma, and its idea that social media is an engineered form of addiction, as sensationalist, and labelled its call for non-engagement with these technologies as "extreme". This grossly misses the point of what the documentary was trying to communicate. At no point does this book question the very value of social media, or whether the services Meta offers warrant our spending 3+ hours a day on its platforms and giving up vast quantities of our personal data. Because the book refuses to reason with this argument, I think it fails to understand how we can reboot away from big tech.
Profile Image for Scott Johnson.
413 reviews · 10 followers
February 6, 2022
This was a depressing read because of how accurately it portrays the toxicity of startup culture and how it echoes what I've been saying for years about how business has become all about min-maxing short-term returns rather than planning for long-term sustainability.

There was far more about the wider-reaching externalities of the tech industry than I expected, but that wasn't a bad thing. I guess I was just craving more of that validation of how I feel about my career and that nonsense I've run into that directly follows from the VC mindset. There was definitely some of that, but I expected more of it.

There wasn't as much repetition as you normally get in this sort of book (at least in my recent experience). Perhaps that was a benefit of having three authors (and surprisingly, despite that, it still felt coherent and had a single "voice" throughout).

It's a must-read for everyone in my life who doesn't understand why I can be making an honestly sacrilegious amount of money but hate everything about what I do. That in and of itself is a symptom of the underlying cause behind a lot of this: that we have culturally upheld financial success as the pinnacle to strive for, at the expense of all else. That so many can't understand the idea of prioritizing empathy, social responsibility, and personal fulfilment over optimizing earnings is the entire point of this book.

EDIT: Looking at other reviews after posting mine (something I always do to remain as objective as I can) after noticing this only had an average of like 3 stars (which is basically zero on goodreads), I really should have expected the shill 1-star ratings from the libertarian twats that are the heart of this broken industry.
15 reviews
October 2, 2021
I was looking forward to analysis of online and technology culture, perhaps some insight into the mechanisms and motivations behind how big tech works and evolved. I was particularly looking forward to a treatment of the subject from the three different perspectives of the authors.

What I found in this book amounted to a set of rather good introductory lectures about the subject but very little, if any, substantial analysis. The authors provide many lists of examples, identifying any number of potentially interesting case studies and scenarios. I kept waiting for one to be picked up and examined, perhaps in terms of how preceding industries or media set the ground for the present system, an examination of the circumstances and motivations of the players, or perhaps a contrast between the “Silicon Valley approach” and any other.

For example, the first part of the book criticises the goal of optimization over consideration of the societal impact of a technology; a reasonable position. The blame for this is laid at the feet of the technologist and developer. The motivations of developers, their place in the larger organisations in which they are employed, and the financial, regulatory, or legal frameworks in which both we and the developers exist are hardly mentioned. An examination of how and why the System was in error would be most welcome.

I am a little over halfway through the book so I might update my impressions, but at this point the book is an opportunity lost.

I am open to other views so if I have missed something or in the later chapters there is more analysis, please do let me know in the comments.
Profile Image for Chris Boutté.
Author 8 books · 214 followers
February 12, 2022
I absolutely love Rob Reich’s books, and he’s taught me so much, but I held off on getting this book for quite some time. All I could think was, “Oh great. Another book to tell me why technology and social media are bad,” but I was severely mistaken. My absolute favorite types of books are the ones that lay out the full scope of a problem rather than pushing ideas and opinions on the reader, and then get the reader to ask themselves questions. That’s exactly what this book does. These problems with Big Tech are far more complex and nuanced than just needing more government regulation or controlling the spread of hate speech and misinformation. The authors present problems in each chapter and dive deep into the history and current discussions around each of the topics. I guarantee if you go into this book with firm ideas on Big Tech, you’ll leave with a completely different outlook, and that’s exactly what we need as we work on these issues together.

I was unfamiliar with the other authors of the book, but knowing Rob Reich’s style, I almost feel like Sahami and Weinstein did quite a bit of the heavy lifting with the writing. I may be wrong, but in any case, this was by far one of the best books I’ve read on the challenges we face with technology, and I absolutely recommend it.
174 reviews
July 7, 2022
The book is good at showing many different angles of each problem with big tech. It equips the reader to see through the overly simplistic ways these issues are usually presented. It lives up to the first half of its subtitle, "Where Big Tech Went Wrong".

As for the second half "How We Can Reboot": it's complicated and there are painful trade-offs to be faced. Yes, a variety of mitigation efforts are underway or being talked about. They are likely to take years while technological advances continue at great speed. Don't read this book expecting concrete solutions.

It is chock full of stories and examples. That's good in a way, but it makes it harder to follow the chain of thought. It could benefit from summaries at the end of chapters.

The authors also inserted the ideas of philosophers from Plato to Bentham to Mill. This makes for a disjointed flow. Sidebars might have helped with this.

Sentence structure is awkward. Many sentences have to be read twice. I don't think this one says what the authors intended: "Did the internet help correct the lie that Barack Obama was not born in Kenya?"

Overall, there was enough good information for me to be glad I read it but muddling through the tangled presentation was not enjoyable.
Profile Image for Matthew.
14 reviews · 2 followers
February 21, 2022
This is an incredibly insightful account of the ethics of technological advancement. A lot of the interpretations I’ve seen paint this book as a polar choice: regulation OR technocracy. I think we can have both. The biggest thing the authors are trying to help us identify is the line between quality tech advancement and placing tech money above human well-being.

Each chapter touches on ethics and philosophies, with valid case studies to support them, to highlight how tech corporations don’t necessarily take our well-being into account. Yes, there are some companies that use tech specifically for our health and wellness, but there are others (e.g., Facebook) that sell our data and sell their consumers toxic media in return. It’s a fine line, and we need to start considering more of the ethical dilemmas we’re facing in the tech movement.

This is a fascinating topic overall, and this book is well-written and knowledgeable!
