Suppose that one day, you purchase a new GPS for your car. You get behind the wheel, punch in your destination, and set off on the open road. Hours pass, but when you arrive, you find the GPS did not take you where you intended to go. Thinking this a temporary glitch, you reenter your destination and try again, only to meet the same result. No matter how many times you try, the device never takes you where you want to go.
It is with this analogy that James Williams, a former Google strategist, begins Stand Out of Our Light. Williams argues that our modern technologies are akin to this incompetent GPS. He asks us to think of the goals we have for ourselves and then question whether Apple or Meta or Google share those goals for us.
Further, the book looks in depth at the ideas and implications of existing within the ‘attention economy,’ where money is generated not by the production or exchange of goods, but by getting as many people as possible to look at your product for as long as possible. Williams does not merely examine the impact of this shift on the micro scale of the individual but broadens his lens to the macro scale of nation-states. What does the attention economy mean for mature democracies?
Taking inspiration from the ancient story of Diogenes and Alexander the Great, in which the conqueror offers Diogenes anything his heart desires and is asked, in return, only to stand out of the philosopher's light, Williams divides one's attention into three 'lights.'
The first, the spotlight, is what allows you to sit down and attend to the tasks before you. When you say you're going to study for two hours but instead find that you watched YouTube the entire time, your spotlight has been obscured.
The second, starlight, allows you to navigate your life based on your goals and values. In the same way that age-old navigators traveled the seas by starlight, so too do we navigate our lives in the light cast by our values.
The final and most important level of attention is daylight. Only in the light of day can you define your values; it requires a deep level of attention to figure out what you want to want. Our notification-heavy lives inhibit our ability to reflect on our goals and priorities. Forty years ago, a long sit in the waiting room might have caused us to question where our lives were headed. Nowadays we do not have such problems.
It is the obfuscation of our daylight, Williams argues, that most splinters the foundations of our democracy. He discusses the online tendency toward anger-inducing content (writing in 2018, he did not yet have access to the pertinent term ‘ragebait’).
While anger itself is not bad, it is only helpful when channeled into meaningful reform. The argument you have with your spouse is only helpful if you both walk away understanding how to be a better partner. Contentious congressional debates are only valuable when they result in better representation. Flame wars (as we used to call them) and TikTok virtue signaling never result in reform.
As a society loses its daylight and its understanding of its own values, it becomes increasingly difficult for its members to unify around common goals. Users are no longer citizens of a given country but are instead subdivided into microcultures. A cursory glance at our modern politics shows the consequences of such subdivisions.
In the book's closing pages, Williams provides refreshingly concrete solutions to these problems. Without claiming them as panaceas, he discusses four types of changes necessary to ensure that these technologies do not leave us attentionally destitute.
The first is a rethinking of the nature and purpose of advertising.
Second, concepts around the use of technology must evolve. Viewing the use of technology as analogous to addiction inevitably places a moral burden on the user, rather than an ethical burden on the trillion-dollar companies doing their best to manipulate them.
Third, policy must be advanced that changes the determinants of design. Williams argues that technologies should be graded on how well they help users achieve their goals. Distraction technologies like social media could receive low scores and be fined until they bring their products into alignment with their users' goals.
Finally, Williams suggests that tech companies must use the information they are already gathering to help users rather than fetter them. Facebook already knows if you have an addictive personality, if you're part of an at-risk group, etc. It should be required that the algorithm recognize these facts and steer users away from content which may cause them to spiral. If we are willing to regulate advertising to children, shouldn't we be willing to regulate advertising to the child in us?
Some may argue that our technologies still give us more than they take away. Social media connects us to people we might otherwise never have met. Aren’t these tradeoffs worth a little fractured attention?
No, not in Williams’ view. The idea that our technology would be adversarial to us in the slightest is untenable. If you picked up a hammer and it occasionally burned your hand, you would not call it a good tool just because it sometimes got your nails into the wall. Like the GPS in the opening example, our technology has an obligation to us: it must take us where we want to go.