Last Friday night after my wife and I put our boys to bed, we pulled up the new Netflix documentary The Social Dilemma. Featuring a number of noted authors, scholars, tech leaders, and activists, the film helps explain the growing influence of algorithmic technology, especially in social media.
Led by Tristan Harris, a former Google design ethicist and president of the Center for Humane Technology, The Social Dilemma explores how these technologies are specifically designed to serve up a perfectly curated and addictive online world where companies profit from tracking our every digital interaction (a business model often called surveillance capitalism).
The film focuses in part on the artificial intelligence (AI) technology behind the tools that drive our social-media feeds, email platforms, and most of our “smart” devices. As Harris explains, our concerns about AI are often centered on when it will overcome our strengths and outperform us in various tasks (“the singularity”) rather than on how it has already overcome our points of weakness by fostering addiction and fueling dissent. Many of these systems control what you see in your social-media feed, when you receive notifications, and even what you type, all in order to modify your behavior, whether in what you buy or what you watch.
Wake-Up Call
The film examines how tech giants such as Google, Facebook, and Twitter are able to bend our will toward company profit by perfectly curating our online experience. This curation in turn creates social bubbles that wreak havoc on our mental health and social fabric, amplifying anxiety and group polarization.
Harvard professor Shoshana Zuboff, featured prominently in the film, surveys the negative social effects of algorithmic technology in her recent book The Age of Surveillance Capitalism. She aptly states that some of the most fundamental ethical questions of this new smart economy are “who decides” what we are exposed to, “who decides who decides” these things, and to what end such decisions are made.
The Social Dilemma is a needed wake-up call to the power and influence of algorithmic technology. It’s a film Christians should watch and engage, simply because these tools are already shaping us profoundly and, in many cases, forming Christians in decidedly unchristian ways. But in a bit of subtle irony, the filmmakers actually rely on these same tools to spread the word about the film—through social media and even the Netflix recommendation engine.
Significant Omission
The Social Dilemma is helpful in highlighting some of the fundamental ethical problems of social-media algorithms, particularly as they serve profit-driven corporations. Yet it fails to address the core problem. In the opening scene, the interviewer asks various experts a simple question: “So what’s the actual problem here?” Many respond in awkward silence as others fumble around with half-baked answers. In a moment of honesty, Harris admits there are so many problems he doesn’t know where to start.
Even though this question is posed to spark curiosity in the viewer, it encapsulates a major shortcoming of the film. The interviewed experts focus on the many symptoms associated with social media and its outsized influence, but they don’t pinpoint the underlying cause of the disease. The Christian worldview actually has the answer these leaders can’t seem to locate: the deep-seated nature of sin, which infects all aspects of humanity, including our technological tools.
In contrast to the Christian call to orient one’s life around loving God and neighbor, sin orients us around personal autonomy and serving self. The me-centeredness of sin led to the creation, and addictive popularity, of curated “iWorlds” in the first place. Now that we’re seeing the damage done by these technologies, calls for reform are rightly growing. But true change will not come until we admit these technologies did not arise and do not operate in a morally neutral vacuum—but within a pervasive environment of sin.
Many of the film’s experts cast the battle for our souls as an unfair fight, in which most of humanity is simply outmatched by the power of a few tech companies. While there is some truth to this view, we can’t abdicate responsibility and place the blame for fake news, polarization, and other maladies solely on these technologies, without acknowledging that these tools function like jet fuel poured on a society already aflame with sin.
It’s true that algorithmic technologies have the power not only to respond to our behavior but to modify it, conditioning us to act in troubling ways to ever greater degrees. But we are not powerless pawns, and our behavior online is not a foregone conclusion, however sneaky the algorithms become. Nor are we merely innocent victims of AI-driven societal disintegration. Humans chose to create these tools, and we can choose how to use them, or not to use them. Indeed, the bigger dilemma for Christians, given what we know of the nature of sin and our vulnerability to temptation, might not be how to reform social media but whether it is reformable at all.
Going Forward
The film ends with the interviewees offering practical recommendations for navigating the challenges of social media. These include turning off notifications on your devices and limiting time spent on social platforms, especially for kids and teenagers. Some also offer public-policy recommendations, such as greater regulation, including federal privacy and antitrust legislation.
While Christians will debate the merits of these proposals and may disagree on the best path forward, we must not forget that the real social dilemma isn’t happening in Washington or Silicon Valley but in our own hearts and homes. It centers on the decisions we make each day and how, for example, we use technology to either follow the greatest commandment (Matt. 22:36–39) or ignore it, to either serve the self or sacrifice our autonomy in order to serve God and others.