The promise of the new digital frontier was to expand human potential and our connections with one another, to work together to create a better future. But as the dot-com boom refocused the Internet from creativity and connection to growth and monetization, the unanticipated side effects began to have a strikingly destructive impact on the way we behave and relate to one another, and by extension our social institutions.
We have spent significant time researching and understanding the harmful effects of the internet because they are a negative externality of the industry we operate in. Just as factories have a responsibility to eliminate pollution and minimize waste, we have a moral responsibility to correct, and then avoid, any harm we uncover in the course of doing business.
And as it turns out, this particular issue might be the root cause of many things that pain us today: from the election of our narcissist-in-chief and climate action paralysis to the recent dramatic rise of hate, violence, and suicide in the industrialized world after decades of trending in the right direction.
So, what exactly is happening?
The web started as a platform for human interaction and instantaneous, global communication. But now, we are no longer the messengers. We are the medium: just a collection of data points to sell to advertisers. The algorithms that control what content you see and when you see it are designed to maximize your monetary value. When you become a more predictable and controllable data point, you generate more revenue for the platform you are on (e.g., Facebook, Twitter, or YouTube).
While the extent of the damage being done by this system can be far-reaching, we’ve identified four “destructive digital forces” that are a good starting point for understanding the behavioral conditioning that happens when you engage on digital platforms.
[Technology such as social media] lets you go off with like-minded people, so you’re not mixing and sharing and understanding other points of view … It’s super important. It’s turned out to be more of a problem than I, or many others, would have expected.
— Bill Gates, Quartz, 2017
- Filter Bubbles: A term coined by Eli Pariser referring to the intellectual isolation that occurs when you are only served content that reaffirms your already-held biases. Because you are more likely to share and engage with this type of content, the algorithms are more likely to serve it to you. You end up trapped in a self-reinforcing sphere of like-minded people and content that trains you to close your mind to other approaches and dismiss compromise as “bargaining with an enemy.” The dehumanization follows naturally: it is hard to empathize with people you are never exposed to.
- Skew to Sensationalism: Content that evokes emotion, and does it quickly, gets preferential treatment from the algorithms because you consume more of it and are more likely to react to it digitally by commenting, liking, or sharing. These actions make you more valuable to sell to advertisers. The problem is that not all discourse is best served by drama, especially the grand challenges we must face as a society. For example, the complex dynamics of our criminal justice system cannot be distilled into snappy, emotionally charged headlines. The over-simplified drama surrounding these issues prevents them from being addressed with a solutions-driven mindset; instead they become fronts in a surface-level identity war (Black Lives Matter vs. Blue Lives Matter) with no common ground in sight.
- Binary Thinking: The algorithms attempt to fit you into neat compartments because machines are binary, both in their most fundamental programming and in the rules and procedures their creators write: yes or no, red or blue, on or off, 1s and 0s. For example, you may exhibit behaviors that indicate you are “likely to buy a house.” Once the algorithm places you in this classification, you are served content and advertising that pushes you even further toward this behavior, conditioning you and making the “likelihood” even more likely. Replicate this effect across the millions of data points that comprise your identity and you can see how we start losing our capacity to operate in the grey areas. In politics, you are conditioned to be on one “side” or the other, with no spectrum in between. Consider also what this does to radicalize young men toward white supremacy, Islamic extremism, or conspiracy paranoia when amoral machines are deciding which box to put them in as they sit on the “likely” borderline.
- Unclear Authority: While democratizing media has many benefits, giving everyone a voice to share ideas and a platform to launch enterprises, it’s become clear that even the most educated people struggle to distinguish signal from noise. It’s difficult to parse who is an expert and who is pretending to be one; what is the truth, what is a half-truth, and what is a lie; and what is deliberately manipulating you into buying a product or idea. And it’s too easy (if sometimes justifiably) to perceive those who attempt to vet and validate for us as the manipulators themselves: the real news is fake news, scientists have ulterior motives, politicians just want to control you, corporations just want to monetize you. The result is that we grant authority to whoever confirms our existing biases, because it “feels” right. Snap judgment is an evolutionary skill humans developed to survive in the wilderness, but it now works against us amid the complexity of our society and the firehose of information drowning us all.
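The engagement feedback loop these four forces share can be sketched as a toy simulation (the topic names, probabilities, and scoring rule here are invented for illustration; no real platform works exactly this way): a feed that serves whichever topic has earned the most past engagement, with a little exploration mixed in, quickly collapses onto a single topic.

```python
import random

# Toy sketch of an engagement-maximizing feed. All parameters are
# hypothetical; this only illustrates the self-reinforcing dynamic.
random.seed(42)

TOPICS = ["politics-left", "politics-right", "science", "sports", "arts"]

def user_engages(topic: str) -> bool:
    """Simulated user: highly likely to engage with one pet topic."""
    return random.random() < (0.9 if topic == "politics-left" else 0.2)

def run_feed(rounds: int = 200) -> dict:
    # Every topic starts with the same score, so each has a fair chance.
    scores = {t: 1.0 for t in TOPICS}
    served = {t: 0 for t in TOPICS}
    for _ in range(rounds):
        # Mostly serve the highest-scoring topic (exploitation),
        # with a 10% chance of trying something else (exploration).
        if random.random() < 0.1:
            topic = random.choice(TOPICS)
        else:
            topic = max(scores, key=scores.get)
        served[topic] += 1
        # Reinforce: engagement raises the topic's score, so it gets
        # served even more often -- the feedback loop described above.
        if user_engages(topic):
            scores[topic] += 1.0
    return served

counts = run_feed()
# The pet topic ends up dominating the feed: a filter bubble in miniature.
```

Even with exploration built in, the loop narrows the feed, because every click is treated as a vote for more of the same; the same mechanism that builds the bubble also rewards sensational content and sharpens binary classifications.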
While humans have always been victims of their own biases and snap judgments, two things have changed: (A) we now carry a device every waking minute of the day that is doing everything it can to pull us toward these bad behaviors, and (B) we now face exponentially more media noise than ever before, making it too complex and too exhausting to consciously process all of it.
Solving these problems won’t be easy, but we have to start somewhere. We need to get back to the original intent of social media: facilitating human connection rather than treating people as data points for shareholders. There is still plenty of profit to be made, and room for advertisers to exist responsibly, without losing our humanity in the process.