Eli Pariser, the head of the viral-content website Upworthy, warned in his book The Filter Bubble of the emergence and danger of social media and internet search engine algorithms that selectively feed users information suited to their preferences. Over time, these “filter bubbles” create echo chambers, blocking out alternative viewpoints and facts that don’t conform to users’ cultural and ideological preferences.
Pariser recognized that filter bubbles would create “the impression that our narrow self-interest is all that exists.” The internet brought people together, but social media preferences have since driven people apart through the creation of preference bubbles—the next extension of Pariser’s filter bubbles. Preference bubbles result not only from social media algorithms feeding people more of what they want, but also from people choosing more of what they like in the virtual world, leading to physical changes in the real world. In short, our social media tails in the virtual world wag our dog in the real world. Preference bubbles arise subtly from three converging biases that collectively and powerfully herd like-minded people together and harden their views as hundreds of thousands of retweets, likes, and clicks aggregate an audience’s preferences.
When users see information that confirms their beliefs, they like it, share it, and discuss it. Social media subtly creates large-scale confirmation bias—the tendency to search for or interpret information in a way that confirms previously existing beliefs or preferences. When users see information that challenges their beliefs and desires, they may block the disseminator of the content. Or, if they want to challenge the disseminator, they’ll seek out and promote content directly disputing it. The information employed to settle such disputes may or may not be true; what matters is the validation received by the person pushing it. Confirmation bias worsens when users become emotional, triggering their instinctive fight-or-flight tendencies. Competitions, disasters, and defense of core values drive users to reach for the preferred information that helps them feel secure.
Social media amplifies confirmation bias through the sheer volume of content provided, assessed, and shared. But social media also connects users to their friends, family, and neighbors—people who, more often than not, think like they do, speak like they do, and look like they do. Social media users see news, information, and experiences contributed by their friends and followers. They naturally tend to believe this information as a result of implicit bias—the tendency to trust information from people we consider members of our own group more than information from those outside it. Users trust the sender and transitively trust the information being sent, regardless of whether it’s accurate.
Confirmation bias and implicit bias, working together, pull social media users into digital tribes. Individuals sacrifice personal responsibility and initiative to the strongest voices in their preferred crowd. The digital tribe makes collective decisions based on groupthink, blocking out alternative viewpoints, new information, and ideas. Digital tribes stratify over time into political, social, religious, ethnic, and economic enclaves. Status quo bias—a preference for the current state of affairs over change—takes hold in these digital tribes, such that members must mute dissent or face expulsion from the group. Confirmation, implicit, and status quo bias, operating on a grand social media scale, harden preference bubbles. These three world-changing phenomena build upon one another to power the disruptive current that brought about the Islamic State and is now shaking Western democracies.