When I started this article, I had a few questions and some provocative issues I wanted to present. However, as I began writing (and writing usually has a life of its own), several other questions and issues appeared that I felt I had to address. As a result, the article became too long, and it will now be published in three parts (“Anatomy of disinformation and fake news”, “How we made it to this point and the role of online platforms”, and “Why and how we should regulate social media and criminalize fake news”). Each part can be read independently, but to understand the whole and connect all the ideas, you should read all three.
1. How we made it to this point and the role of online platforms
“Nothing vast enters the life of mortals without a curse,” wrote Sophocles. Displayed at the beginning of the Netflix documentary “The Social Dilemma”, this quote captures the scale of the problems we find in social media and the way these platforms operate. In this article, I consider the political implications of fake news, its role in politics, and its effects on our social fabric. There are other issues related to social media that are just as important or more so, but they are beyond the scope of this topic.
But let’s start from the beginning: how we perceive the world politically and how we come to identify with certain ideas. I will cite the work of George Lakoff, a cognitive linguist, who explains it well in his article “Understanding Trump” and in his book “The Political Mind” (Penguin Books). He says: “we tend to understand the nation metaphorically in family terms: We have founding fathers. We send our sons and daughters to war. We have homeland security. The conservative and progressive worldviews dividing our country can most readily be understood in terms of moral worldviews that are encapsulated in two very different common forms of family life: The Nurturant Parent family (progressive) and the Strict Father family (conservative). What do social issues and the politics have to do with the family? We are first governed in our families, and so we grow up understanding governing institutions in terms of the governing systems of families.“ So, some people need a strict father and some a nurturant parent, according to their prevalent moral worldviews and values: whether they are conservative (the Strict Father) or progressive (the Nurturant Parent). This explains why so many people so easily believe whatever comes from certain people or groups they identify with: it matches their worldview and what they perceive to be the moral, correct way of living in society, prospering, and so on. It is easy to see how people naturally congregate with like-minded people, and not every like-minded group carries the same values and perspectives. Lakoff continues: “Family-based moral worldviews run deep. Since people want to see themselves as doing right not wrong, moral worldviews tend to be part of self-definition — who you most deeply are. And thus, your moral worldview defines for you what the world should be like. When it isn’t that way, one can become frustrated and angry.”
During the 2010s, the world witnessed the growth and rise to power (in certain regions) of hard-conservative and even far-right ideas. The USA had its share with the election of Donald Trump in 2016, and Brazil followed right behind, electing Jair Bolsonaro, a corrupt, populist ex-military officer with radical conservative ideas, in 2018, to mention just two. Both elections were tainted by foreign interference, both campaigns relied mainly on social media and fake news, and their vitriol created a disrupted, polarized society. Their devoted followers built their assumptions on, and collected their information from, fake news and blatant lies.
The way social media, messaging platforms, and email have been designed, and the way their monetization models work, enables and scales this type of problem. The Center for Humane Technology states: “Social media platforms are incentivized to amplify the most engaging content, tilting public attention towards polarizing and often misleading content. By selling micro-targeting to the highest bidder, they enable manipulative practices that undermine democracies around the world.” Democracies and societies are not for sale to the highest bidder. By engaging users through their emotional responses — any emotion, including fear, hate, and anger — the algorithm shows more of the same in a permanent, damaging feedback loop, and users engage more each time. Fake news is also designed to elicit emotional responses, so social media and fake news were a perfect marriage: fake news created emotional responses, and those responses kept users engaged on social media. Radical, extremist groups had their voices of hate amplified, and a few well-positioned bots amplified those messages dangerously. Bots and trolls effectively controlled the political narrative with fake news across platforms, as confirmed by many studies and investigative articles. As reported by BuzzFeed: “In the final three months of the US presidential campaign, the top-performing fake election news stories on Facebook generated more engagement than the top stories from major news outlets such as the New York Times, Washington Post, Huffington Post, NBC News, and others, a BuzzFeed News analysis has found. During these critical months of the campaign, 20 top-performing false election stories from hoax sites and hyperpartisan blogs generated 8,711,000 shares, reactions, and comments on Facebook.” The role Facebook played in the dissemination of fake news in past political campaigns is undeniable, as is the way it altered election results and, consequently, changed society for the worse.
Groups were more polarized than ever before, and violent extremist groups found a space to congregate, organize, and promote their ideas. The damage was done.
The messaging platforms (WhatsApp, Telegram, Parler, Messenger), with their encrypted and even disappearing messages, are especially damaging because their users have no exposure to anything different from their own views of the world, not even a picture of someone’s baby on someone else’s newsfeed, as on Facebook. These platforms have grown exponentially in recent years and are home to many extremist and hate groups. They played a key role in the dissemination of fake news in Brazil and in the coup attempt in Washington, D.C. this past January 6th. I call them tunnel social media: all users can see is the ‘tunnel’ they are in, the person or group they are interacting with and the content shared in that ‘tunnel’. There is no exposure to anything else or to different views: one sees only what is shared in that tunnel, as if the rest of the world didn’t exist. If a group with a counter-narrative exists, users have no way to find or join it unless invited by another user. Radicalization of any point of view is a reality in these groups, and emotion-eliciting fake news is distributed by bot farms into these chatrooms without any constraint or control. Some platforms offer the option to delete chats and messages without a trace, making it even easier to disinform and to commit crimes. The emotional reaction is immediate, and users share content, including fake news, without thinking twice. Tristan Harris, in his seminal presentation at Google, “A Call to Minimize Distraction & Respect Users’ Attention,” states: “When we lose that moment to consider before acting on our impulses, we lose what set us apart as thinking humans”. On these messaging platforms more than on other social media, users behave like robots, sharing impulsively and mindlessly, without any judgment of what is presented to them, moved by emotions alone.
Everyone in the tunnel looks and acts like you, similar emotionally loaded posts and fake news appear constantly, and the algorithm selects the ads you see. You lose control of your choices and the freedom to be.
This calls for neuroethics. According to the Stanford Encyclopedia of Philosophy, it is “the examination of what is right and wrong, good and bad about the treatment of, perfection of, or unwelcome invasion of and worrisome manipulation of the human brain. (Marcus 2002: 5)” Neuroethics also encompasses the right to cognitive liberty, also called freedom of thought, which includes everyone’s right to privacy and to autonomy — you have a right to your own ideas and choices. Violations of cognitive liberty are also hard to spot; according to Richard Glen Boire’s “On Cognitive Liberty” series: “In contrast to the usual visibility of government restraints on physical liberty, restraints on cognitive liberty are most often difficult to recognize, if not invisible.”
Social media’s reach into our brains is definitely invisible, but the results are clear and concrete for everyone to see. Despite all the good social media has brought to our lives, such as connecting people and families, shortening distances, celebrating life events, and laughing at silly cats and dogs, it has also done very questionable and harmful things by the very way these platforms were designed.
Social media companies’ recent efforts to curb fake news, including relying on users’ feedback, are insufficient: if those users are prey to fake news, how can they spot and report it? Or are these efforts merely a way to create evidence that the platforms are “doing something”, such as the Facebook Oversight Board? They need a better incentive to do more thorough work, perhaps financial (via taxes) or via shared responsibility. Lastly, social media terms of service, privacy policies, and community guidelines cannot match the speed and broad range of wrongdoing by users on their platforms; they do not control users’ behavior (just as criminal laws don’t stop criminal activity), do not cover bot farms, do not come close to the depth and reach of the platforms’ artificial intelligence, and do not fix the fact that these operations have invaded and disrupted human life and society. We need to do better.