The Eroding Foundation of Truth in America
Truth, a necessary building block of any society, is eroding and splintering in the United States of America. Truth is a fundamental expectation within a society; if that society cannot determine and agree on what is true, economic and cultural progress will be difficult, if not impossible. Philosophers have debated the nature of truth for thousands of years, and those debates tend to wade into its deeper meaning and origins. For the purposes of this paper, truth will be defined from the Merriam-Webster dictionary as "the body of real things, events, and facts." In other words, truth in this context refers to objective truth. An objective truth is something that is true for everyone and is the opposite of a subjective truth, or opinion.
Misinformation, also known as falsehood, can be seen as the opposite of truth. Merriam-Webster defines misinformation as "incorrect or misleading information." Misinformation is often spread without any intent to mislead or bring harm to others. A good example is the satirical news site The Onion, whose headlines often appear at first glance to be factual but are not. A headline such as "GOP Warns Loophole In New Bill Could Still Allow Teachers To Sing About Critical Race Theory" is clearly meant as a joke. The Onion is not out to mislead or harm anyone, but a person who mistakenly believes that headline to be factual will be misled nonetheless.
Disinformation, which is a much larger issue in America today, is a subset of misinformation spread with the intent to deliberately mislead or cause harm to others. Don Fallis, a professor of philosophy and computer science at Northeastern University, has extensively studied epistemology and disinformation. Fallis writes that disinformation is "misleading information that has the function of misleading someone".
There are many ways that disinformation can mislead. A person could make a true statement but leave out a key piece of information, misleading by omission. An example would be a media outlet cherry-picking statistics without providing the surrounding information; the missing context prevents viewers from judging the statistics and can skew their impact. A person can also fabricate false information and spread it with the intent to mislead others, for example by inventing a statistic or event that is untrue and sharing it in the hope that people will believe it. Another way to mislead with disinformation is by neglecting to set the record straight. As Fallis explains, "If the source of a piece of information sees that people are misled, she/he will often have an obligation to correct their mistake. Therefore, if she/he has an opportunity to set the record straight and fails to do so, it is no longer an accident that people are misled".
The spread of and belief in disinformation can have harmful and lasting effects on people and a society. For instance, this spread can cause people to lose trust in each other, and a lack of trust within a society, community, or even a household can have dire consequences for its future. Fallis states that "disinformation can harm people indirectly by eroding trust and thereby inhibiting our ability to effectively share information with one another". Without the ability to effectively share information with each other, the citizens of a society will struggle to prosper. Sean Richey, a professor of Political Science at Georgia State University, notes that conspiracy theories, a subset of disinformation, can create doubts about our democracy. Conspiracy theories, Richey notes, "threaten the fundamental legitimacy of democratic discourse". He goes on to say that conspiracy theories "poison the public sphere and prevent a reasoned and open discourse whereby disagreements occur within the context of reality".
Another impact of disinformation is the erosion of trust in our institutions. America has seen this loss of trust up close during the COVID-19 pandemic, as numerous people, social media groups, and even media outlets have spread disinformation about COVID-19 and the vaccines that protect us from the virus and its variants. Mark Zuckerberg, the founder and CEO of Facebook, strongly believes in the efficacy of the COVID-19 vaccine and was determined to use the reach and influence of Facebook to vaccinate 50 million people. However, the rampant spread of and belief in falsehoods about COVID-19, the pandemic, and the vaccines made this task nearly impossible, as noted by Sam Schechner of the Wall Street Journal. Schechner states that antivaccine activists "used Facebook's own tools to sow doubt about the severity of the pandemic's threat and the safety of authorities' main weapon to combat it". In doing so, those activists created doubt in people's minds and reduced trust in government agencies such as the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC).
A final impact of spreading disinformation is that it can lead to physical violence. In Ethiopia, where a civil war is currently taking place, disinformation has directly led to the loss of life in at least one instance. As reported by NPR, a post created on Facebook on September 27, 2020, and widely spread, "falsely blamed members of an ethnic minority group for carrying out murders and kidnappings." The next day, according to an eyewitness, the village was ransacked, burnt to the ground, and its inhabitants were murdered. In the United States, on January 6, 2021, thousands of supporters of Donald Trump gathered in Washington, D.C. to protest the results of the 2020 election. The protest became restless and eventually morphed into a raid on the U.S. Capitol while Congress was in session to certify the election results. At least five people died during the siege and many more were hurt. The protest and subsequent siege were sparked by Trump himself, who falsely claimed that his loss in the election was due to rampant voter fraud and that the results were therefore invalid. It has been shown that people who blindly believe, respect, and submit to what strong leaders say and do, a disposition known as the authoritarian trait, are more prone to believe conspiracy theories and falsehoods. Trump weaponized this trait throughout the 2020 election, and a majority of the Americans who voted for him believed what he was saying was true. David Rand and Gordon Pennycook, professors at MIT and the University of Regina, respectively, have studied the effects of Trump's false claims. A survey they conducted showed that "77 percent of Trump voters believed in widespread voter fraud". They also noted that Trump's continuous repetition of the false claims only further cemented the beliefs of that 77 percent.
Users
America and its citizens clearly have a problem with misinformation and disinformation. Oxford researchers have found that fear of being misled by false information is the "single most common fear of internet and social media users around the world". More than that, however, humans have not learned how to handle the volume of falsehoods to which they are currently exposed. One of society's biggest issues with disinformation involves the human brain, which is not equipped to process the amount of information social media provides. When faced with such a load of information, the brain tends to rely on what is known as System 1 thinking rather than System 2 thinking. Daniel Kahneman, an Israeli psychologist and winner of a Nobel Prize for his work on judgment and decision-making, describes the two systems of the human brain as distinct and fundamentally different: System 1 thinking is fast, emotional, and instinctive, while System 2 thinking is slow, logical, and deliberate. People who rely less on System 2 thinking are often those who jump to conclusions quickly, and those same people tend to be duped by misinformation more often than others. Carmen Sanchez, an assistant professor at the University of Illinois at Urbana-Champaign who focuses on decision-making, states that these people, dubbed "jumpers," "do not engage in controlled system 2 reasoning to the same degree as nonjumpers". This lack of System 2 thinking can severely impair a person's ability to overcome belief in falsehoods. Without consistent and reliable System 2 thinking, users exposed to disinformation will often accept it as truth. As Sanchez states, "It's system 2 thinking that helps people correct mental contaminants and other biases introduced by the more knee-jerk system 1".
A lack of truth in a society can also have negative effects on its collective memory, the shared knowledge, information, and memories of a particular group. A society's collective memory can be fragile, and even a small amount of incorrect information can have a large impact. Laura Spinney, a science journalist with a focus on history and memory, writes that psychologists are now beginning to study collective memories, how they form, and what can make them weak. These psychologists have shown that "social networks powerfully shape memory, and that people need little prompting to conform to a majority recollection — even if it is wrong". When belief in disinformation alters collective memory, that alteration can seep into society and have lasting impacts. Daniel Schacter, a psychologist who studies memory at Harvard University, says that social media sites such as Facebook and fake news sites have "the potential to distort individual and collective memories in disturbing ways". One recent example comes from Israel, where two large groups of Palestinians, those living in Israel and those in the West Bank, have "gravitated to different versions of their past, despite a shared Arab–Palestinian identity". This shows how collective memories within a society can change when the information it receives changes. The alarming part is that blatant propaganda is not required to affect a society's collective memory. Intentionally omitting key components of a story is still disinformation and can erase those pieces of the event from future knowledge. This is one example of retrieval-induced forgetting, in which "related information that goes unmentioned is more likely to fade than unrelated material".
It has also been shown that networks of people within a society, also called social networks, exert a large influence on how people think about, learn, and remember events. Humans tend to believe their peers over outsiders, as evidenced by a 2015 study by psychologists Alin Coman of Princeton University and William Hirst of the New School for Social Research. Coman and Hirst reported that an individual "experiences more induced forgetting when listening to someone in their own social group — a student at the same university, for example — than if they see that person as an outsider". Their report also mentions that "62% of US adults get their news from social media". With Facebook and other social media sites allowing users to filter out or remove disagreeable sources of information, it is no surprise that Benjamin Warner and Ryan Neville-Shepard, assistant professors at the University of Missouri and Indiana University-Purdue University, respectively, found that for "those selecting only ideologically consistent media, there is likely to be a significant polarizing effect". The human brain and the users of social media, however, are not the only problems with respect to disinformation. After all, someone is responsible for creating and publishing it.
Creators
Creating and spreading disinformation has never been easier than it is today. Anyone with an internet connection can create or spread disinformation, and it can even be done for free using the resources available at many public libraries. The vast majority of social media sites are free and often require little, if any, identity verification. Creating a website has become so simple that it can be done in minutes on a mobile phone, and companies such as Squarespace, GoDaddy, and Wix market their products to people who want or need a simple tool to quickly create a web presence. With that, it is no wonder Richey is concerned that the "ease of creating websites could permit manipulative elites to intentionally spread conspiratorial ideas to achieve political goals". This ease of access has allowed a small number of creators to manipulate and abuse the systems the social media platforms have built. For instance, when Facebook took down a large number of user groups for spreading misinformation about the COVID-19 pandemic, it found that only 5% of the nearly 150,000 users posting within those groups were producing 50% of the posts, and less than 1% of the groups' members were responsible for inviting 50% of their new members. This outsized influence by a small minority of users means the loud voices of the few are outweighing the many. Facebook's own researchers took a random sample of English-language comments over a span of two weeks and found that about 67% were "anti-vax," a figure around 40 points higher than in a recent poll.
Individual creators and small groups of high-output users are not the only creators causing issues. Traditional media outlets and the news industry as a whole have undergone a dramatic shift in revenue streams and, with it, a drastic change in priorities. Most of today's news outlets are in a race to be the first to publish a story, in the hope that theirs will be the link shared across social media platforms. When users click that link to read the story, the outlet earns revenue through the advertisements displayed on its site; in short, more clicks mean more money. As Karl Bode, a journalist who has covered technology, information, and privacy for more than 15 years, states, that relatively new and simple incentive structure can be exploited by "everyone from ordinary trolls and partisan hacks to poorly-regulated industries and giant corporations". A clear example occurred in September 2021, when a fake press release stating that Walmart had partnered with a cryptocurrency was published through a well-known press release distribution site. The press release was picked up by many news outlets and aggregated through their own publishing networks. The publicity raised the price of the cryptocurrency, making a number of people a great deal of money. The event clearly involved content creators seeking to spread disinformation for monetary gain, but another aspect is often overlooked: the news outlets that aggregated and reported on the fake press release never verified its claims. Even a minimal amount of fact-checking would have revealed that the claim was false. Instead, they all raced to be the first to publish in order to gain the most clicks. Bode summarizes the event well: "These screw ups aren't necessarily malicious, they're just representative of a broken U.S. ad-based press for which speed, inaccuracy, and inflammatory headlines make more money than measured, sometimes boring, often complicated truth". While individual creators and news outlets carry some of the blame for the spread of disinformation, the true amplification and uninhibited proliferation of disinformation is the responsibility of another party.
Platforms
Social media platforms such as Facebook, Twitter, Instagram, and TikTok are the true amplifiers of information in today's society. Without these platforms, the disinformation created by individuals or corporations would not spread very far or very fast. Roddy Lindsay, a former Facebook data scientist who designed algorithms, states that two key innovations have aided this spread: personalization and algorithmic content selection. Personalization, "spurred by mass collection of user data through web cookies and Big Data systems", has allowed the platforms to build very detailed and accurate profiles of their users. That data can then be used to sell advertisements to companies and other interested parties, who can target very specific groups of people of their choosing. The platforms' algorithms keep users engaged through the "use of powerful artificial intelligence to select the content shown to users". The more a user is engaged and interested in the content they see, the longer that user remains on the platform; and the longer a user remains on the platform, the more targeted ads they see, providing more ad revenue to the platform. This business model is not inherently bad, but it does have negative byproducts.
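To make this dynamic concrete, the sketch below shows how an engagement-optimized, personalized feed might rank content. It is a simplified, hypothetical illustration of the general mechanism described above, not any platform's actual algorithm; the scoring formula, field names, and example numbers are invented for demonstration.

```python
# Hypothetical sketch of an engagement-optimized, personalized feed ranker.
# An illustration of the general idea described above, not any platform's real system.

from dataclasses import dataclass, field

@dataclass
class Post:
    text: str
    topic: str
    predicted_engagement: float  # model-estimated likes/comments/shares (assumed)

@dataclass
class UserProfile:
    # Interest weights built from collected behavioral data (the personalization step).
    interests: dict = field(default_factory=dict)

def score(post: Post, user: UserProfile) -> float:
    """Rank purely on predicted engagement, boosted by personal interest.

    Nothing in this objective rewards accuracy: a false but provocative post
    with high predicted engagement outranks a true but unremarkable one.
    """
    interest_boost = 1.0 + user.interests.get(post.topic, 0.0)
    return post.predicted_engagement * interest_boost

def rank_feed(posts: list[Post], user: UserProfile) -> list[Post]:
    return sorted(posts, key=lambda p: score(p, user), reverse=True)

if __name__ == "__main__":
    user = UserProfile(interests={"health": 0.8, "sports": 0.2})
    feed = rank_feed(
        [
            Post("Routine public health guidance update", "health", predicted_engagement=0.05),
            Post("Shocking 'miracle cure' claim", "health", predicted_engagement=0.60),
            Post("Local team wins game", "sports", predicted_engagement=0.30),
        ],
        user,
    )
    for post in feed:
        print(post.text)  # the sensational falsehood surfaces first
```

Because the only quantity being maximized is predicted engagement, a sensational falsehood that reliably provokes reactions can outrank a sober correction, which is the byproduct the next paragraph describes.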
Disinformation creators can leverage this personalization and these algorithms to spread their falsehoods to very large audiences in a short amount of time. In fact, as Sinan Aral, a professor at MIT and the head of MIT's Social Analytics Lab, describes in his book The Hype Machine, it was uncommon for truthful news and information to reach 1,000 people, while falsehoods routinely reached as many as 100,000 people. On Twitter, there is evidence that truthful posts took nearly six times as long as falsehoods to reach 1,500 people.
As the evidence of these issues continues to grow, social media platforms are increasingly being asked to address and correct the problem. The key issue with relying on the platforms to find a solution is that few are willing to actually address it, since doing so could hurt their business model and profits. During the COVID-19 pandemic, UNICEF staffers continuously saw negative and anti-vaccine comments on their pro-vaccine posts. When these staffers reached out to Facebook, the platform's advice was to continue creating content on the platform. Rather than addressing the disinformation on its platform, Facebook's response kept creators and users engaged and active in order to keep the business model intact. Facebook does claim to be working on the issue, but there is little evidence of significant progress, particularly when it comes to comments. Most of the systems Facebook built to detect and demote or remove falsehoods were not designed to work on comments, leaving a large amount of disinformation available for spreading and engagement on the platform. Facebook's automated demotion service, which is designed to reduce the spread of disinformation by hiding or burying posts or comments that contain falsehoods, also suffers from technical issues. In one instance, the demotion system overlooked an anti-vaccination post that had been shared over 50,000 times and viewed over 3 million times because the automated system mistakenly classified the post as Romanian rather than English.
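The sketch below illustrates, in simplified form, how a demotion step of this kind can silently fail when language detection goes wrong. It is a hypothetical reconstruction of the failure mode described above, not Facebook's actual code; the function, the set of supported languages, and the 50% demotion factor (a figure discussed in the conclusion) are assumptions made for illustration.

```python
# Hypothetical sketch of an automated demotion step; not Facebook's real pipeline.
# It shows how a language-detection error can let a flagged-false post keep
# its full ranking score, mirroring the failure described above.

SUPPORTED_LANGUAGES = {"en"}   # assume false-content classifiers exist only for English
DEMOTION_FACTOR = 0.5          # assumed: flagged content loses half its ranking score

def demote_if_flagged(ranking_score: float,
                      detected_language: str,
                      flagged_false: bool) -> float:
    """Return the post's adjusted ranking score."""
    if detected_language not in SUPPORTED_LANGUAGES:
        # The post never reaches the false-content check, so no demotion is applied.
        return ranking_score
    if flagged_false:
        return ranking_score * DEMOTION_FACTOR
    return ranking_score

# An English anti-vaccine post misdetected as Romanian keeps its original score:
print(demote_if_flagged(100.0, detected_language="ro", flagged_false=True))  # 100.0
print(demote_if_flagged(100.0, detected_language="en", flagged_false=True))  # 50.0
```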
Conclusion
There are a number of ways our society can begin to move towards a healthier relationship with truth in this new, digital world. Individual American citizens can make changes to ensure disinformation does not further corrupt our society. One impactful change many people can make today is in how they interact with others. People often treat each other more respectfully face-to-face, but that respect and courtesy tend to diminish in online interactions. Bringing good faith, respect, and tolerance into online interactions could help move Americans back towards a less polarized society. In addition to better interactions, people can learn to use their System 2 thinking more often; something as simple as taking slightly more time to analyze a piece of information could have significant positive consequences in the long run and prevent major mistakes. Education, with a focus on critical thinking, can also help our society move away from conspiracy theories. This education need not always be formal, as parents and families have a large impact on how future generations view and interact with the world. Continuously training to identify falsehoods and to recognize one's own biases can help keep people prepared to combat disinformation when it arises. This alone may not be enough, however, as America will first have to overcome its current cultural and political differences, which could take generations.
Another path of change worth pursuing relies on the platforms making significant changes to their frameworks and potentially their business models. One key idea that has helped keep Wikipedia a source of truth and trust is what the Wikimedia Foundation's incoming CEO, Maryana Iskander, calls radical transparency. A person viewing a Wikipedia page today will see a number of things that may not be immediately obvious. The first is the large number of citations on nearly every page, which provide the context and expertise needed to build readers' trust that the information is accurate. The second is what Wikipedia calls the Talk tab. While the Talk tab is not a forum, it serves the meaningful function of opening a topic up to good-faith, respectful debate. Social media companies could borrow these ideas of radical transparency and implement them in their own platforms to give their users a healthier and more truthful experience.

Facebook has already made one change to combat the ability of a small number of highly active users to spread vaccine disinformation and abuse the platform: according to an internal memo, it reduced the number of comments a single user can make each hour on posts from authoritative health sources from 300 to 13. The social media companies can also take the time and effort to inspect, and where necessary adjust, the algorithms they have created that continue to promote and spread disinformation. After all, these algorithms were built over a couple of decades, while the human brain has been developing for millennia; tweaking algorithms will clearly be easier and quicker than fundamentally changing how the human brain works. In fact, changes to algorithms have already shown positive impacts. In an effort to reduce disinformation about the COVID-19 pandemic, Facebook implemented what it called "break the glass" measures: once a person manually reviews a piece of content and identifies it as false, the algorithm demotes that content in the ranking by 50%. Facebook has also provided its users with canned factual responses on COVID-19-related posts, which were viewed over 10 million times in August 2021.

While these measures have shown success, social media companies can continue to build on them and substantially increase the number of similar innovations. Facebook and others must go further to prevent the spread of disinformation on their platforms. For instance, while it was previously thought that debunking would simply fall on deaf ears, studies have shown that when users who believe in conspiracy theories are continuously exposed to debunking and truthful content, they often either reject the theory outright or hold it less strongly. Facebook and the other social media platforms can do a better job of elevating and promoting truthful content, especially when it contradicts existing disinformation, while also reducing the prevalence of falsehoods. However, because these could amount to fundamental changes to their businesses, it is hard to envision the platforms implementing any significant modifications on their own.
If the actions taken by the American populace or the social media companies are not sufficient, local, state, and federal governments have options that could be effective in slowing the spread of disinformation. Coman proposes one relatively simple change that requires no new laws or regulations: governments can create a playbook by compiling a comprehensive list of key points, ensuring all officials have the list, and repeating those points often. At some point, however, there is a good chance that laws and regulations will be required, as it appears the social media companies either do not care to fix their disinformation problem or cannot control it. Facebook has been building and modifying its algorithms for over 15 years; at this point it should be able to address their shortcomings as well. One opportunity is to create a reporting and storage apparatus for the data used to create, test, and operate these algorithms, similar to the very effective system currently in place in financial markets. Additionally, state and federal governments could pass laws holding the platforms and social media companies more accountable, for example a law that holds the platforms responsible for the content their algorithms promote. The largest and most comprehensive option for the American government would be to create an independent body to study, assess, and propose solutions to the issue of disinformation, a model that has been effective in the fight against climate change and could be effective in this arena as well.
None of the proposed changes or solutions should be implemented in isolation. Ideally, a complete package of these changes would be implemented as soon as possible to mount the quickest and strongest response to the growing problem of disinformation in America. If America hopes to continue to be a beacon of democracy and prosperity, it must act swiftly and decisively to combat the spread of disinformation and rebuild the foundation of trust.