
Mis/DisInformation

Executive Overview

You would think the past few decades of the Knowledge Age would have left us enlightened, so why do people seem badly misinformed, confused, emotional and unreasonable? Many do not believe in evolution, climate change, vaccination and other established science.

Roughly one-third of Americans have accepted conspiracy theories and the “big lie” that the 2020 US presidential election was stolen. Statista reports that 70 percent of Internet users think fake news causes doubt and confusion, with social media the least trusted news source worldwide. And 83 percent of people believe disinformation negatively affects their country’s politics. [6] Norman Lear, the famous TV producer, said: “We just may be the most-informed, yet least self-aware people in history,” [1] and Senator Ben Sasse worried, “We are living in an America of perpetual adolescence.” [2]

Extensive studies confirm that attitudes, beliefs and even rational decisions are largely shaped by a variety of well-known biases, political party allegiance, and other extraneous factors.[3]  Even hard-nosed business people admit that bias in decision-making is a major problem.[4] These irrational tendencies explain why demagogues successfully use the lure of self-serving fantasies that blind people to the truth and mobilize them into violence.[5]  

This dilemma poses one of the great ironies of our time. The digital revolution has created a wealth of knowledge that is almost infinite. The smartphone alone has made the world’s store of information available at the touch of a finger. There is no shortage of knowledge, but its power is badly limited. Although the world is covered with an abundance of communication, it is not a very happy place. Just as the Gutenberg printing press unleashed a flood of information that led to disruptive change, brutal conflict and the Protestant Reformation, this deluge of digital knowledge has brought a “post-factual” wave of nonsense, fake news and conspiracy theories that pose global threats.

The graph below summarizes results for the 10 questions (Qs) posed to respondents. One of the most striking conclusions is that people think mis/disinformation has devastating impacts. Responses to questions 1, 8, 9 and 10 rate the issue above 7 on a 10-point scale, well above any notion that mis/disinformation is of middling concern. Further, responses to Qs 8, 9 and 10 show that the magnitude of concern is not expected to diminish with time.

Another major conclusion is distrust in institutions to remedy this situation. Qs 2 and 3 show a marked distrust of big tech companies and the federal government. In contrast, Qs 4, 5, 6 and 7 suggest people are more confident that education, culture, leadership and AI will prove useful, with better education standing out as the most powerful force for change.

Finally, we can see bipolar distributions in the data, showing the prevalent divide in attitudes toward government, culture, leadership, AI and education.

In short, the mis/disinformation problem is deadly serious, it is here to stay, and companies and governments are not likely to help. But there seems to be hope in reforming education, social cultures, leadership and AI applications.

This analysis is supported by the many comments included below, but opinion varies widely, as noted in the bipolar data. To fully appreciate the richness of this complex issue, please look over the comments and savor the wide diversity of thoughtful viewpoints. You will be grateful for the thought poured into this crucial study.

A final conclusion runs through many comments. There is a serious and compelling suggestion that mis/disinformation should be penalized through fines, loss of service or other penalties that discourage misuse of the media. An authoritative body, aided by good AI, would have to judge when mis/disinformation has occurred, of course. But that is needed in any case, and “internalizing” costs has been shown to be a powerful regulator of behavior at minimal social cost.

 

The Mis/DisInformation Ecosystem

“Mis/DisInformation” includes misinformation (honest errors) and disinformation (intended to deceive). There is a constant stream of mistakes, distortions, false news, and endless other information corruption in public media. Facebook, Twitter and other social media giants have faced mounting criticism for allowing inflammatory and even violent traffic to spread dangerous falsehoods. Even where care is taken to avoid mistakes, ethical behavior is hard to enforce.  Revelations of widespread surveillance by the US National Security Agency (NSA) and other intelligence organizations have brought demands for public exposure of data-gathering practices.  Transparency in government is often promised but seldom delivered. In parts of the world, rampant corruption is taken for granted.

A New York University study found that mis/disinformation on Facebook gets six times as many clicks as factual information. It also found that 68% of Republican posts contain mis/disinformation, compared with 38% of Democratic posts. This confirms our study’s finding that the problem is deadly serious. The article appeared in the Washington Post.

The spread of Mis/DisInformation involves individuals making choices; together they comprise an ecosystem. This suggests three major “causes” are involved: the information environment and the individual interact within a social environment to determine which information is accepted as valid and passed on.

 
The Information Environment

This includes all the various sources of information that surround us, good, bad or otherwise.

The Individual Making Choices

People bring a variety of values, beliefs and other predispositions that encourage particular choices of information.

The Social Environment

Significant others, leaders, culture, etc. all influence an individual’s choice of information.

This is just a rough and simple framework, but I do think it moves us toward a better way to think about disinformation. Note that this Mis/DisInformation Ecosystem can become a vicious cycle. A tasty bit of disinformation appears in the environment. Individuals pass it on to friends and family. They in turn spread the disinformation further to create “buzz.” The buzz takes on a life of its own and stimulates more related disinformation. And so on. The reverse can occur when accurate information spreads to dispel Mis/DisInformation in a virtuous cycle.
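To make the cycle concrete, here is a minimal sketch of that buzz dynamic as a toy branching process in Python. Everything here is an illustrative assumption rather than part of the study: each person who receives an item reshares it to a fixed number of contacts with some probability, so when the average number of reshares per recipient exceeds one, the buzz snowballs, and when it falls below one (as when corrections dominate), the item dies out.

```python
import random

def simulate_buzz(p_share: float, contacts: int, generations: int = 10) -> int:
    """Toy branching model of the Mis/DisInformation Ecosystem.

    One item appears in the information environment; each individual who
    receives it passes it to `contacts` others with probability `p_share`.
    Returns the total number of people reached.
    """
    carriers, total_reached = 1, 1
    for _ in range(generations):
        # Each current carrier reshares with probability p_share.
        new_carriers = sum(
            contacts for _ in range(carriers) if random.random() < p_share
        )
        total_reached += new_carriers
        carriers = new_carriers
        if carriers == 0:  # the buzz has died out
            break
    return total_reached

random.seed(1)  # reproducible illustration
print(simulate_buzz(p_share=0.4, contacts=4))  # ~1.6 reshares per person: snowballs
print(simulate_buzz(p_share=0.1, contacts=4))  # ~0.4 reshares per person: fizzles
```

The same arithmetic runs in reverse for the virtuous cycle: anything that nudges the reshare rate of corrections above the break-even point lets accurate information spread faster than the falsehood it dispels.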

Robert Finkelstein has produced two chapters on the disinformation ecosystem for the SAGE Handbook of Evolutionary Psychology (published in 2021). One chapter is about Evolutionary Psychology and Cyberwarfare (and the other is about Evolutionary Psychology and Robotics).  Contact Bob at bobf@robotictechnologyinc.com if they might be of interest to you.  

 

Forces Driving Mis/DisInformation

Here are some major sources of Mis/DisInformation:

Willful Ignorance 

It does not help that large parts of the public embrace confusion out of sheer perversity. TV and the Internet have produced what has been called “the dumbest generation,” with a brazen disregard for books and reading and a preference for religious and political beliefs over established knowledge. [6] Following are choice bits of willful ignorance in the US and other modern nations.

  • The US ranks near the bottom of nations whose citizens believe in evolution, with less than 40 percent saying they accept the science.  [7]
  • Two-thirds cannot name the three branches of government. [8]
  • Half of Trump voters believe President Obama was born in Kenya. [9]
  • Thirty percent of people think cloud computing involves actual clouds.[10]
  • Twenty-five percent don’t know the Earth revolves around the Sun.[11]
  • Fewer than half know that humans evolved from primitive species.[12]
  • Two-thirds of undergraduate students score above average on narcissism personality tests, up 30 percent from 1982.[13]

Corporate Misbehavior  

Despite a drive toward increased openness, corporate secrecy and cover-ups remain common. In the US, whistle-blowing events have increased, and SEC actions against public companies hit an all-time high. (Bloomberg, Feb 14, 2017)

Public Complicity

In many countries, corruption remains a major problem. In India, the most corrupt country in Asia, almost 70 percent of the population accessing public services report having paid a bribe to do so. (Forbes, Mar 8, 2017)

Fake News Proliferating

Intentionally false articles and slanted reporting have proliferated in recent years. US intelligence agencies found a widespread Russian program of fake news and disinformation, although they are unsure of the impact. Fake news imposes real social costs and serves to destabilize society. (Journal of Economic Perspectives, Spring 2017)

Deep Fakes Are Hard to Detect 

Increasingly sophisticated machine learning can create convincing but fake audio and video, often in real time. Tools to guarantee the authenticity of a given video or audio clip are still being developed and lag behind. (MIT Technology Review, May 1, 2017) Ted Gordon thinks these technologies will increasingly make voice, facial features, gait, aroma, provenance and the like indistinguishable from reality. How will we be able to tell “the real thing”?

Post-Factual Mess

The advent of today’s “post-factual world” carries the problem to an extreme by forcing us to sort through fake news and conspiracy theories. An entire cottage industry has sprung up to produce books titled “The Assault on Intelligence,” “The Death of Truth,” “A World Without Facts,” “The Death of Expertise” and “Truth Decay.” [14]

Information War

Commercial firms conducted for-hire disinformation campaigns in at least 48 countries last year — nearly double the year before, according to an Oxford University study. The researchers identified 65 companies offering such services.

 

Governments and Corporations Responding 

Throughout Western democracies, governments are requiring disclosure of corporate information to ensure ethical dealings. Many shareholders want even more.


DARPA Working to Spot Fake Media

With machine learning algorithms becoming adept at generating believable fake audiovisual content, it is important to be able to detect the fakes. To that end, DARPA has launched a project aimed at catching so-called “deep fakes.” (MIT Technology Review, May 23, 2018)

Shareholder Demands 

Investors in growing numbers are demanding greater transparency from corporate boards and executives in matters of compensation, company operations, and political contributions. 

Social Media Companies Responding to Fake News

In response to concerns about fake news’ impact on society and even election results, organizations are taking further steps to prevent fake news from spreading. Facebook, for example, has created algorithms that automatically flag suspicious stories, which are then sent to fact-checkers. If a story is shown to be false news, the company attempts to limit its spread across the social network. (Advertising Age, Aug 3, 2017)
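As a rough sketch of how such a flag-then-verify-then-demote pipeline fits together, consider the following Python outline. The names and the scoring rule here are hypothetical stand-ins invented for illustration, not Facebook’s actual algorithms or API.

```python
import random
from dataclasses import dataclass

@dataclass
class Story:
    text: str
    rank_weight: float = 1.0  # multiplier applied when ranking the feed

def score_suspicion(story: Story) -> float:
    """Hypothetical stand-in for a trained classifier scoring fakeness."""
    trigger_words = ("miracle", "shocking", "hoax", "cover-up")
    hits = sum(word in story.text.lower() for word in trigger_words)
    return min(1.0, 0.5 * hits)

def fact_checkers_rate_false(story: Story) -> bool:
    """Hypothetical stand-in for the human fact-checking step."""
    return random.random() < 0.5  # placeholder verdict

def moderate(story: Story, threshold: float = 0.5) -> None:
    if score_suspicion(story) >= threshold:   # 1. algorithm flags the story
        if fact_checkers_rate_false(story):   # 2. fact-checkers review it
            story.rank_weight *= 0.1          # 3. demote it to limit spread

story = Story("Shocking miracle cure the doctors are covering up!")
moderate(story)
print(story.rank_weight)  # 0.1 if flagged and rated false, else 1.0
```

The essential design point is that the algorithm only triages: the judgment that a story is false remains with human fact-checkers, and the remedy is demotion rather than outright removal.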

Crises Forcing Corporations to Act

Recent corporate scandals have highlighted the need for greater transparency in business, recruiting even top executives to the cause. Scandals ranging from dishonest mortgage practices (Bank of America et al.) to bribery in Mexico (Walmart) and fixing of LIBOR rates (at least 10 multinational banking firms) have brought growing demands for business transparency. So have legitimate but largely hidden activities such as political contributions. (Inc, Jul 8, 2016)

Executives Endorse Ethics  

A survey of business executives in 30 countries found that 79% believe their companies have an ethical duty to fight corruption. Executives said the most effective anti-corruption tool was the transparency enforced by investigative journalism. (Institute for Global Ethics, Jul 8, 2016)

Public Demands

Roughly half of U.S. adults (48%) say the government should take steps to restrict such misinformation, even if it means losing some freedom to access and publish content. That is up from 39% three years ago. A majority (59%) say tech companies should restrict disinformation. (Pew Research Center, Aug 21, 2021)

 


[1] “Norman Lear calls for leap of faith,” The New Leaders (May/June 1993)

[2] Ben Sasse, The Vanishing American Adult (St. Martin’s, 2017)

[3] Elizabeth Kolbert, “Why Facts Don’t Change Our Minds” (The New Yorker, Feb 27, 2017); Yuval Harari, “People have limited knowledge. What’s the remedy? Nobody knows” (New York Times, Apr 18, 2017)

[4] Tobias Beer et al., “The Business Logic in Debiasing” (McKinsey, May 2017)

[5] Harari, “Why Fiction Trumps Truth,” (The New York Times, May 24, 2019)

[6] Statista (June 16, 2021); Latterly (2021); Mark Bauerlein, The Dumbest Generation (New York: Penguin, 2008)

[7] Ker Than, “US Lags … Acceptance of Evolution” (Live Science, Aug 11, 2006)

[8] Susan Jacoby, The Age of American Unreason (New York: Pantheon, 2008)

[9] Catherine Rampell, “Americans … believe crazy, wrong things,” Washington Post (Dec 28, 2015)

[10] Mark Morford, “Human stupidity is destroying the world,” alternet (Mar 20, 2013)

[11] John Amato, “25% of Americans don’t know the Earth revolves around the Sun,” National Science Foundation (Feb 25, 2014)

[12] Ibid.

[13] Bauerlein, op. cit. The Dunning-Kruger effect is a well-established phenomenon in which those who know little believe they know more than others. Angela Fritz, “What’s Behind the Confidence of the Incompetent?” (Washington Post, Jan 7, 2019)

[14] Michael Hayden, The Assault on Intelligence (New York: Penguin, 2018); Anne Applebaum, “A world without facts,” Washington Post (May 20, 2018); Tom Nichols, The Death of Expertise (New York: Oxford, 2018); Jennifer Kavanagh and Michael Rich, Truth Decay (Santa Monica: The Rand Corporation, 2018); Adrian Chen, “The fake news fallacy,” The New Yorker (Sep 4, 2017); George Will, “The high cost of cheap speech,” Washington Post (Sep 21, 2017)

 

Questions

We invite you to read over the above analysis, and then answer the following questions. Send your responses to Prof. Halal at Halal@GWU.edu. Please answer all questions on a scale from 0 to 10, and provide comments as well.

 

1. How serious is the Mis/DisInformation problem now?

 

 

Owen Davies: Catastrophic. It feeds “conservative” nihilism and justifies any extreme in their holy war against liberalism. Without mis/disinformation, for example, the current movement among Republican state legislatures to overturn elections that did not go their way and replace electors with their own partisans would be shouted down by their own voters (assuming their claims to love America, freedom, rights, etc., have any basis in reality.) This is an existential threat to American democracy, and democracy is losing.

Nir Buras: Most likely worst since the late Middle Ages.

Fran Rabuck: A more serious problem is a swing to the other side. Control of communication has never had good outcomes in the past. Before we worry about policing our media, we need to focus on policing our streets. I believe that over time the masses will respond, and a fractioning of information and followers will evolve.

Xin-Wu Lin: When Covid-19 got to the US, some local Chinese groups rushed into superstores to collect groceries, just because of disinformation in some Chinese discussion groups. It did not happen again once the rumor was clarified.

Margherita Abe: This problem affects multiple areas of current life, not just the political arena, and makes discussion and implementation of solutions to ongoing problems (like climate change, COVID vaccine use and other issues related to public health measures, and widespread environmental degradation, to name just a few) almost impossible to imagine.

In the current legal environment, these social media companies have no constraints on their behavior and face little if any pressure to effectively manage their media content. Expecting them to “police” their content is extremely optimistic and possibly naive.

Craig Boice: The problem has always been serious, and uncontrollable, but belief systems had ways of dealing with it. In a closed-minded belief system like the one we are moving to now, there is only the reflected murmuring of like-minded group members, and static. Information and disinformation are both merely data. The group will still react to stimuli, like a flatworm, but it is a form of unconscious life.

 

2. How effective would it be to have media companies self-manage their content? 

Ian Browde: Can we address the challenge by certifying sources of information more accurately? For example, sources of propaganda, while ok under the 1st Amendment, should not be labeled “news organizations,” “news people” and so on much like fiction is not non-fiction.

Owen Davies: We already know the answer. Self-management programs work only to the extent that they do not reduce company profits. That is to say, not well enough to have any benefit.

Nir Buras: The hybrid digital communications/information/news platforms—so-called “social” media—have no long-term use other than spreading misinformation. Do you trust the people who created the problem for their personal gain to fix it?

Milind Chitale: The general comment on this topic of self-regulation of and by the media is that IT CAN BE BIASED. Period.

Fran Rabuck: Not very effective now. Social Media has its Bias and will execute under that assumption. They also have financial interests to favor their advertisers. And they have obvious political leanings. I believe they have a right – under current laws – to do this.

Peter von Stackelberg: Media companies, if we include social media (i.e., Facebook, Twitter, etc.), content aggregators and search engines as well as traditional media outlets like TV and newspapers, are doing a terrible job of self-managing content. Social media and search engines are the worst, scoring 0 out of 10 in terms of managing content. They pretend they are just carriers of information, but their use of algorithms to focus the attention of users on content belies that assertion. Traditional media (i.e., New York Times, Washington Post, local newspapers, etc.) are somewhat better, but there is wide variability depending on ownership, political perspective, etc.

Peter King: Many of the social media companies skirt around the laws governing the media by claiming they are messenger services.

 

3. How about Federal regulation of media companies?

 

Owen Davies: Not a prayer. The Republican hardcore would view Federal regulation as proof that liberals and the “Deep State” (whatever that is) were conspiring to pollute their precious bodily fluids. This would further harden their positions, assuming that remains possible. At best, it would convince the less rabid minority among them, and also many genuine independents, that “government overreach” had become a problem requiring correction.

Dennis Bushnell: The only ‘fix’ that makes sense IMHO is to have the Govt. develop a site where folks could send the stuff to get a read wrt the accuracy, so your query wrt an AI to do this is cogent. It is not a panacea; in a democracy there cannot be one. Cannot make folks do the checking, or believe what the site tells them.

Milind Chitale: Media is like a wildfire. Regulation of Media by Governments is a double-edged sword.

Fran Rabuck: First Amendment. Social media is now media and needs revision. I fully expect changes to Section 230 of the Communications Decency Act. Some bills are now in Congress on this — https://hbr.org/2021/01/are-we-entering-a-new-era-of-social-media-regulation

Peter von Stackelberg: Regulation is going to be difficult in any political climate because of issues of free speech, censorship, etc. In today’s polarized political climate, government regulation will be difficult if not impossible.

Leopold Mureithi: Fraught with bureaucracy and corruption. 

Peter King: The biases of media owners and advertisers will remain regardless of government regulation.

Margherita Abe: If these companies were held liable for their content, then the Federal Government would be able to “police” them. A change in their liability would help; requiring that they have the same constraints as a utility company would work here.

Craig Boice:  How about criminal charges against individual executives who fail to execute their public duty in assuring that their services are safe? Public trust is a vital asset we hold in common. Without trust (the ability to accept information based on its source, rather than our personal verification) our society comes apart. Oxycontin, jet aircraft, and Facebook all seem necessary to our society  —  and all of them pose immense risks. 

Owen Davies: I doubt that effective regulation of content validity is constitutionally possible. Otherwise, technology provides too many ways to bypass regulated media: private servers, ad hoc networks of private websites, “alternative” social media based in countries that ignore American regulation.

4. How effective would it be to develop social cultures that encourage a social ethic of “Seeking Truth”?

 

Owen Davies: Not at all. The great majority of so-called conservatives already believe that they seek the truth. They of course recognize truth when they see it because it supports their political positions. Claims that do not are almost by definition untrue–sincere error at best and more likely partisan lies. We also see this problem on the far left, but to a much smaller degree.

Nir Buras: That is an individual choice. We would likely need a new Continental Congress for that. Goodness and truth always win out. Sometimes it takes them a very long time.

Milind Chitale: In the 21st century, seeking the truth has completely lost its meaning, as truth is a tinted vision of reality. As beauty is in the eye of the beholder, truth is similarly tainted by our beliefs, experiences (good and bad), personal views on important things… all these are greatly varied even in the most ethical people of the world, leave alone meek and mere mortals.

Fran Rabuck: Where does “truth” begin and end? Will we seek truth in religion? Science? If we encourage a single mindset for all – we stifle innovation, progress and even overall government.

WE ALL HAVE Bias of some type.  “Trust” Factor could be included in other business/metrics – BBB, Sustainability measures (many), and even Financial Statements. If this is truly a problem all these external groups could share in collective pressure to report the “truth.”

Xin-Wu Lin: Transparency and hard evidence could help!

Peter von Stackelberg: I am somewhat hopeful that American society as a whole does value the truth, although reading the news often shakes my confidence in American society. I think western European nations are more likely to have a social ethic of “seeking truth”. Part of America’s problem is that it has had more than a century of being the dominant “truth” globally. Americans are used to dominating the world in terms of culture, with the American version of the truth being the only truth. It will take a lot to shake Americans out of their complacency and get them to see that American social and political values are built on a mountain of myth, propaganda, and outright lies that are deeply ingrained in the American consciousness. Americans as a whole are very reluctant to accept the negative aspects of their history.

Leopold Mureithi: Culture change is intrinsically difficult and long, long term. 

Peter King: Not only a social ethic is required but also the willingness of individuals to question their own biases.

Craig Boice: Seeking truth has long been a quest some individuals have chosen. However, the search for truth is personal, and cannot improve either the quality of information or our society’s ability to process information properly. Accepting truth might be a skill to develop for some; others could try to master the dangerous skill of creating truth. Respect for the truth is different and could be a learned social norm, with appropriate rewards and punishments.

Owen Davies: But only if it were possible. I do not believe it is. If it were, such a change would require two or three generations at best, rather than being available when it’s needed. Any such attempt today would be rejected by the truth-challenged as liberal indoctrination.

 

5. Would strong national leadership help?

 

Owen Davies: We have strong national leadership. They are called Republicans. They don’t help.

We also have weak, incoherent, and narcissistic national leadership. They are called Democrats. They also don’t help.

Some progressive replacement for the Democrats–a party with a shared sense of priorities, coherent policies, and enforceable party unity–would, in my view, have a chance of being effective. Let’s not hold our breath until it appears.

Nir Buras: Dictatorship, no. Ethical politicians? I pray for that. People who are for the Constitution—I pray for that.

Milind Chitale: Having strong leadership at the national level always helps. Good leaders establish a sense of right and wrong at the outset in the top line of command, and this trickles down to the common man, establishing truth and righteousness. But a strong leadership with a bent vision would be equally hurtful in this endeavor.

Fran Rabuck: I think you’re leading the question here. What is strong leadership? Dictatorship? Should our leaders be religious? Boy Scouts? Are the leaders in cities that have strong lockdowns right? Name an example of a strong leader in the US now. NY? LA? Seattle? Chicago? White House?

Xin-Wu Lin: It depends on whether national leadership can earn people’s trust.

Peter von Stackelberg: National leadership is vital. However, I have serious doubts that we will see it. The political right is not willing to address the issue of misinformation and disinformation because it goes against their political interests. A vocal minority on the political left is also problematic, particularly with the “cancel culture” that has sought to silence diversity in thought and speech.

Carlos Scheel: Find a straight, transparent and capable leader. We need a Champion.

Leopold Mureithi: Leading by example would help, yet endemic suspicion of politicians dents credibility.

Peter King: One could argue that strong leadership aka dictatorship is part of the problem, rather than part of the solution.

Craig Boice: It depends on what kind of strength, and what arena of leadership. We need leadership by example  —  leaders who respect the truth and seek it. We need leaders in journalism (e.g., Walter Cronkite), education (e.g., Mr. Rogers), and science (e.g., Drs. J. Robin Warren and Barry Marshall).

Owen Davies: It depends on whether those strong leaders oppose extremism or find catering to it politically useful. The United States has too few of the former and an endless supply of the latter. A rating of 2 is an upper limit.

 

6. How about AI systems that automatically detect and remove inaccurate content?

 

Owen Davies: There are at least three issues here:

AI replicates the prejudices and serves the purposes of its designers. This means that computerized content moderation developed by social-media companies will never be effective enough to hurt the developer’s profits. It therefore will never be effective enough to benefit American politics and government. There will never be enough resources to make content policing, whether by humans or by AI, effective, even if it theoretically could be. 

The social media and some elements of traditional mass media earn significant revenue by serving up half-truths and baseless propaganda. Until change is forced upon them, mis/disinformation seems likely to remain the strongest single force in American politics.
 
Unfortunately, there are few options for their correction, and none seems likely to be effective. Their consumers will never ask for change. Their sponsors will act only after catastrophes like the January 6 insurrection, and that response seems likely to prove limited and temporary. No attempt by government to enforce standards of truthfulness will be Constitutional. The FCC’s abandoned Fairness Doctrine, if resurrected, would die within 24 hours after Congress next changes hands, and today’s Supreme Court would gladly ignore precedent to declare it unconstitutional.
 
In the end, our present situation may change only by slow, spontaneous evolution. What force could make truth and fairness the fittest to survive is unclear.

Peter von Stackelberg: With technology enabling ever more realistic audio, video, and virtual experiences, how do we deal with real vs. fake?

Tom Ables: Can AI/ML engines on social media detect what my predilections are as opposed to another’s, and thus take inflow and sort it into different ideological “bins” for distribution? Articles A, C, Z for person 1 and articles B, Q, X for person 2? And so on?

Jerry Glenn: AI systems that identify and trace deep fakes are improving. Create a “cognitive immune system” for the individual and community.

Young-Jin Choi: I’m afraid the problem is bigger than can be solved by AI or social media regulation alone. The critical reasoning capabilities and worldviews of at least a critical mass of those who are currently misinformed need to be elevated by a public awareness/education campaign.

Milind Chitale: AI has become a dangerous tool. We have seen deepfake videos that take real people and put words and phrases into their mouths in the most believable videos ever, even though we know they are fake.

Hannu Lehtinen: Artificial intelligence will certainly be used to verify information. It will become a competitive factor in the media industry. We don’t have time and we don’t want to read or listen to bad information.

Fran Rabuck: First, I suspect this is already happening behind closed doors at media companies now. How do you think they discover those questionable posts to begin evaluation? Historically, insurance companies would digitally monitor all outgoing messages. They didn’t want agents saying the wrong things legally. It was useful, but not perfect. Language translations and idioms in cultures make this more difficult. And as many in the AI industry now fear, we are building bias into all AI systems. AI may help, and I expect applications will surface, but it’s not the silver bullet to the problem. We can probably prevent George Carlin’s “7 words” from leaking out, but the full content of messages is a huge challenge.

Peter von Stackelberg: It really depends on who is programming the AI systems and whether they can effectively compensate for their own bias.

Leopold Mureithi: Who can you trust to control it? It can be abused. Combine with question 2 above, i.e., self-regulation by a representative body of the private sector, civil society, governments, etc.

Peter King: The concern would be who gets to write the algorithms and to what extent their own biases are included.

Margherita Abe: A big issue with AI is that it mirrors the worldview/beliefs/attitudes of the entities that create/train it. If trained by Facebook, for example, it would reflect FB’s worldview. This may result in a very skewed assessment of content being reviewed rather than an objective assessment.

Owen Davies: Witness the report several months ago that QAnon members had evaded Facebook’s AI by merely avoiding the term “QAnon” and Facebook’s temporary ban on the word “breast” even when followed by “cancer.” AI will deal effectively with human communication written to avoid obvious trigger words only after it is too late to help with this problem. And, of course, if it worked the social-media companies would never use it.

 
7. How effective would educational institutions be in developing critical thinking?

 

Milind Chitale: Overall improvement in pre-primary and primary school education to invoke and seed ethics very early in life.

Imposing a structured course, compulsory for all students in tertiary education, which will empower them with the tools to seek the truth and separate the chaff from the grains of information is an immediate need of the hour, and must be part of every nation’s education policy.

Knowledge is power, and knowledge is truth, so any attempt to keep society knowledgeable and ethically bound will help solve the major issues here.

Hannu Lehtinen: Schools should teach checking information and “why and how to doubt information.”

Ian Browde:  Is mis/disinformation a topic that should be taught in school, at all levels? Can this be overcome by learning/teaching critical thinking?

Art Murray: I’ve hesitated to respond because I think framing the question in terms of mis/disinformation doesn’t go deep enough.  Instead, I’ve been thinking along the lines of… ”How do we overcome learning apathy?”

Jonathan Kolber: The same kids who are bored and unengaged with industrial education systems will play well-designed video games with full intensity for hours. I submit that the major difference between such kids and those who change the world is that the latter figure out how to bring that intensity to a real-world game, which they are committed to both playing and winning. An educational environment that cultivates student curiosity, purposeful play, risk-taking, and self-directed learning may support this outcome.
 

Fran Rabuck: Education! Starting at a young age, we should begin to educate students to be critical media consumers. We can start by exposing them to different opinions. Courses in logic would be most useful at later ages. (A college course in this has proven most helpful for me through the years.) Focus on creating independent thinking. Teach the ideas of critical thinking: analysis, interpretation, inference, explanation, self-regulation, open-mindedness, and problem-solving. Most learn the Scientific Method – which is just the start. Understand the basic theory and interpretation of probability and statistics. There are several efforts and games available now that are just starting to make their way into the system. More later.

Work to eliminate Bias in education, workplace, government, etc. Not just racial bias – but all bias that we naturally have. At a minimum, we all need to recognize and accept Bias – not necessarily agree.

Xin-Wu Lin: Training the trainers is the first step and very important.

Peter von Stackelberg: Educational institutions could play a significant role in developing critical thinking. The question is whether there is the political will to do so. K-12 schools are doing a very poor job of developing critical thinking. Colleges and universities are doing a slightly better job. However, the focus on STEM curriculum neglects developing critical thinking in students. Many STEM curricula seem to assume that the courses they teach are free of social and political values. The de-emphasis of the arts, humanities and social sciences in the educational system from start to finish contributes significantly to the lack of critical thinking skills in American society. 

Leopold Mureithi: Education is the foundation of character formation.

Peter King: Education is part of the answer, but educational institutions would need to be transformed first. In addition, everybody would need to adopt a lifelong learning approach, as education does not stop at school/university.

Margherita Abe:  Of all the possibilities for a change I am most optimistic about education as an effective way to develop critical thinking in its students. This would be a decades-long commitment but an overhaul of the US educational system is long overdue…   

Owen Davies: Again, right-wing fantasists would consider this to be liberal indoctrination. If it were seriously proposed, it would be shouted down. If it were enacted, it would be contradicted at home and would promote the spread of homeschooling. It seems likely as well that most who completed such a course would forget it as quickly as students forget geometry and foreign languages.

 

8. How probable is it that mis/disinformation will sway US Presidential elections in 2024?

 

Xin-Wu Lin: The conflicts of interest between the US and other countries like Mainland China are becoming more critical. Disinformation will come from everywhere.

Hannu Lehtinen: In the U.S. presidential election, it is customary to spread false information about candidates. Because the candidates are usually nearly equal and the winner takes all the votes of a state, the current electoral system is sensitive to fake news. The electoral system should be improved – e.g., by counting only direct votes for the candidate on a US-wide basis. It will probably take years to change the electoral system. The probability of the result turning false due to mis/disinformation is 20% in 2024 and 2028, but 10% in 2032.

Owen Davies: Whatever the source, the social and mass media pick up the most incendiary lies and propagate them as widely as possible. Thus, there seems no practical difference between domestic and foreign mis/disinformation. The same means will be used to combat them, with equal success or failure.

Leopold Mureithi: This is the nature of politics, no?

 

9. How probable is it that mis/disinformation campaigns originating in other countries will change the outcome of US Presidential elections in 2024?

 

Hannu Lehtinen: The US was warned by the 2016 election. It is unlikely that foreign parties will succeed in changing the outcome of a US election a second time. However, domestic election campaigns can do that.

Owen Davies: In 2016, we came as close as possible to having the outcome of a Presidential election changed by mis/disinformation. However, not even the unprecedented efforts of Vladimir Putin and his proxies could get the job done. In the end, putting Donald Trump into the White House still came down to last-moment manipulation by the FBI director, then as trusted a figure as could be found in government.

It can be argued that a year of mis/disinformation prepared the way; without that, even the biased announcement of a renewed probe into the Clinton campaign probably would have failed to sway the election. If others view mis/disinformation as the critical factor, it is a reasonable interpretation.

My own view is shaped by 2020, when Mr. Trump and his allies lacked that final push. Despite a campaign of mis/disinformation, with tactics presumably refined by the experience of 2016, this time including serious efforts by China and Iran, Trump lost. I suspect this sets the pattern for future elections. However, for 2024, the odds against changing the outcome appear no better than 55-45.

They should improve in 2028 and beyond. Although I cannot imagine how, it seems likely that some modestly effective counterweight to mis/disinformation will be developed in the next few years. More significantly, the electorate will include more voters from the internet-native generations, many of whom are used to filtering out the worst of the garbage the net brings them. These factors will make only a small difference in voting, a few percent at best. Yet, even this will be enough to offset any growth in the mass or sophistication of mis/disinformation.

I believe a much greater menace to democracy is the attempt by state legislatures to give themselves authority to overturn elections that don’t go their way. Republican assets on the Supreme Court are likely to ratify these efforts as part of the states’ Constitutional authority to set the “manner” of elections. At that point, mis/disinformation and efforts to defeat them become irrelevant.

I like to believe that the growing influence of the internet-native generations will slowly turn extremist propaganda into one more category of spam. It seems at least as likely that those raised by extremists will, on average, believe that the ways of their fathers are the ways of the gods, offsetting any potential benefit from generational change.

 

10. Considering all of the above, how serious generally is the Mis/DisInformation problem likely to be in 2030?  

Leopold Mureithi: History repeats itself. 

Peter King: It can only get worse.

Margherita Abe: Even if efforts are made to change the current situation, these efforts are likely to require time (more than a decade) to bear fruit.

Craig Boice: Disinformation will be more serious than it is now, because messing with human perception will continue to become easier and cheaper, and we are unlikely to impose more serious criminal consequences on those who do so. We are in a phase of legalizing drugs, not restricting them. Other belief systems will continue to attack us because they can, and because we’re attacking them. In the past few decades, we have revealed to the world that as a nation, Americans can be deceived rather easily, and there are no consequences for doing so.

Many great religions and cultures find deception to be among the worst evils. The first step for us would be to recognize that wisdom. We could condemn and prosecute those who do not respect the truth: not those who have well-reasoned opinions different from our own, but rather those who deliberately lie and mislead, and those who deliberately manipulate or distort the search for truth. Today, our laws are much harsher on criminals who attempt to harm our bodies than those who attempt to harm our minds.


Fran Rabuck: Much of the future direction for the US and the world depends on government and education systems. Hopefully, we’ll see more open opinions and better-educated consumers/students over the next 10 years. Going back to my comment on DRIP – I think the future will supply us with more data than we can realistically consume – and we’ll just decide who/where to get our information from based on our personal bias.

 

11. What other solutions are possible?

Leopold Mureithi: A mutually enforceable code of conduct.

Yair Sharan: Defining disinformation as a crime with significant punishments.

Aharon Hauptman: Teaching media literacy from kindergarten. Using social media “influencers” to promote awareness and recommend reliable sources. Establishing “self-correcting” mechanisms that combine human & artificial intelligence. (The imitation model is something like Wikipedia, but should be more sophisticated)

Hellmuth Broda: I do not see a straightforward solution to the growing issue that truth is being sacrificed for the dominance and power of a minority. By ignoring facts and building a parallel reality, populists and strongmen around the world are trying to toll the death knell of democracy, building oligarchies and kleptocracies in its place. Democracy can only work with an agreed base of facts that voters accept. Discussions and differences of opinion must center on options for policies, not on different facts and “realities.”

Xin-Wu Lin: We should encourage more NGOs from different domains to assist in clarifying and reporting the truth of information. Those NGOs could apply “civic tech” to encourage or incentivize people from different domains to clarify information voluntarily. The public could label some NGOs as trustworthy. NGOs should have good governance, at least with a transparent process. Preventing or detecting Mis/DisInformation early cannot count on the federal government and media companies alone; civil participation and engagement will be a big help. However, people need easier-to-use tech and incentives for doing that.

Chris Garlick: Civil and criminal penalties should be considered for those who pass misinformation or misrepresent events.

Hannu Lehtinen: Search engines should sort out low-quality information – Google has started this. All media should check their pieces of news in advance and (at first) automatically, by checking the sources and the quality of the sources.

Ian Browde: Form a privately and publicly (nonpartisan) funded consortium against mis/disinformation, chaired by a board of diverse representatives, with representatives of the top 500 companies as well as educational institutions, small and medium-sized businesses, nonprofits and community organizations, to create a strategy for minimizing mis/disinformation in American society; turn it into an easily accessible online forum and share it with the rest of the world.

Peter von Stackelberg: First, I think there needs to be a range of solutions. The misinformation/disinformation we are grappling with are symptoms of deep systemic problems coupled with the rapid pace of change with information and computing technology. Addressing these problems effectively will require systemic solutions.

Dennis Bushnell: A govt.-supplied AI capability where folks could send stuff for a believable evaluation.

Carlos Scheel: A strong rule of law that promotes an inflexible and universal education based on the ethics of truth.

Leopold Mureithi: Moral rearmament: Thou shalt not lie; The Golden Rule. Do unto others as you would have others do unto you. Again, a long haul in civic education and mutual fairness. 

Peter King: A constant campaign to name and shame purveyors of disinformation – taking every opportunity to call them out and show who is calling the shots behind the scene.

Craig Boice: The only solution, as philosophers have long recognized, is the development of a citizenry who (1) believe there is truth and respect it, (2) reason well, (3) recognize and respect evidence, (4) remain open to new reasoning and evidence, (5) link judgments to values, and (6) dialogue with one another effectively. Unless individuals learn to use their minds, trust never expands beyond the group, and the group rarely learns.

Owen Davies: Three approaches occur to me. Two are tangential and simplistic to a degree that makes them automatically suspect. Neither would have many benefits by 2024, nor by 2028. The third is a never-ending game of whack-a-mole that would be of limited effectiveness and could incur retaliation that the US to date has been unwilling to face. I very much hope someone else will suggest something better.

One is to restore the civics classes that were standard fare in grammar school six or seven decades ago. They taught not only the basics of government but that citizenship brings more than rights. It brings responsibilities to the community that are not captured in the phrase “blood of tyrants.” That last is a lesson too much of the population seems to have forgotten. Call it two generations to make such a scheme work, assuming it could. Raising and training teachers who truly believe such classes are essential would be a long process.

Second, attack the worst of the extremists. Treat the militias and their political kin as the FBI once treated organized crime. Investigate vigorously. Prosecute energetically on any charge that presents itself–illegal weapons, incitement of violence, armed confrontations like the Bundy standoff of 2014, or whatever. Seek the maximum sentences possible. Convince those who are open to the message that extremism is an unattractive aberration. Convince the rest to keep their heads down. No event will supply a better opportunity to apply this approach than the insurrection of January 6. Position it as getting tough on crime.

The third is to identify and block foreign sources of disinformation. This effort would be essentially identical to any anti-hacking program but on a much larger scale. It is one place where AI might be helpful. This would be an expensive long-term program, and it would invite retaliation from the source countries, much as an all-out effort to stop Chinese cyber-espionage would. However, I rate this problem more serious than Chinese hacking because it strikes directly at the foundations of American democracy. Accept that, and the price becomes worth paying.

Clayton Rawling: I would suggest criminal penalties for clear misinformation that causes serious harm to individuals or businesses. Freedom of speech is not license to use speech as a weapon or strategy to destroy when saying things that are verifiably not true. Civil penalties and private lawsuits are ineffective because lawyers (me) will not sue and obtain judgments against indigent bad actors; they are uncollectible, and it amounts to a Pyrrhic victory at best. While this gives the state a very serious hammer in the commons of discourse, we presently have really nasty people ruining lives with impunity. I do not see any good options, but the present status quo is untenable as this escalates.

Dr. Peter Bishop has a project called “Teach the Future” where he is trying to get a foresight curriculum into the high schools to teach critical thinking to our young people. I think AI, if used appropriately, could go a long way to exposing what is going on. 
 
We are clearly a world civilization in transition, which creates a lot of fear in many people. The lunatic fringe can connect with others and, using powerful communication tools, can create a lot of destruction. The easiest example is Covid-19 death numbers. A plague that should have killed fewer than 100,000 people in the USA now totals 650,000 dead. Another example is QAnon, where there are people who claim that liberal Democrats have sex with infants and then eat them. Suing a belligerent nut job with no assets or income will do nothing to stem the tide of this onslaught against the age of reason. Without true accountability, we will soon be drowning in this. Each year it becomes more oppressive. How do we get quality people into government when they know that service comes with death threats and constant slander as a part of public life? Death threats are clearly against the law, but law enforcement is slow to act except in the most egregious cases. We need the law enforced against these people. They are actual criminals, plain and simple, yet most act without consequences. For some reason, we have allowed our metaverse to become lawless.
 
Back in the ’60s, Big Tobacco declared war on science and claimed there was no link between smoking and lung cancer. They got away with it for several decades until the state attorneys general threatened them with criminal prosecution for fraud. Lung cancer deaths were over 400,000 per year before the government acted. The tobacco litigation that followed, by state governments, then forced Big Tobacco to repay all the Medicaid cancer treatment costs borne by the states for their activity.
 
Big Oil ripped a page from the Big Tobacco playbook and went to war on science to deny global warming and allow the unrestrained burning of hydrocarbons. It has spilled over into anti-vaxxers and conspiracy lunatics claiming Bill Gates put a chip in the vaccine to track them. This unethical and irresponsible behavior is not without consequences. Private lawsuits could not combat Big Tobacco and are now unable to combat Big Oil. It will require the state to intervene if we have any hope of reversing the damage. I do not view big government as benign, but the alternative is to leave us exposed to the mob. As I said, I see no really good options at the moment.

 

General Comments

Ian Browde

My sense is that the platform company business model is the problem. Here is my rationale.

Section 230 is being used by platform companies like Facebook, Twitter, etc., to evade responsibility for curating and editing information. Consequently, we find ourselves plagued by claims of 1st Amendment support for any idea, thought or opinion, whether or not supported by any evidence, advanced by people with an ax to grind, an ideology to push or a belief system to maintain. When these ideas are promulgated by elected officials or famous people, they tend to carry more weight than they would typically, and there is little the public can do to mitigate their effect.

Companies that are supported by advertising could and should be regulated like advertising companies. Facebook, Google, Twitter, et al. fall into this category for the most part. Others that push ideology and/or propaganda (Fox News a lot of the time, CNN some of the time) should be regulated as newspapers or entertainment companies and hence not be permitted to use the word “news” in their name.

The key to understanding this whole issue is an understanding of what a platform company is. It is a 3-legged stool of technology (the infrastructure), data (knowledge about users, subscribers and their likes and dislikes, etc.) and community (where folks feel they belong and resemble others in the same group). The key to a platform company’s success is the “network effect,” i.e., people generating buzz and telling others about it. Going viral is the most obvious example of the network effect. In order to keep people engaged and enhance the network effect, content needs to be more and more edgy, more and more titillating, more and more outrageous. Until we understand that the business model is the problem, things will only get worse.

Mis/disinformation in society is a ‘wicked’ problem. In other words, it is a social and/or cultural problem that is so difficult, impossible even, to solve that it requires investigation and addressing in a complex way.  Some of the reasons for its ‘wickedness’ are incomplete or contradictory knowledge, the number of people and opinions involved, the background context of uncertainty and fear, the prevalence of demagoguery (this might be a result of the problem, not the cause too), the large economic burden or opportunity, and the interconnected nature of the wicked problem with other problems.

So other questions might be:

  1. Is mis/disinformation a topic that should be taught in school, at all levels?
  2. Can this be overcome by learning/teaching critical thinking?
  3. Is there such a thing as “accurate information?”
  4. Are there certain facts that are uncontestable, for example, the earth revolves around the sun? If so, should teaching/communicating other possibilities be banned?
  5. Can we address the challenge by certifying sources of information more accurately? For example, sources of propaganda, while ok under the 1st Amendment, should not be labeled “news organizations,” “news people” and so on much like fiction is not non-fiction.
  6. What is the difference between a fact and an opinion?
  7. What is evidence of mis/disinformation and what types of evidence are trustworthy?
  8. Is the phase (in the USA especially) we are going through of mis/disinformation really a transition from democracy to autocracy and the way that occurs is for people to develop consensus around an alternative reality?
  9. Is the disease we are identifying, mis/disinformation, the precursor to a world where people are not trusted and AI (artificial intelligence) is? 

NOTE – BJ Fogg at Stanford warned against computers as persuasive technology many years ago. Are certain societies more or less prone to mis/disinformation than others, and can we learn from them?

 

Peter King

There have always been snake oil sellers but now they have the communication means to reach millions anonymously.  I think your questions may elucidate the answer but if not, an additional question may be: “What is the most significant fundamental reason, or reasons, why mis/disinformation is influential, persuasive or effective?” 

The disinformation situation has become so bad in Thailand that they have had to create a new Anti-Fake News Agency. There is massive concern however that cracking down on “disinformation” that may actually be true but shows the government in a bad light, is akin to State censorship. One area where disinformation is rife is in the “comments” section of online newspapers like the Bangkok Post, where trolls cut and paste the same anti-vaccination nonsense every day and evade the censor by changing their avatar every few hours.

More than 25 celebrities and influencers are being investigated for insulting the government over its handling of the Covid-19 pandemic, deputy Metropolitan Police Bureau (MPB) commissioner Pol Maj Gen Piya Tawichai said. “Under the Computer Crime Act, offenses, such as putting false information into a computer system, which causes damage to people, carry a fine of no more than 100,000 baht and/or a jail term of no more than five years.” (Bangkok Post, today.)

The Government has threatened to invoke Article 9 of the Emergency Decree, which was enforced on July 15. According to the decree, strict action will be taken against people spreading false information or fake news to cause fear or shake the state’s stability. However, media organizations pointed out that this announcement aims to limit the freedom and rights of the people and the press. Branding reports as “fake news,” they said, is only an excuse for the authorities, and they are calling on the press to demand that the Government stop using this excuse to control the public. (The Nation, today.)

The Thai Government is going even further with its crackdown on “fake news,” intensifying the campaign despite an outcry from media organizations and netizens over an ongoing state clampdown on free expression.

So, a key question, in my mind, is where is the dividing line between disinformation and factual information that makes the government, a company, or an individual look bad? Where are the independent fact-checkers or should that be a public peer review process like Wikipedia? How can fact-checking be made fast enough to head off the disinformation before it has done the damage and gone viral on social media? How can we ensure that anyone posting information online (true or not) can ultimately be traced by law enforcement agencies using some form of digital fingerprint? Under what circumstances should law enforcement agencies be allowed to track down and close purveyors of disinformation?

The EU approach may be worth referring to in the next round: Online disinformation | Shaping Europe’s digital future. The Commission is tackling the spread of online disinformation and misinformation to ensure the protection of European values and democratic systems.

 

John Meagher

These are excellent questions.
Mis/disinformation has been a problem with long historical roots in many societies, past and present.

 

Carlos Scheel Mayenberger

The theme of fake news and disinformation is quite complex and impossible to clarify if the population does not have a solid structure of values and beliefs. I think it is not a problem of the source but of the receiver. A person who is well informed and educated will immediately detect that the claim “the earth is flat” is not correct, no matter the source, or at least can give a well-informed argument against it, even if the informant believes 100% in the truth of the sentence. But who knows what the purpose of spreading it is?

So I think this theme is so complex because it goes directly to the “belief system” of each individual, and to the intention of the news, not the news itself. Being able to separate the true from the false is not a matter of the description of the news or the statistics, but of the “intention behind it.”

 

Peter von Stackelberg

While misinformation and disinformation are bad, I think we need to look deeper.

One of the ironic aspects of the Information Age is that we have massive amounts of accurate information available to us almost instantaneously, yet our ability (or our willingness) to use that information seems extremely limited. In the classroom, I see on a daily basis how a generation of students who have access to more information than I could have dreamt of at their age are simply ignorant. It’s not that these students are stupid or unintelligent. In fact, it is quite the opposite. However, many of them have tuned out and checked out.

It’s not just students and the current generation of young people. I am continually baffled by the overall ignorance present in American society. The biggest problem, from my perspective, is not that many people are misinformed or “disinformed”; they are uninformed. I don’t think this ignorance is uniquely American, but it is painfully clear that a significant portion of Americans revels in its ignorance, wearing it proudly and publicly.

Many American institutions, including the educational system and churches, either fail to teach critical thinking skills or actively discourage it. For all of America’s talk of liberty, I think there is a deep, wide authoritarian streak in our society that finds critical thinking a threat. For more than a century, educational institutions have been focused on the transmission of information for application to industrial production. Critical thinking, particularly at the primary and secondary levels of the educational system, has not been seen as particularly important. In fact, I think it has often been seen as a pain in the butt to have students engage in independent critical thinking. 

I think some important questions that need to be asked are:

  • What role should state-run educational institutions play in developing critical thinking?
  • What role should private institutions (educational, church, and others) play?
  • What is the role of the news media, social media, and popular media in developing critical thinking among their audiences/users?
  • How can our society bring about a rapid change in the level of critical thinking? Can it be done in a relatively short time or will it take a generation or two to instill critical thinking into a majority of the population?
  • Is there a political will to ensure people are able to think critically, particularly when that leads to disagreement with prevailing social beliefs and values? (NOTE: We see a sustained campaign on the right to eliminate critical thinking from the educational curriculum. In my opinion, the ultra-left is also not a big fan of critical thinking and independent thought.)

I think it is really important to go beyond the issue of misinformation, disinformation, deep fakes, and so on as they are only the surface layer of the problem. Perhaps questions to ask are:

  • How can society deal with a glut of information? 
  • Is too much information — whether it is true, false, or somewhere in-between — part of the problem?
  • If so, how do we fix that in a democratic society?
  • How did social media become part of the problem? 
  • Are the news and popular media also part of the problem?
  • With technology enabling ever more realistic audio, video, and virtual experiences, how do we deal with real vs. fake?
  • What do we actually mean by “fake”? Is fiction fake? Is artwork developed with the assistance of AI fake?

So starts my list of questions I think are important when taking a deep look at information, misinformation, and disinformation.

 

Salvatore Fiorillo

How much does our (insert any name here: deep state, social media companies) need entropy in information? I spent the last five years in Dubai, and I could feel I was under an information umbrella, but with no entropy at all. Of course, that is a monarchy, and we all understand why there is no entropy there. But for the Western democratic world, is information chaos going to become a management tool?

 

Dennis Bushnell

Much depends on folks’ interest in trying to learn whether something is correct or not; much depends upon “information control”; and much concerns beliefs and personal motivations.

My learning from renewable energy and much else is that a prime motivator is PROFIT: folks will check into things they could profit from, or lose money on. Other issues are the educational level and interests of the reader; many believe what they want to believe; and then there is the herd mentality. As you unpack this issue, it gets really complex and really deep very rapidly. Propaganda warfare is ancient, and in the IT age it is a very rapidly developing art. My previous post stated what I thought we could do about it in a democracy with freedom of speech that might be acceptable to many.

 

Jerry Glenn

Thousands if not millions of pieces of infowar content are already doing damage by the time they are identified and shown to be false. Improved identification is just treading water, and the long-term consequence of treading water is drowning. We have to get ahead of the problem and intervene. Here is a 6-minute edited video of one anticipation/intervention approach, from an hour-long talk for the South Korean Chosun newspaper’s centennial.

Use info warfare-related data to develop an AI model to predict future actions; identify the characteristics needed to counter/prevent them; match social media users with those characteristics and invite their actions; and feed the results back into the data bank to continually improve the model (a rough sketch of this loop appears after the list below). Here are some more measures drawn from the Global Futures Intelligence System; none of these alone will solve the problem, but together they will have impact:

  1. Internet platforms should create automatic prompts when a user is about to forward information that comes from a known disinformation source.
  2. AI systems that can identify and trace deep fakes are improving.
  3. Notify people when they forward proven disinformation that originates from foreign troll farms; people should know when they are unwitting agents of information warfare.
  4. Explore how information systems could build in resilience features, and honeypots that waste the effort behind info attacks.
  5. Create a “cognitive immune system” for the individual and community.
  6. Expose and isolate offenders, apply public shaming, deny visas, and impose countervailing trade duties.
  7. Make “pursuit of truth” fashionable, popular, cool, a road to success, a motto of schools of journalism
  8. Consider issuing “Letters of Marque” to non-government actors to counter information warfare and cyber warfare.
  9. Foster ethics for online systems to clarify issues, show a range of positions on the issues and allow for pro and con arguments on the positions.
  10. Remove financial incentives in social media.
  11. Use a risk-rating system, with an index based on the content, operations, and context of publishers who spread disinformation, drawing on:
    • Metadata and computational signals of news domains, analyzed with an AI program
    • A blind review rating system of news sources based on credibility, sensationalism, hate speech, and impartiality
    • Analysis of how the site’s policies, standards, and rules abide by the Journalism Trust Initiative
    • Analysis of the practices, reliability, and trustworthiness of a site through an independent expert survey
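
To make the feedback loop above concrete, here is a minimal Python sketch of the predict–counter–feed-back cycle; every name in it (InfowarModel, countermeasure_for, the tactic labels) is hypothetical and purely illustrative, not an existing system.

```python
# Minimal sketch of the loop: predict likely info-warfare actions,
# select countermeasures, feed outcomes back to refine the model.
# All names and data here are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class InfowarModel:
    """Toy stand-in for an AI model trained on info-warfare event data."""
    history: list = field(default_factory=list)

    def train(self, events):
        # A real system would retrain; here we just accumulate events.
        self.history.extend(events)

    def predict_next_actions(self):
        # A real model would forecast campaigns; here we echo recent tactics.
        return [e["tactic"] for e in self.history[-3:]]

def countermeasure_for(tactic):
    """Map a predicted tactic to the characteristics needed to counter it."""
    playbook = {
        "deepfake": "deploy detection and provenance labeling",
        "troll_farm": "notify forwarding users of the foreign origin",
        "bot_amplification": "rate-limit and label automated accounts",
    }
    return playbook.get(tactic, "flag for human review")

model = InfowarModel()
model.train([{"tactic": "deepfake"},
             {"tactic": "troll_farm"},
             {"tactic": "bot_amplification"}])

for tactic in model.predict_next_actions():
    result = {"tactic": tactic, "counter": countermeasure_for(tactic)}
    print(result)
    model.train([result])  # feed results back to continually improve the model
```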

According to the National Defense University, a few elements that can be used to identify and deter threats are:

  1. A threat to something of value that exceeds the perceived gain of non-compliance.
  2. A clear statement of the behavior to be avoided or performed.
  3. Clear and unambiguous communication of the threat and the desired or proscribed behavior to the target.
  4. Credible threat, meaning that the actor is perceived by the target to have the will and capability to execute the threat.
  5. Situational constraints that make it impossible for the target to avoid punishment.
  6. Controllability of the threat and its implications by the actor.

Tom Abeles

Much of the responsibility falls on the reader to have good screening tools. The problem lies in those screening tools, which could be developed with a “personal” AI/ML engine. Lacking one, most people cope with their limited time and capabilities by using trusted “others” as a screening mechanism. But all of these have multiple dimensions. For example, there is a large community that “believes” QAnon, Trumpers, etc. as truth-tellers, whereas most of those on this list use other “screeners” to filter the far-right material out.
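
Purely as a thought experiment, a “personal” screening engine of the kind described above might start as something like the following Python sketch; the trusted sources, keywords, weights, and threshold are all invented for illustration.

```python
# Hypothetical personal screening engine: each reader configures their own
# trusted sources and sensational-language heuristics. Values are illustrative.

SENSATIONAL_WORDS = {"shocking", "exposed", "they don't want you to know", "miracle"}

def screen(item, trusted_sources, threshold=0.5):
    """Return (score, verdict); higher scores mean more trustworthy."""
    score = 0.8 if item["source"] in trusted_sources else 0.3
    text = item["text"].lower()
    hits = sum(1 for w in SENSATIONAL_WORDS if w in text)
    score -= 0.1 * hits  # penalize sensational framing
    verdict = "show" if score >= threshold else "flag for closer reading"
    return score, verdict

my_trusted = {"reuters.com", "apnews.com"}
print(screen({"source": "example-blog.net",
              "text": "SHOCKING cure they don't want you to know"}, my_trusted))
```

A real engine would learn the weights from the reader’s own feedback rather than hard-coding them, which is exactly where the “personal” AI/ML dimension comes in.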

 

Margherita Abe

Apart from leaving the onus on the reader or viewer of the “news,” maybe we should consider having social media organizations (Facebook, Twitter, WhatsApp, etc.) assume liability for what is posted on their platforms. Another version of this would be to treat these organizations as utility companies, subject to the same rules that other utilities must follow: government oversight.

 

Young-Jin Choi

I suppose astute observers of our time are likely to conclude that we have entered a “dark information age,” which has led to a dysfunctional public sphere and a further weakening of our already weak democratic institutions. Who would have thought at the beginning of the internet that, with all the world’s knowledge available at our fingertips, so many people could become even less well informed? This epistemic crisis tragically coincides with a truly dangerous period for the future of the human species and the possibility of continued human progress:

1) the risk of a possibly civilization-ending climate catastrophe (which may play out over centuries/millennia but might bring human societies to a breaking point already within the next couple of decades)

2) another possibly civilization-ending risk of geopolitical nuclear conflicts over increasingly scarce resources driven by rising temperatures.

Such a grave emergency would normally require the best of what our institutions can offer in terms of collective foresight, scientific rationality, ability to cooperate, wisdom and empathy.  But they were never designed to prevent or resist the assault on science by fossil fuel-funded think tanks, political operatives, or private bloggers.

I’m afraid the problem is too big to be solved by AI or social media regulation alone – the critical reasoning capabilities and the worldviews of at least a critical mass of those who are currently misinformed need to be elevated by a public awareness/education campaign.

In order to reduce the spread of conspiracy theories (misinformation), the authors of the conspiracy theory handbook suggest four communication strategies addressing the general public:

  1. Preventing/slowing down the spreading of conspiracy theories, e.g. by encouraging people to ask themselves four simple questions before sharing a post (see the small sketch after this list): Do I recognize the news organization that posted the story? Does the information in the post seem believable? Is the post written in a style that I expect from a professional news organization? Is the post politically motivated?
  2. Preventively “inoculating” the public against the techniques of science denial (“prebunking”) by creating awareness about the risk of misinformation (see John Cook).
  3. Debunking conspiracy theories by refuting weak pieces of evidence, and by exposing unjustified/unreasonable beliefs as well as logical/factual incoherences (e.g. through fact-checking, source analysis, etc.).
  4. Cognitively empowering people to think more rationally rather than relying on their intuition. This strategy requires more substantial interventions in terms of education and culture, which are examined in subsequent sections.

(https://yj-choi.medium.com/how-to-tell-the-difference-between-a-conspiracy-theory-and-a-theory-about-a-conspiracy-d89193ebab0)
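
As an illustration only, the four pre-sharing questions in strategy 1 can be expressed as a tiny self-check routine; the questions come from the handbook as quoted above, but the scoring rule below is an assumption made for this sketch.

```python
# Hypothetical self-check encoding the four pre-sharing questions above.
PRE_SHARE_QUESTIONS = [
    "Do I recognize the news organization that posted the story?",
    "Does the information in the post seem believable?",
    "Is the post written in a style I expect from a professional news organization?",
    "Is the post politically motivated?",  # here, 'yes' counts against sharing
]

def should_share(answers):
    """answers: list of four booleans, one per question above."""
    recognized, believable, professional, political = answers
    # Share only if the source is recognized, believable, professional,
    # and the post is not political bait.
    return recognized and believable and professional and not political

print(should_share([True, True, True, False]))   # True: go ahead and share
print(should_share([False, True, False, True]))  # False: pause before sharing
```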

The currently widespread lack of emergency awareness among many political leaders and older generations (many of whom seem to be stuck in the worldview of the 1990s) is explained by this terrific interview with Lee McIntyre.

“3:16:  Linked to this is the hot topic of post-truth. What is post-truth and why is it so horrendous? Is it a version of bullshit? It’s often linked with right-wing fascist and populist agendas but it seems to have many familiar traits associated with left-wing positions too – Derrida and the constructivists and cultural relativists and identity politics etc – do you think there is a lefty zeitgeist informing this stuff as much as a right-leaning one? It seems left populism is just as adept at using this tactic as the right populists. 

LM: In my 2018 book Post-Truth, I define this concept as the “political subordination of reality.” It is horrendous because it is in some ways the exact opposite of science. It’s deciding in advance what you want to be true, and then trying to bend the public to your side. But it is not a version of bullshit. If you read Harry Frankfurt, he quite clearly says that people who engage in bullshit do not care about truth. Well, post-truthers care a LOT about truth, because they are trying to control the narrative about what’s true and what’s not. A lot of them are authoritarians or their wannabes, who understand that the best way to control a population is to control the information they get. Historian Timothy Snyder said it best: “post-truth is pre-fascism.”

Now it’s a contentious question where post-truth comes from, but I think there are several roots. The main one is from science denial. Seventy years of awesome success by those who wished to deny the truth about evolution, climate change, etc., did not go unnoticed by political operatives. One day they said, “Hey, if you can lie about scientific facts, you can lie about anything.” Like maybe the outcome of an election? And yes, I think that one of the other roots is post-modernism, which is largely left-wing. Now they didn’t intend it. They were playing around with the idea that there was no such thing as objective truth, and that perhaps this meant that anyone making an assertion of truth was merely making a power grab. That all sounds fine when you’re in the university doing literary criticism, but at a certain point these ideas began to create the “science wars,” where humanists began to attack the idea of scientific truth. And from there, it leaked out even further and fell into the hands of right-wing political operatives. They picked up a weapon that had been left on the battlefield and began to use it against some of the same people who had invented it. Post-modernists get irritated with me sometimes for “blaming” them for the Trumpian attack on reality, but all I’m really saying is that even if their intentions were good, they caused some damage. George Orwell said it best: “So much of left-wing thought is a kind of playing with fire by people who don’t even know that fire is hot.” 

“3:16: And why do you think we’re in the dark ages about human behaviour and we should do something about it?

LM: One of my earlier books is called Dark Ages: The Case For A Science of Human Behaviour. In it, I argue that political ideology is doing to social science what religious ideology did to natural science about the time of Galileo. I am against any kind of ideological interference in scientific reasoning. To me, “dark age” thinking is emblematic of the type of mind that wants an answer — that wants certainty at all costs — and damn when the evidence tells you you’re wrong. To me, that’s the mark of an incurious mind. Scientists may make mistakes sometimes, but I think their hearts are in the right place. But do you want to know something sad? I wrote Dark Ages with the idea that natural science was pretty solid, and we should build on that to come up with a better way to study and explain human behavior. Then, while that book was making the rounds, science denial started to heat up and suddenly people were attacking the results of natural science! To defend that I wrote another book Respecting Truth, in which I took on evolution deniers and climate change deniers, and tried to highlight the stupidity of their attacks. Couldn’t they see that they weren’t reasoning scientifically? Well, then you know what happened. Things got worse from there. All of that unchecked science denial eventually metastasized into “post-truth”….into reality denial….under Trump. So my career has been marked by me wanting to defend science and extend it, and the world keeps pulling the rug out from under me and attacking science even where it is working. It’s pretty depressing actually. I sometimes wonder if I’m making any difference, yet I can’t give up the fight. When I was a kid and read that World Book Encyclopedia I used to mourn that I was born too late. All of the great ideas had already been discovered. Who was attacking science and reason now? All of the ideologues were dead. Boy was I wrong.” 

 

Milind Chitale

Mis- and disinformation are very powerful tools of the evil mind, as explained below:

Most people today rely on information found on the internet and social media to make decisions and form ideas of what exactly is happening around them.

Depending on their level of education and awareness of real life and real things, they can be swung to any point on the scale from complete lies to the absolute truth by the media they consume. Print media is also largely deregulated and has to pander to sponsors, who may want to seed print items with half-truths, blatant lies, or even the truth itself in some cases.

When people follow these packets of information, they can be swayed any which way, as described in the paragraph above.

This makes understanding the truth deeply connected to the source of information, and with the cacophony of available sources, people find it very difficult to discern the sources that are generally truthful from those that are not.

As seen with common codes of ethics in multinational corporations operating around the globe, the levels of ethics, morals, standards, and the many other things that affect one’s judgment are very skewed depending on the society. So an internal panel that self-regulates the media may sound good to an audience on a matter in one locale, yet that very point may be completely at odds with people in another locale. Examples of points with variable sensitivity: obscenity, blasphemy, religion, cultural aspersions (clothing, style, methods, fashion sense, etc.), and humour (Australian vs British vs African vs Indian, etc.).

Scientific discourse itself is open to debate despite being a rigorous and highly structured field of knowledge. Even today, people debate whether climate change is real or is actually one of Earth’s natural cycles, spanning thousands of years.

Since the dawn of civilization, there has always been a group of people entrusted to deliver the verdict on various facets of truth, such as justice, societal rules, and social equality. Equally true, however, is the fact that these trusted authorities also falter and deliver improper outcomes to the very society they seek to protect.

To put things back in perspective, appointing wise people to such circles is still a necessary facet of society, to keep a ring of reality, authenticity, and balance.

In the hands of a worthy government, this is a very wise thing; in the hands of an autocratic or off-centre government (either far right or far left), it is a very bad thing. The problem is that such governments can appoint a committee on their own anyway, making the discussion rather lopsided.

What should be done is to convene a panel of citizens from a wide range of professions and sectors, together with government officials, all with equal voice and authority, to produce a unified/unanimous regulation of media.

The problem has always been the appointments and how they can be twisted in favour of the governments.

My view on this is that even if it is not going to be very accurate, it still needs to be done, as without it, people speaking a thousand lies can begin to drown out one truth after another and slowly change the entire landscape of reality.

Strong leadership at the national level always helps: good leaders are adept at establishing a sense of right and wrong at the top of the chain of command, and this establishment of truth and righteousness trickles down to the common man.

But strong leadership with a bent vision will be equally hurtful in this endeavor.

AI has become a dangerous tool. We have seen deepfake videos that take real people and put words and phrases into their mouths in the most believable footage ever, even though we know it is fake.

When AI can create its own video games and characters, and completely change the sense of a character and what it is trying to achieve by hook or by crook, this affects kids and enthusiasts very deeply, and these kids will eventually be the citizens of tomorrow, carrying skewed societal values and ethics. Some youngsters have even died from virtual gaming, giving up food and drink for extended periods. This shows the extent to which gaming can be tweaked to change their reality.

We need overall improvement in pre-primary and primary school education to invoke and seed ethics very early in life.

“Appreciating that people are different all around us, and that is OK” – a programme, similar to those of many multinational organisations, to harness the spirit of oneness in a population diverse in race, region, religion, and society – is the need of the hour; it would greatly reduce the pressure on communities to outwit, fool, and edge out others by foul means.

Imposing a structured, compulsory course for all students in tertiary education, empowering them with the tools to seek the truth and separate the wheat from the chaff of information, is an immediate need of the hour and must be part of every nation’s education policy.

Knowledge is power, and knowledge is truth, so any attempt to keep society knowledgeable and ethically bound will go a long way toward solving the major issues here.

Countries that try to fudge each other’s social media for cheap military, economic, and other gains must be knocked down several notches if we are to establish a world order based on truthfulness.

Eliminating the free flow of funds to media is urgently worth pursuing, just like the blocking of funds to terrorism and its activities. Free-flowing funding eventually tips the balance, biasing the media toward the funding sources, which are also the sources of the falsehoods.

 

Hannu Lehtinen

I think one- and two-party systems easily start to spread false information. A one-party system has control over information (the Internet), while a two-party system accepts and disseminates two sets of information, because each party disseminates its own information to its supporters.

The open internet allows us to check information. We already do a lot with smartphones as we search for information about services and new concepts and devices. We are used to biased information in marketing.

The UN should defend the open flow of information more effectively.

Schools should teach checking information and “why and how to doubt information”.

Artificial intelligence will certainly be used to verify information. It will become a competitive factor in the media industry: we don’t have time for, and don’t want to read or listen to, bad information.

 

Art Murray

I’ve hesitated to respond because I think framing the question in terms of mis/disinformation doesn’t go deep enough.

Instead, I’ve been thinking along the lines of… ”How do we overcome learning apathy?”

There’s a growing body of work in this area, and I’ve personally experienced it many times standing in front of a classroom (and less, by the way, when I got off the stage and sat right in the middle of the classroom, among the students).

Students absolutely hate, and are bored to death with, today’s mainstream approach of rote, industrial-age-based “education.”

Yet when we’ve tried to introduce new approaches, such as deep learning (the human, not the machine, variety) students resist because it requires them to think on their own.

Anyway, it should come as no surprise that this growing student apathy toward learning carries over into adult life.

Students keep asking: “what do I need to do in order to get a passing grade?” – for the exam, paper, course, whatever…

So it’s no surprise that when they become adults, they want instant answers to questions (for whom do I vote, what stocks should I buy, when should I retire, etc., etc., etc.) that would normally demand thinking, research and learning, rather than a “one-click” answer at the top of the search results.

If people had a true hunger for learning and seeking the truth, the matter of mis/disinformation would not be anything close to the problem it is currently…

That hunger for learning is inherent, and in many cases, lying dormant.

The question is, how do we awaken it?

Mark Sevening

How data is framed is something that has been troubling me. Even the news outlets are biased: CNN is left-leaning, while Fox News is right-leaning, and the rest fall somewhere in between. What troubles me most is that a highly educated society like ours at times refuses to acknowledge facts and relies on opinions and editorials instead. Those editorials show one point of view and may not present all the facts.

There is also a troubling trend for an ‘informed’ consumer to choose which articles to read, which mostly reinforces their pre-existing views, going as far as rejecting anything that contradicts their beliefs. 

I will give this more thought and post soon.  Good thought exercise, by the way.

Jonathan Kolber

I believe that redesigning the educational system to encourage exploration of what people care about, rather than a fixed one-size-fits-all agenda, will elicit a curious and questioning mindset.

In researching my book, A Celebration Society, our most startling finding was that superstars in all fields examined – including business, sport, invention, science and the arts – characterize their engagement with their fields not as work but as the playing of games.
 
Most of us can’t imagine playing that hard, so we view what they do as work. They don’t. 
 
The same kids who are bored and unengaged with industrial education systems will play well-designed video games with full intensity for hours. I submit that the major difference between such kids and those who change the world is that the latter figure out how to bring that intensity to a real-world game, which they are committed to both playing and winning. 
 
An educational environment that cultivates student curiosity, purposeful play, risk-taking, and self-directed learning may support this outcome.
 
This is not only my view but that of Paul Graham, founder of Y Combinator, the world’s most successful startup incubation system.
 
To those who fear such a system will deprive kids of basic, necessary skills, I invite examination of the Finnish national educational system’s outcomes, which point in this direction and are world-class. Likewise, the unschooling movement, now over 60 years old and present in many countries, has data.

 

Michael Mainelli

Enjoying this thread. A bit of a gimmick against some deep thinking here, but on cyber-security and financial literacy I’ve been pushing for more gaming as well, in earnest. As an example, try to spot real cyber fraud or financial offers that are “too good to be true.”

I often point to an analog example, the Pacific Northwest Tree Octopus – a well-known hoax site used in studies of how people evaluate sources.

 

Steven Hausman

To quote Prof Feynman: “The problem is not people being uneducated. The problem is that people are educated just enough to believe what they have been taught, and not educated enough to question anything from what they have been taught.”

 

Fran Rabuck

This is a great topic, although I’m not sure what you are trying to “predict” for the future. I’ve spent a lot of time over the years in this area of “information” – both data and “facts.” In my math and statistics training, I learned more about the how, why, and proper application of data than about the mechanics of stats itself. I’ve added some quick comments below, but in separate message threads I’ll address some detailed ideas and excellent sources of information on this topic.

Starting [with right-wing problems] seems to build in a bias/setup. Can you balance this with a left-wing example also? It also seems to base bias on political positioning. I highly recommend that the coverage of this topic NOT become politically positioned.

I also recommend, as background on this topic, a little history of the speed of information delivery: Hyde Park/town square, Pony Express, telegraph, newspapers, radio, TV, internet, social media, and the next generation of groupware (private chat sites, metaverse worlds, closed communities, etc.) – a move from broadcasting to narrowcasting of information.

Also, when I talk about data analytics/AI, I usually start with the idea from In Search of Excellence: we are Data Rich and Information Poor (sometimes abbreviated to DRIP).

A “fact,” or piece of information, is interpreted by the individual and guided by his or her internal bias. We all have bias, and this difference is what makes the collective opinions on information so important.
 
There are several movies that show an event from different perspectives. It’s fascinating to see the story unravel and see how each individual is right – from their viewpoint. Here’s a list of some:
 
Everything we experience creates an inner belief system and influences almost everything we do. It’s very difficult to be totally objective. 
 
Most attention recently has been on race and political bias. But there are many more bias systems that are generally recognized.
Here are a few and a discussion on the topic:
 
I might suggest that we survey the TechCast group and ask for a scaled (1-10) response on where they stand personally on these 14 biases. It might be interesting to discover the biases we have as a group. I’d be glad to create the Google Forms survey to publish and analyze.
 
BTW, the above article comes from a related site on Media Bias. They collect newsfeeds and create an unbiased news stream. I’m not judging the validity of this, just noting it. Note also their classification of news sources by bias.
 
Too much emphasis in the political sphere has focused on a general Left vs Right, along with each party’s collective platform. One might argue that we have four “parties” now: Extreme Left (the Squad), Moderate Dem, Moderate Rep, and “Trump” followers. From my own viewpoint – and I do have some bias – none of these four groups aligns perfectly with my total belief system. I suspect the same holds for the majority of people in the US (and worldwide).
 
Dale Deacon

The advent of the internet undeniably changed Earth’s historical trajectory, since it radically disrupted how homo sapiens (the progenitor of the Anthropocene) interacted with information itself. The flow of information is everything. (See Shannon, Schrödinger and von Neumann.)
 
Importantly, with regard to information flow, since the ’90s more and more people have had ready access to validation bias.
 
Today, we can all justify our suspicions, hence the astronomical rise of conspiracy theories (QAnon, UFOs, Pizzagate, flat-earthers, chemtrails, Infowars, etc.) over the last twenty to thirty years or so. We live in a postmodern soup where everyone feels deeply vindicated in their convictions.
So what is to be done? I defer to Aguilar’s (genius) PESTLE framework. 
Politically, we’re at the behest of free-market economics; by and large, Hayek crushed Keynes. Economically, monopolies dictate the terms of consumerism. Socially, we wallow in the muck and mire of social media-induced outrage. Technologically, we seek tools and techniques to address our ailments but are largely incentivized by survival-of-the-fittest competition. Legally, we’ve tied our waning nation-states to outdated ideologies and lobbyists. And environmentally, we’ve become increasingly detached, psychologically, from our ecology over the centuries.
 
This is all very Orwellian, I admit, but I am yet hopeful. Our scientific and philosophical endeavors, I think, might produce solutions to the predicaments listed above. How so? 
 
Politically and legally, transparent and traceable blockchain ledgers have the potential to address corruption and graft. Economically, these same tools are already facilitating the greatest transfer of wealth we’ve seen since the colonial era. (Not to mention the industrial and intellectual productivity to be gained through further robotic and cognitive automation). UBI also holds much promise here, if unequivocally proven viable.
Socially, we’ve never had such vigorous cultural debate, fast-tracked by a litany of social media platforms (see the metaverse for the next iteration of this phenomenon). 
 
All of these advancements are made by scientific and philosophical breakthroughs made possible by an ever-increasing ability to access information. (Even misinformation spreads data, albeit with noise)
 
I despair on the ecological front and suspect we’re in a most perilous circumstance. Perhaps there’s an economic shift that may incentivize us away from peril, but it will take a memetic shift (information/misinformation/disinformation) away from the status quo to do so. With progressivism and conservatism apparently (politically) so evenly matched, I am curious as to how radical change might possibly manifest.
 
Either way, dystopia or utopia, we get the future we deserve, not necessarily the one we want.
 

John Freedman

In my experience and assessment, there are two distinct groups of no-vaxxers/conspiracy theorists, with very little overlap. The vast majority are the ‘innocent gullible’ – unable to distinguish between hearsay and evidence. This large, mostly benign ‘vaccine-hesitant’ group is actively preyed upon by a smaller ‘malignant minority’ of willfully ignorant misinformers/disinformers who exploit gullibility for personal gain. Although the two groups are morally and ethically distinct, in the end the outcomes are equally dangerous and destructive: the two groups together – one victim, one predator – are responsible for the prolongation of the pandemic and the massive human suffering incurred.

The larger group – the gullible masses who are deceived but are not willfully (and thus hopelessly) ignorant – is the one to target. The time to target them is early in the development of cognitive skills and critical thinking abilities. Thus educational institutions are the only sphere where I can garner any hope of addressing the problem long-term in a robust way. 

Part of the pessimism that comes through in the comments from the group is due to the dark fact that the misinformation/disinformation problem is neurobiologically programmed into the human brain by evolution. Our brains are not fundamentally designed to seek truth or distinguish between fact and fiction. They are designed to seek survival and procreation, ends that do not require a search for or appreciation of truth. The Buddha (the world’s first evolutionary neurobiologist, unbeknownst to him) understood this and entreated humans to move beyond their inherent nature and cognition to seek truth and attain freedom from suffering.

We ingenious modern humans have in fact developed a powerful means to seek and identify truth. It’s called science. In its essence, it is simply a mode of cognition that tests falsifiable hypotheses and assesses the preponderance of evidence to identify truth. The essence of all misinformation and disinformation campaigns, including virtually all false conspiracy theories like anti-vax propaganda, is anti-science. Developing the cognitive skills to assess evidence is the only way out of this curse of anti-science that has been amplified by the internet. It is very likely that the actual biophysical neuronal substrate for this is set up in early life, and without it there can be few ‘conversions’ from anti-science zeal to evidence-based rational thinking. To wit, there are precious few COVID deathbed ‘conversions’ of anti-vaxxers – the very few which do occur are considered newsworthy.

Thus my hope rests with humankind overcoming its neurobiological evolutionary constraints, and seeking ‘the better angels of our nature.’  Peaceful non-tribalism and the rejection of anti-science (which are actually related) would be two of those better angels. Education and cognitive skills development are the means to that end. 

 

Ian Browde

This conversation is highly complex and I am inspired by the input and learning a lot. 

An article by Jill Lepore, “Mission Impossible,” compares FB to Standard Oil back in the early 1900s. The Internet (read: the digital world) is more than media; it is a life extension, with young people increasingly becoming digital amphibians.

Lepore’s article supports a comparison between the polluters of the earth/physical world – fossil fuel companies, cigarette manufacturers, etc. – and the polluters of the digital world – online advertising companies like FB, porn companies, human traffickers, and now mis/disinformation distributors.


Craig Boice

The immediate problem is less about disinformation (which has always been with us) and more about changes in culture and technology that have combined to make disinformation more virulent, particularly in the United States.

Perception, Information, Belief

Since Plato, philosophy (i.e., epistemology) has noted that perception and reality differ. Socrates was executed for providing disinformation to young people.

Perception and the analysis of perception are fundamentally individual and personal. What each of us consciously recognizes as “information” differs. Religious, cultural, and scientific schemes of analyzing perception (i.e., belief systems) can be learned, and lead groups of individuals to become similar. The identity of these groups is often based on their conviction that they (and only they) have the ability to perceive reality as it is. Belief systems can be who we are.

Being Open-Minded Can be Risky 

The United States was founded by distinct, radical groups who elected to live together with one another. American society saw itself as diverse and dynamic, within limits. There was an opportunity for invention. Our society became open-ended. We generally accepted that new information might arise, justifying changes in beliefs.

This occasional tolerance did nothing to limit the consequences of holding misguided beliefs. As Will Rogers noted, “It ain’t what you don’t know that gets you into the most trouble. It’s what you know that ain’t so.” The United States had witch trials, genocide against indigenous people, lynching, and centuries of persecution of others based on skin color, sex, sexual preference, and place of birth. There was a Civil War. The Mormons were expelled from Illinois. Immigration was restricted. Despite all of these tragedies based on bad information and savage beliefs, American culture was still committed to the proposition that today’s information may be tomorrow’s disinformation. Americans believed in progress: information could become true. Innovation could triumph. A belief system could adapt. Our society developed a great appetite for new information and respect for iconoclasts and rugged individualists.

Belief systems are aspirational, and the American belief system had us always learning. “What have you learned since we last met?” Emerson asked. Even today, our open-ended belief system leaves us in a constant tension between what we seem to know, what we learn, and what has changed since we last looked. That tension is immensely healthy and well-aligned with evolution, but it also leaves us vulnerable. So long as we had our bearings (e.g., some form of religion, the pursuit of happiness, Manifest Destiny, fighting fascism, the American Dream, fighting communism), we had a reason to sort out perceptions, and regard information carefully. We were on a mission. But today, our belief system lacks a compass and adapts to the winds rather than the stars. 

Many of us are now “politically correct,” believing that as long as somebody is willing to face the legal consequences, they have a right to personal beliefs, actions based on those beliefs, and advocacy of those beliefs. We might label Chicken Little ignorant or misguided, but not dangerous, or evil. We’d describe that whole sky-is-falling panic as a “difference of opinion.” We’d let the courts sort it out. Furthermore, as the Trump presidency and the pandemic have demonstrated, we’ve taken another step. In many cases, we’ve abandoned standards of verification, because “who’s to say?” It’s so hard to sort out every claim. We would kind of prefer sound reasoning, decisive evidence, and consistent ties to values, but we don’t insist. So we now make little distinction between well-grounded beliefs, and beliefs that have no basis in reality. We substitute the apparent intensity of belief, and whether our online “friends” seem to believe the claim, for verification. 

We had already accepted that reality was dynamic and information changed. We liked “rugged individual” points of view. Then we started thinking that it was only fair to regard any observations as potentially as good as any other, even if they lacked verification. At that point, our belief system is no longer adapting to a dynamic reality but is instead adapting to some version of a media construct. We let it all in, and we weigh the perceptions emotionally. We have lost track of the difference between adapting to reality and living in our group’s fantasies.

Closed-Ended Wasn’t Great, but Closed-Minded is Worse

Meanwhile, the most prominent and authoritative closed-ended belief systems on the planet (e.g., Chinese communism, Islam) have continued to survive as they always have, by reinterpreting novel information within their own frameworks and indoctrinating each new generation of believers. Across the last few decades in the United States, as “education” has become custodial and experiential, and religion has become ceremonial, we have largely abandoned indoctrination. We still recognize that there is a reality that is what it is and will be unforgiving, but only STEM students and ambitious athletes are required to pay attention to it. Everyone else is urged to discover the “best version of themselves” without much guidance about what counts as best.

Now receive a gift from technology: the simple and rapid ability of individuals to establish and nurture ‘like-minded’ groups. No miracles or sacred texts are required, although celebrity texts can assume the role of fatwas. No sophisticated Russian desinformatsiya is required; deception and misdirection can come simply in the form of virtual “friends”. Technology now allows any belief system to reinforce itself through social preference, and the substitution of digital messages for perception. No reflective thinking is required. The belief system has created its own hall of mirrors to reflect messages over and over. Technology’s gift can invigorate a closed-ended belief system. But technology’s gift will be toxic to open-ended belief systems. Rugged individuals have few friends. The open-end is replaced by those mirrored reflections. “Cancel culture” emerges immediately, as a means of screening out discordant voices, including the voice of reality itself. Minds close.

So a belief system without a compass encounters a technical environment where it seems that if we just align ourselves with a herd, we no longer seem to need a compass. In fact, we can obtain herd immunity against new information. The beliefs of other people become our perceptions. Isn’t it remarkable how many different kinds of people seem to be thinking just what I was thinking? We are no longer independent nodes of consciousness and insight. It’s what Socrates meant when he said “the unexamined life is not worth living.” We have an open-ended, closed-minded belief system that is technically enabled. So many kinds of diversity, but not diversity in belief. Equity among all those who believe. Inclusion in the belief system is the basis of both recognition and belonging. Our society becomes AI in a distinctive, dangerous sense. 

What Happens Next?

It’s not good. Especially in the United States, where we have developed little resistance to misinformation.

Elsewhere, closed-ended belief systems cope with change and stress through periodic resets, not admitted to be such. These resets are often initiated by a reinterpretation of history, followed by exile, imprisonment, or execution of the misguided. Behavioral economics shows us how cults continue even in the face of complete failure of their initial predictions.

However, our open-ended closed-minded belief system may simply dissolve, as change and stress aren’t recognized. Disinformation may prove to be fatal, a cultural “wasting disease”. The truth is no longer differentiated from illusion. Everything might be true or might be false; it looks like a choice. But in fact, the choice is only to recognize reality, or not, when establishing beliefs. In a contemporary open-ended closed-minded belief system, recognizing reality may take a while, or never happen at all. Society may never get to that option. In the meantime, beliefs become shallow and dispensable. Ephemeral opinions storm through the system as fads.

Recognizing reality, recalling it, and planning for its consistencies comprise one of humanity’s most powerful skill sets. Disabled by the acceptance of disinformation as indistinguishable from information, these skills atrophy. Life gets a lot harder. Discipline, habit, aspiration seem to have no point. Into that confusion usually rides the leader of a closed-end belief system, who announces that prior perceptions have been flawed. Lost souls flock to the peculiarly compelling message. Citizens in practical difficulties sign up for what seems like a promising path. When gods and their ceremonies multiply, but ills continue, it’s time for a new pantheon.

Many of us have come to believe in climate change caused by human progress. We project dire consequences from our collective actions. We note how dependent human society is on maintaining the physical climate around us. Yet the information that swirls around us is even more vital than the weather for human survival, and, aided by technology, our information climate is changing faster than the weather.
