74 Comments
founding
Nov 30, 2022 · Liked by Asha Rangappa

Asha, great lesson! I like that you add an audio reading to the written story!

In response to your first question, I think the lack of broadband internet in many areas of America has allowed disinformation to flourish, because those areas are then served only by outlets like Fox "News," OAN, etc.

These networks provide the Ease of Entry that disinformation needs to take hold.

Fox continually platforms Christopher Rufo as an expert on Critical Race Theory (Nuggets of Truth).

And now they have Tulsi Gabbard in the fold, who, as a former "Democrat," speaks to the Low Level of Trust in the Democratic Party.

And finally, Repetition is Fox's forte. They hammer on emotional issues like CRT, grooming, and immigrant convoys over and over.

Piet must compliment your succinct summary of the first lesson. The Tulsi Gabbard reference was a brilliant example of the duplicity bound up in the disinformation mentality, indeed. Great comment! So exciting to have this extraordinary teacher with us. Did you see Asha on today's Mary Trump Show? (YouTube - Politicon channel)

Nov 30, 2022 · Liked by Asha Rangappa

One way that social media platforms have allowed disinformation to take hold is by letting users "curate" their information sources so that they hear only messages that support their world view. This allows bad actors to target sub-groups and create the "echo chamber" effect.

author

Yes, filter bubbles definitely allow "alternate realities" to feel true, and they also enable repetition (and the ability to see "likes," etc., makes it seem as though the information is widely accepted).

I guess the echo chamber effect is a form of repetition.

The various social media sites have made Ease of Entry virtually universal, and the tendency to share something that appears newsworthy makes Repetition a slam dunk. The video tells us how the widespread disinformation regarding AIDS raised suspicions about other US activities and policy statements. I would not be surprised to learn that the "stolen election" stories had an origin in Russia, for they are one of the main sources of division I see today. It is especially disturbing that the Republican Party appears to have decided to make accepting that narrative a requirement for its members of the House of Representatives going forward.

Nov 30, 2022 · edited Nov 30, 2022 · Liked by Asha Rangappa

USDA breeds purple cow to reduce methane belching and fight climate change. (Methane is a major driver of global warming, and cows are a major methane source.)

Now try not to think about the purple methane cow next time you see a cow in a commercial.

Creating disinformation is fairly easy. Getting it to spread is more difficult. Until the internet, that is. Years ago (circa 1983) you needed to get it into the media mainstream. That was like finding a host and sticking a needle into its arm. Not easy to do. Today, information spreads virally. Sneeze it on Twitter and it can expand exponentially in a short period.

Every so often I get the urge to create some disinformation as part of satirizing something. Here are the elements that usually make it effective. (Any released piece includes tip-offs that it is fictitious.)

Find a subject of concern and interest where opinions are strong and information is already mixed.

Take a position that is on the fringe of the discussion and then push it further outward.

Include names of “real” fictitious people (e.g., Prof. Earnest K. Hollingsworth).

Place them in “real” fictitious settings (e.g., the University of South Massachusetts).

Add some bogus quotes from fictitious experts, including government agencies.

If appropriate, make reference to bogus research findings and use numbers in them.

Raise questions whose answers seem certain to extend the falsehood, rather than stating the falsehood outright. (See also: Tucker Carlson.)

Include references to nonexistent publications and websites (this tips off a conscientious reader).

Use slight misspellings of real organizations, e.g., Federal Bureau of Instigation (a tip-off).

If you omit the tip-offs, you have genuine disinformation or misinformation.

Caution: This is not an endorsement of doing what I just outlined. Some people are exceptionally gullible and may not recognize this form of satire even when it is embedded with tip-offs.

Now, what color was that methane reducing cow?

Nov 30, 2022 · Liked by Asha Rangappa

(Q1): An important effect of technological change has been the elimination of gatekeepers. Formerly, to reach a wide audience, one had to submit to the judgment of editors of newspapers, magazines, publishing houses, academic journals, radio stations, or TV studios. This inhibited the craziest baloney, or channeled it into those fringe outlets that welcomed such stuff. But now everyone has equal access to the megaphone. That has had many wonderful effects. It has released unimagined wells of creativity, and news now travels with the speed of light, but it has also uncorked geysers of lies and propaganda, and at least on the internet, there are hardly any filters.

author

Definitely -- to the extent that the KGB wanted to get into a "mainstream" publication, it would have had to recruit or dupe a journalist...but as you note, now everyone is a "publisher." There's also been a flattening of authority due to the elimination of gatekeepers. Of course, in some contexts the elimination of gatekeepers can make the information space more democratic, by allowing suppressed voices to be heard and shared (think Arab Spring) -- but it certainly has a lot of negative externalities.

Dec 1, 2022 · Liked by Asha Rangappa

It's hard to think of a solution that doesn't do as much harm as good. The faint voice of the optimist in me hopes that, over time, there will be a Boy-Who-Cried-Wolf effect: Enough people will eventually learn who the liars are that democracy might survive. Case in point: election denialism seems to be losing steam.

One word, Dave: ROEVEMBER. The hens know who the cluckers are.

Gatekeepers were also removed from AM radio. In 1987, the FCC repealed the Fairness Doctrine, which had required radio programs to present both sides. In 1988, the Rush Limbaugh Show was launched. Right-wing talk radio (I listened to Dennis Prager and Larry Elder in LA while driving, just to check in on what they were saying) led the way to Fox News (1996), which paved the way to Donald Trump in 2016.

For any who are interested, there is more on that here: https://www.poynter.org/reporting-editing/2021/how-rush-limbaughs-rise-after-the-gutting-of-the-fairness-doctrine-led-to-todays-highly-partisan-media/.

I really appreciate this expanded comment on the role of “gatekeepers” mentioned by the OP and the linked article reviewing the regulatory abandonment of the “Fairness Doctrine” in 1987 that perhaps enabled the ascendant trajectory of conservative media we see today.

It would certainly seem a highly significant transitional event, though I feel the question and underlying circumstances merit more reflection and discussion – which hopefully we can continue as we move through the class material.

Agreed.

P.S.: My "Kurt" comment to you:

Multi-syllabic input thrives here, thanks to people like you. I am hoping that together, with Asha, WE = The Freedom Academy - LEARN TOGETHER. We have the chance to show the world what community -- instead of chaos -- means. Thank you so much Shakespeare Kurt. Your words matter.

My real talent is mono-syllabic grunts. ;)

But thank you for the kind words and yes ... to learn together is a journey well worth taking, I think, and even more so when sampling lots of chocolate and art along the way.

I promise to read this extended history of the Fairness Doctrine's role, especially in the "fly-over" states, where radio in general, and Sinclair & Salem broadcasting in particular, thrived. Sandy, thank you so much for the link and all your insights.

"Uncorked geysers of lies and propaganda." Not only is The Freedom Academy bringing zephyrs of wisdom ... but you, sir, are a wordsmith in the knowledge wind. Thank you, Dave.

Thank you, Di Louise! But I have you to thank for adding "Roevember" to my lexicon. It took me a minute to get it, but I like it. I hope it is an "ember" that continues to kindle unabated until the right to choose is reinstated nationwide.

Dec 2, 2022 · edited Dec 2, 2022 · Liked by Asha Rangappa

While I was watching the YouTube video on KGB disinformation and listening to your lecture, a disturbing reality hit me like a ton of bricks…

When ex-KGB Putin and Russian agents helped Trump "win" the 2016 election, they successfully planted a "disinformation infection" inside the White House, an infection that quickly spread to Congress and through the Republican Party at the state level, infected tens of millions of Republican voters, and now threatens our democracy as election deniers discredit elections whenever Republicans lose.

With this infection now spreading through Trump's "Truth Social" real-time disinformation platform, and set to propagate further through his 2024 candidacy, the clinking of champagne glasses must still echo inside the Kremlin SIX YEARS after Russian disinformation first infected America's highest office!!!

Dec 2, 2022 · Liked by Asha Rangappa

Sadly, yes! Yale historian Timothy Snyder's "The Road to Unfreedom" expertly explores Putin's rise to power, then his tactics to exploit German elections, Brexit, and the Ukrainian uprising and subsequent invasion of the Donbas... all while grooming Trump and planning interference in the 2016 US election. Chilling.

founding
Nov 30, 2022 · Liked by Asha Rangappa

Great first lesson, Asha! Looking forward to a robust discussion and, in particular, your thoughts on how to develop and promote successful counter narratives.

Nov 30, 2022 · Liked by Asha Rangappa

Whoa. I am out of 'academic' practice. I guess I've adapted to the fast and brief social media practices of information/disinformation conveyance. That really speaks to question one, doesn't it? The speed and availability of information.

As to question two, I would boil it down to the idea that division = weakness. Fostering division is the goal of disinformation operations, and U.S. interests are weakened globally as a result.

For question three, I'll note the obvious exaggerated differences between the right and left: Climate Change, Covid/Vaccinations, Ukraine, and Election Fraud.

This was fun. Thank You, Asha!

Divide and conquer - as old as the Ancient Greeks (Philip of Macedon) and Romans (Julius Caesar).

Dec 1, 2022 · Liked by Asha Rangappa

As other commenters have noted, modern media technologies – employing common rhetorical and psychologically-based communication techniques – seemingly blur the boundaries between various domains in which an information consumer receives, processes, and then acts upon messages that are designed to influence recipients’ attitudes and behaviors.

This boundary-blurring effect substantially compounds the challenge of seeing this phenomenon not simply in the conduct of adversary foreign state actors, but also elsewhere in our daily information feeds from many different sources. In this respect, manipulative and malign communication strategies are also central to domestic actors promoting surveillance capitalism, sectoral regulatory capture, hyper-partisanship, and the like, all of which are becoming more prevalent and consequential as threats to our constitutional system of ordered liberty.

One comparative parallel can be made to the domain of corporate advertising and communication, which has certainly mastered – and perhaps even developed – many of the identified methods and objectives for disinformation delivery. These techniques include selective elements of “truth,” inter-connected loops of sources and authorities, and reinforced messaging over time with persistent repetition and amplification. These tools of persuasion are used ubiquitously to alter community perception to gain competitive advantage in varying ways and degrees, though most perniciously as forms of denial, deflection, or distraction as we’ve seen in this week’s course material.

To illustrate the corporate communication domain parallel, the U.S. tobacco industry continually denied the carcinogenic properties of its products, as ostensibly supported by its own undisclosed paid-for “science” that industry leaders knew to be false. Similarly, many of the largest global fossil fuel energy companies have promoted climate-change denial by sowing false doubt based on industry-funded “science” producing results that were intentionally misleading or flatly contrary to what was known to be true. These disinformation strategies were undoubtedly pursued to gain significant advantage and carried little apparent cost or consequence sufficient to deter deceptive communication strategies in the future.

The right to freedom of expression further complicates the boundary-demarcation issue discussed here, and I look forward to thinking about it further as the class proceeds. So thank you very much, Professor, for creating this interesting content and providing a novel opportunity to explore these issues together.

author

I love all of these points. We'll be discussing propaganda next week, and I think your example of the tobacco industry's deliberate shaping of beliefs, attitudes, and perceptions around smoking is definitely an example of that on the corporate/advertising front (which, as with the climate change example, can also bleed into the political front).

I also agree...and I think the same can be said of the meat and dairy industries (it's ongoing).

Dec 2, 2022 · Liked by Asha Rangappa

Interesting first lesson. The historical development of the use of disinformation was particularly useful. The increasingly prevalent echo chamber effect that spreads disinformation around the world becomes apparent as different cultural and international subgroups get involved.

One example of this in recent political history is the oft-repeated tales of organized Antifa activities that have little or no basis in fact. These tales create a cultural wedge issue that generates friction among subgroups within American culture.

Dec 1, 2022 · Liked by Asha Rangappa

There are so many good comments and responses already - the speed of information travel today, access to so much more information via the internet, the ability to (probably unwittingly) create your own echo chamber, the elimination of gatekeepers, the creation of alternate science (the tobacco industry example), and more. All very thought-provoking.

The thing I will add is that there seems to be a loss of critical thinking ability here in the US, and maybe in other countries as well, but I don't know. There are many reasons for that, including, I think, the ability of all of us to "do our own research" on the internet (see the echo chamber problem above), the manipulation of social media platforms, which also tend to feed us reinforcement based on our own inputs, and the loss of meaningful public education. I think the last item is arguably the most important. I love quotes but can never remember who to attribute them to, and there's a quote that goes something like: in an autocracy you only have to educate one person; in an oligarchy you only have to educate a small group; but in a democracy you have to educate everyone. There are obvious frontal assaults on public education (Betsy freaking DeVos) and more subtle attacks: reduction in funding, groups targeting school board positions, loss of qualified teachers, etc. It seems to me that most folks in this country don't think critically about critical thinking. It is a skill that is being lost, and that makes us more vulnerable to skilled disinformation campaigns as masses of people fall into their hands.

My undergraduate degree was in biology, and we were required to take a course where we were taught how to read, analyze, and criticize published scientific research. We had to think critically about the questions researchers asked, the structure of their experiments, the quality of the data they gathered, and the way they analyzed that data, in order to decide for ourselves whether an article demonstrated good-quality science. These are skills, like general critical thinking, that I think have to be taught and practiced. We can read and see and hear so much on the internet now; how do you know how to decide what is true and what is not if you don't even know how to think about what you are consuming?

Dec 1, 2022 · Liked by Asha Rangappa

Great first lesson, Asha! First, technology today offers an incredibly low bar of entry for anyone, including those intending to spread disinformation, both from a platform/technology perspective and from a monetary perspective. What business does not have multiple online channels for PR, Sales, Marketing, etc.? Twitter, FB, LinkedIn, YouTube, AWS, Azure, Google Cloud, IBM, etc. are very easy to access (and abuse) and inexpensive to use. Last time I looked there were about 350+ vendors & 500+ cloud platform services available for contracting IT infrastructure & services. So a bad actor can create multiple accounts on multiple platforms that reference each other literally in minutes, and within days be distributing content with an appearance of legitimacy by incorporating some nuggets of truth in the larger disinformation content. Content on the internet generally remains there for lengthy periods of time, enabling many opportunities for reuse in future campaigns of disinformation, and social media facilitates easy and near-instantaneous repetition if a campaign goes viral.

Then of course there is the human factor: many if not most successful cyberattacks include some level of social engineering, and in the case of disinformation it is human vulnerabilities that are being exploited, whether an unconscious bias, an overt belief in stereotypes like antisemitism, racism, CRT, or replacement theory, or some fringe conspiracy like Pizzagate, stolen elections, absentee ballots constituting voter fraud, alien DNA or nanoprobes in vaccines, vaccines giving people Covid or the flu, or prominent Democrats and Jewish people running a worldwide satanic pedophile movement that murders & consumes babies. It is the human context that will determine the effectiveness of any given campaign of disinformation.

The thing that strikes me the most about disinformation is how HUGE the ROI is for adopting it as a means of political warfare. How many trillions of dollars has the US spent on technology & military for defense and how many millions has Russia spent on information warfare that is having stellar success around the world? Without firing a single bullet in “the real world”.

author

It's the poor man's warfare! Russia employs it precisely because it cannot compete militarily, economically, technologically -- and yet it is so effective because it is asymmetrical.

Dec 1, 2022 · Liked by Asha Rangappa

1. I think speed and ease of entry have increased exponentially, but by the same token that ease of entry means the vast sea of potential disinformation must compete for eyeballs. Of course, if you have the data file of the largest social media company and can home in on each user's likes and dislikes, you can simply micro-target through the noise with ever more effectiveness.

2. It of course undermines trust in us. It damages any moral authority and technical expertise that we could bring to bear on that public health crisis or any other collaborative undertaking in the world, and it also harmed our ability to address HIV effectively here in the U.S., with some groups not trusting government recommendations.

3. Exhibit A is election integrity. But we can look at CRT, vaccines, a new one around wind power, or an old one like death panels, or Obama's birth certificate.

Dec 2, 2022 · Liked by Asha Rangappa

Zuckerberg's Facebook has a lot to answer for on 'ease of entry'. I single out FB because it is the ideal platform for nefarious state actors and others. It is opaque, like a walled garden that outsiders can't look into. An insider/outsider dynamic is never good for an egalitarian society. Once inside, it offers tools like recommendations and group formation, just to name the obvious ones, that can be used to create exponential reach. Once FB got into the news and politics business, it was a perfect gift to state and non-state actors. It is one thing to have a group of your long-lost relatives and friends and talk about recipes, but including politics made it lethal, as the Cambridge Analytica case showed. I hold Zuckerberg responsible for a lot of the ills we are experiencing today, not just in the US but in developing countries too. The thing is that people like me saw this happening with FB when it was first introduced. The business model was broken from the get-go.

Dec 1, 2022 · edited Dec 1, 2022 · Liked by Asha Rangappa

Asha, do you know the concept of the Brain in a Vat (BIV)? Of course, the real use of that concept is to probe what reality is, as understood by your body and mind, within the boundary between 'I' and the universe. But to me, BIV makes more sense in our present day as a way of explaining what is happening to our reality: disembody the brain (of Trumpers, for example), put it in a vat, and feed it disinformation or what have you to deliberately distort its reality, usually for nefarious purposes, as is being done now. Calling it a cult is oversimplifying the situation, IMHO. It is not just one man (though there is one); it is a whole cottage industry of grifters and fraudsters, if you will, that is putting these people's brains in a vat and corrupting their reality.

Perhaps we give Florida Man too much credit. Not only are his followers and enablers willing to go along with his chaotic management style, but they are using him to achieve their own nefarious goals. Compulsive lying and bloviating is a mental health issue -- but what excuse is there for those who would prop that up as "greatness?"

Nov 30, 2022 · Liked by Asha Rangappa

Hi Asha, regarding the first question: social media has made Ease of Entry much easier, pun intended. Then you have organized repetition from like-minded individuals on the same platforms, which then gets picked up by mainstream media. A micro example of this came earlier this week, when Elon Musk (I know you block him) started dropping tweets claiming Apple was against free speech because it reduced its advertising spend. The claim was then pushed by Twitter followers and influencers, and a few hours later I saw the "fight" between Musk and Apple being discussed on CNBC. Whether or not this was misinformation, the ability to get the message out and have it repeated so rapidly was very powerful, and it became the truth for many.
