Deep Dive with Shawn C. Fettig
Welcome to Deep Dive, the podcast where politics, history, and queer lives intersect in engaging, in-depth conversations. I'm Dr. Shawn C. Fettig, a political scientist, and I've crafted this show to go beyond the headlines, diving into the heart of critical issues with authors, researchers, activists, and politicians. Forget surface-level analysis; we're here for the real stories, the hidden layers, and the nuanced discussions that matter.
Join me as we explore the intricate world of governance, democracy, and the challenges facing the LGBTQ+ community. Expect empathy, unique perspectives, and thought-provoking dialogue—no punditry, just genuine insights.
Ready to dive in? Catch us on your favorite podcast platform, and don't forget to follow the conversation:
- Instagram & Threads: @deepdivewithshawn
- YouTube: https://www.youtube.com/channel/UCjZ9grY02HMCUR34qaWhNmQ
Got thoughts? Questions? We'd love to hear from you! Drop us a line at deepdivewithshawn@gmail.com.
"Deep Dive" - Because the most important conversations happen below the surface.
After America E6: Lie Hard - How Authoritarians Weaponize Bad Information
How does disinformation threaten the very fabric of our democracy? In this episode of After America, we explore this pressing issue. We examine the intentionality behind disinformation campaigns and the alarming findings from the Pew Research Center about the spread of fabricated news during the 2020 election. We draw the distinctions between misinformation, disinformation, and rumors, highlighting how these falsehoods not only strain interpersonal relationships but also hinder policy agreement. And we take a close look at the controversial House Committee on the Weaponization of Government, led by Representative Jim Jordan, and its contentious focus on alleged suppression of conservative voices.
We uncover the complexities of how disinformation erodes public trust and democratic institutions, and discuss how social media amplifies conflicting realities through echo chambers and algorithms. From the emotional triggers of false content to the radicalization of vulnerable individuals, this episode leaves no stone unturned.
The threats to American democracy are more significant than ever, and misinformation plays a crucial role. We’ll explore how certain political actions, despite claims of protecting free speech, actually stifle it, further eroding public trust. And, we discuss the long-term democratic implications, including increased polarization and the potential rise of authoritarianism.
Guests: Dr. Kate Starbird, Dr. Alice Marwick, and Dr. Tom Ginsburg
Credits:
Infados - Kevin MacLeod
Dark Tales: Music by Rahul Bhardwaj from Pixabay
-------------------------
Follow Deep Dive:
Instagram & Threads: @deepdivewithshawn
YouTube: https://www.youtube.com/channel/UCjZ9grY02HMCUR34qaWhNmQ
Email: deepdivewithshawn@gmail.com
Music:
Majestic Earth - Joystock
Pew Research Center has reported that 72% of people surveyed saw at least some news about the 2020 election that seemed completely made up. 60% thought that misinformation and disinformation had a major impact on the outcome of the election, and 23% admitted to sharing fabricated political stories themselves, sometimes intentionally. Multiple studies have shown that Republicans are more likely than Democrats to both believe and share fake news articles and information. Misinformation, disinformation, and unfounded rumors erode the foundation of democracy by shaking public trust in institutions, distorting the shared reality necessary for informed civic participation, and polarizing the electorate. This erosion of truth is a critical ingredient in the rise of authoritarianism: as citizens become unable to distinguish fact from fiction, they become susceptible to manipulation by those who would subvert democratic norms for their own gain. Given this, it would seem that government interest in uncovering how and where misinformation and disinformation start and how bad information spreads, and in implementing initiatives to bolster fact over fiction in our politics and political discourse, would be a step in the right direction. But when Congress acted in January of 2023, supposedly to investigate just these things, not one Democrat in the House voted for the creation of the Committee on the Weaponization of Government, and the name kind of gives away the reason. Instead of focusing on the threat that misinformation, disinformation, and rumors pose to democracy, Republicans in the House, with Representative Jim Jordan of Ohio leading the charge, formed this committee to expose how federal agencies have supposedly targeted conservatives and suppressed right-wing voices. The panel's tactics have drawn comparisons to the darkest days of McCarthyism. The irony is that the committee has become a Trojan horse, doing exactly the thing it purports to combat: spreading disinformation and rumors. It's not protecting democracy. In fact, it's taking up the mantle of those seeking to destroy democracy and bringing the threat directly into the heart of our government.
Shawn:Welcome to After America. I'm your host, Shawn C. Fettig. Find, follow, and like Deep Dive with Shawn C. Fettig on your favorite podcast platform and on YouTube, and check back every Sunday through September for new episodes of After America as we examine the precarious state of American democracy, how we got here, and where we might be headed. The clock is ticking. Democracy is at a crossroads, and the time to act is now. In the age of information, our democracy faces an unprecedented challenge. The rapid spread of bad information has created a landscape where truth is increasingly difficult to discern and the foundations of our democratic institutions are under siege. At the heart of this crisis lies the distinction between misinformation, disinformation, and rumors. Misinformation is false or inaccurate information spread without malicious intent, often due to honest mistakes. Disinformation, on the other hand, is deliberately false or misleading information, created and disseminated with the intention to deceive or manipulate. Dr. Kate Starbird, associate professor at the University of Washington, a leading expert in the field of misinformation and disinformation, and co-founder of the Center for an Informed Public, explains.
Dr. Starbird:So misinformation is information that's false, but not necessarily intentionally false, and it's also a term we use as an umbrella term for a lot of different kinds of misleading information. So you'll see people use the term misinformation both in the specific sense, unintentionally false information, and as an umbrella term. So disinformation is information that's false or misleading, that is intentionally seeded or spread for a specific objective, so sort of a financial goal or a political goal. We see disinformation often spread for reputational gains. But the core difference is actually intentionality, and so misinformation is not necessarily intentional and disinformation is intentional. However, I do think it gets much more complicated.
Dr. Starbird:I tend to see disinformation this way: you can't really look at a single piece of content and understand if it's disinformation, because disinformation really functions as a campaign. It functions as a series of information actions that sort of work together to create a distorted perspective on reality.
Dr. Starbird:Again, it's not typically just one piece of content, it's several things kind of put together. As well, disinformation isn't always wholly false. It's usually based on a true or plausible core and then sort of wrapped in layers of distortions and exaggerations. And then the last piece, I think, in my definition of disinformation, which I'm kind of relying heavily on a guy named Lawrence Bittman for, the last piece is that we talk about intentionality being the core piece, except when you look at disinformation campaigns, a lot of the time, in fact most of the individuals spreading disinformation in the stuff that I study, are sincere believers of the content, or those sort of willing spreaders who just don't bother to even think about whether or not it's true. The veracity isn't the point. It's seeded or amplified by someone for strategic means, but a lot of the people that become part of it may be doing it for other reasons.
Shawn:And then there are rumors. These occupy a gray area, representing unverified claims that circulate widely, potentially originating from either misinformation or disinformation. Here's Dr. Starbird again.
Dr. Starbird:We've been focusing a lot on the term rumor, and the reason we use rumor: first, it's got a long history of research going back into the 1930s and 40s that we can draw on. And second, rumors can actually turn out to be true and they can turn out to be false. What a rumor is is uncertain information that's spreading, often through informal channels, and so we like that term at this point because it allows us to start working on something without yet knowing exactly how things are going to turn out. While there's still some uncertainty, we don't have to determine intent either, right? So with rumor, you don't have to worry about veracity or intent yet; you can start to work on something. Okay, this isn't verified, it's unsubstantiated, there's a lot of uncertainty here, let's start studying something. We also find that the term rumor keeps the conversation going, in the sense that it doesn't shut down a conversation. You don't put a big misinformation label on something; you say, okay, let's have a dialogue about what's going on.
Dr. Starbird:And then there are important kinds of things, especially in the election space that we've been looking at, where a rumor, even when it's false, usually has a signal about something that people are confused about or concerned about, right?
Dr. Starbird:So we've been doing a lot of research around rumors about election processes and procedures, so when and where to vote, but also the legitimacy of elections. And in that case, when you see a lot of rumors, it's actually because people don't really understand the process and how it works.
Dr. Starbird:They don't understand the layers of security. They don't understand that, you know, when a ballot is stolen out of a mailbox because somebody's stealing mail, there are actually layers of things that are going to help that person get their ballot back. And I think treating a lot of these things as rumors, especially when it's from the sincere believers, or the sort of people who believe it's true enough and aren't motivated to try to distort things, that approaching things as rumors can actually be a little bit more constructive. So we've been really thinking about rumors on one hand, and then disinformation campaigns on the other, where people are intentionally trying to manipulate people, whether they're trying to seed rumors or amplify rumors or deceive people in other ways and manipulate systems.
Shawn:The impact of this information disorder on our democracy is profound. According to Pew Research Center, 64% of American adults believe fake news stories cause a great deal of confusion about basic facts of current issues and events. This degradation of a shared reality has a real and negative impact on our democratic discourse, civic participation, and trust in legitimate sources of information, including government agencies, scientific institutions, and credible media outlets. This then has profound implications for our decision-making. When we can't agree on basic facts, how can we engage in meaningful debate or make informed choices at the ballot box? Dr. Starbird describes this.
Dr. Starbird:So, in democracy, we as citizens, as voters, we self-govern in some ways, right? We choose who represents us and we're part of that process. And because the public is part of the process, if the public is working with good information, they can make choices that are going to align with their hopes and their goals and what they think will create a better world for themselves. If they're not informed, if they're misinformed, disinformed, or just don't have the information that they need, they're not able to make the decisions that they need to make to create the country they want, even at the community level as well. On the flip side, if you don't have a shared reality in democracy, and let me even back up there: we can disagree about things and be a strong democracy. Everybody in a democracy should have different views of the world. We have different experiences, and we need to kind of bring those together to make decisions. But there are certain things that democratic citizens have to agree upon in order to function, and that's not a sentiment about a particular political issue. There really has to be a common ground in trusting that the outcomes are the outcomes, in trusting that the vote leads to an elected leader and that elected leader takes power. You have to have a shared common reality, some kind of common ground to stand upon together, in order to have a functioning democracy.
Dr. Starbird:If you can't come together to even have conversations, if you can't agree on what the rules are, if you can't agree that the outcomes are the outcomes, then a democratic society really can't function. And so it's not that everyone has to believe exactly the same things. We're all going to have our own experiences, our own opinions, but there are these core pieces of a shared reality that we have to have in order to govern ourselves. And so that's where having an informed public is so important. And really it's about things like: do we trust the outcomes? Do we understand how election processes work in order to have trust in the outcomes? Same thing with legal processes: do we understand how these processes work enough to trust the outcomes? And if we don't, then these institutions fail, right? Because a lot of what it means to be in a society is to acquiesce to the outcomes, because we've decided that that's how the law, or elections, or whatever, work.
Shawn:The spread of false narratives also fuels political polarization, driving wedges between different segments of society and hampering constructive dialogue. This polarization can lead to increased social tension, decreased willingness to compromise, and a weakening of democratic norms. The persistence of misinformation can also lead to voter suppression and manipulation. False claims about election integrity or voting processes can discourage participation and foster cynicism and skepticism about democratic systems. This all creates an environment ripe for exploitation by those seeking to subvert democratic processes for personal or political reasons, for authoritarians to take and wield oppressive and destabilizing power. One of the obvious culprits in the origin and proliferation of bad information is, maybe ironically, the democratization of content creation and distribution. Dr. Alice Marwick, associate professor at the University of North Carolina at Chapel Hill and a renowned expert on the social and cultural impact of digital media, explains.
Dr. Marwick:I think our fatal flaw when we imagined what the internet could be was assuming that only people and groups with liberal or democratic beliefs would use these tools to organize and coordinate. Right? Like the Arab Spring or Black Lives Matter; these are great social movements that have had really positive ramifications. But the thing is, there's nothing particularly normatively good about these tools. Groups like neo-Nazis or white nationalists or the far right are just as good at using social media, and they're just as effective. There's nothing intrinsically good about online participation, and because we assumed that social media would be used by good actors for causes that we agreed with, its designers didn't really game out the negative use case when they were designing it. For example, let's look at Twitter. It wasn't designed to prevent harassment, because the creators of Twitter never thought that it would be used for harassment. But as a result, it ended up being this giant vector for harassment, to the point where it took Twitter years and years to put even the most basic safeguards into place.
Shawn:Social media platforms have become the primary battleground in this information war. These digital spaces, designed for connection and sharing, have inadvertently become powerful vectors for the spread of false information. Unlike traditional media, where gatekeepers and fact-checkers operate, social media allows anyone with an account to become a publisher. With just a few clicks, a piece of information, true or false, can be shared with hundreds, thousands, even millions of people. This ease of sharing is a double-edged sword. Sure, it's a valuable way to spread critical and important information, such as emergency alerts and breaking news, but it also allows for the trafficking of dangerous falsehoods. Dr. Marwick describes this dynamic.
Dr. Marwick:I think, in terms of democracy, there's this assumption that, as a citizenry, we need to be able to make informed decisions, and in this model you have this ideal citizen, someone who votes and chooses a candidate based on their policy positions. I don't think that's actually how most people approach voting. I think political affiliation is as much based on identity, or even more based on identity, than it is on policy. And, of course, there's a lot of people who don't vote, and even more people who don't pay attention to news or politics. But if you don't have access to reliable information, then you can't make informed decisions. And I don't think that people who voted for Trump, for example, were misinformed. I think it was pretty clear who he was.
Dr. Marwick:But I do think disinformation has impacts on democracy. I think, first of all, it erodes institutional trust, because a lot of disinformation is about institutions. So, if you think about COVID disinformation, a lot of it was about undermining the CDC, undermining doctors, undermining scientists. Or a lot of disinformation is about government conspiracies or elites doing evil things. But disinformation also undermines our trust in institutional knowledge, because people don't believe information that's put out by academics or journalists or scientists. And trust in institutions, I think, is really important for democracy. And second, disinformation creates this epistemic multiplicity, which means basically that people believe wildly different things. It's not just about a disagreement; it's about a complete conflict in what constitutes ground truth, things like the moon landing was faked or migrants are criminals. When you don't have a shared factual basis in a population, it makes it impossible to agree on good policy, but it also affects interpersonal relationships.
Dr. Marwick:So many people have had to cut off family members because of conspiracy theories like QAnon that this is a really well-known phenomenon at this point. It's being studied in psychology and social work, and I think you can tie that directly back to disinformation.
Shawn:The lack of robust oversight on these platforms exacerbates the problem. While traditional media outlets have established fact-checking processes, social media platforms often struggle to implement effective mechanisms to verify the accuracy of user-generated content. By the time a piece of misinformation is flagged or debunked, it may have already spread across the globe, leaving lasting impressions that are difficult to correct once they've set in. Social media algorithms, designed to keep users engaged, inadvertently contribute to the problem by creating echo chambers. These algorithms analyze user behavior and preferences, serving up content that aligns with existing beliefs and interests. While this personalization can enhance user experience, it also limits exposure to diverse viewpoints and can reinforce misconceptions.
Shawn:Users find themselves in digital bubbles, surrounded by like-minded individuals and information that confirms their existing beliefs, making it harder to encounter and accept contradictory facts. Worse, these algorithms aren't perfect. Sometimes they read you wrong and take you into rabbit holes you never meant to stumble into. Maybe you're smart enough to climb back out. But think about children, mentally and emotionally compromised folks, or otherwise vulnerable people. This is Dr. Marwick again.
Dr. Marwick:I think people are much more likely to believe information that plays into their preexisting biases and beliefs, right?
Dr. Marwick:One of the things in studying so-called online radicalization is that, you know, it's very rare for someone to have a complete shift in belief, right? To go from being like a hardcore social justice liberal to QAnon. It's much more common for, say, a middle-aged Republican to end up in QAnon, and that's because a lot of the messaging in some of these movements isn't intrinsically contradictory to mainstream right-wing thought.
Dr. Marwick:So if you see a piece of information online and it kind of bolsters all your beliefs and the way that you see the world, you're much more likely to spread it, you're more likely to believe it, and it's not going to trigger you to get upset or feel moral outrage or anything like that. You know, I am a disinformation researcher and I have shared disinformation. I shared a tweet from Mother Jones a couple of years ago that was totally false. The tweet said something like the former slave-owning states are most likely to have private prisons, and I was like, oh, that sucks, I'm going to retweet that. And it wasn't true. But it played into my own presuppositions and political beliefs, so I was much less likely to be critical of it than I was of something that might contradict my beliefs. So I think that explains why people might be more likely to believe fringe information than institutional information, if it reinforces what they already think.
Shawn:Bad information spreads so quickly and so unchecked for a handful of reasons. An almost defining feature of false or misleading content is its emotional nature, often tapping into strong negative emotions like fear, anger, or outrage, which adds fuel to the fire. This is important because, as mountains of research have found, negative emotions are much more likely than positive emotions to trigger action and response in people. These negative emotional triggers make such content more likely to be shared, as users react instinctively rather than critically evaluating the information. This emotional contagion contributes to the rapid spread of misinformation across social networks. Far-right extremists, especially, have learned how to manipulate these social media ecosystems both to recruit and to play on emotional reactions rooted in racism, homophobia, misogyny, xenophobia, and nationalism. Dr. Marwick explains.
Dr. Marwick:Radical far-right groups or white nationalists don't need to reach everybody, because you or I, if we were to stumble on far-right content, we would probably find it distasteful or terrible and we would just turn away from it. What they're looking for is that one person in a thousand who might agree with their ideals, and so a lot of what they do is starting with sort of gateway ideas that are more palatable. They're usually looking specifically for young people, because young people are often searching for identity and they're more likely to have the time to devote to political activism online, and young men are a really key recruitment group for basically any extremist group. And so what you'll have is people putting out messaging that is anti-feminist or anti-trans or anti-Muslim or anti-immigrant. Now, I don't agree with any of those beliefs, obviously, but they are politically palatable. They fall into the general mainstream of larger American political discourse. And if you're some young man, you know, some 19-year-old kid, and you really dislike feminism and you're looking for anti-feminist information online, you are more likely to come across far-right information, because far-right communities strategically use these types of gateway beliefs to draw people in.
Dr. Marwick:In 2017, I did this project where we were collecting memes from this site called the Daily Stormer. It was a white supremacist site designed to appeal to millennials, and they had this regular feature called Memetic Monday where people would create memes for outreach of anti-Semitic and white supremacist messaging, and they would design them for particular communities. So if you think about trad wife content, for example, that's targeted at young women, but it has these white nationalist beliefs and this sense of white European aesthetic and superiority baked into it. The other thing the far right has done is they've been able to portray far-right beliefs as sort of edgy and counterculture, almost kind of punk rock, right? Like, if everyone in your life is saying diversity is good and Black Lives Matter, then you coming out as somebody who doesn't believe those things, I guess in some cases you could say that is counterculture, right? Except for the fact that racist, sexist ideas are still very normative in American culture, and certainly in American history.
Dr. Marwick:Another thing that the far right does is they tend to create narratives that tie into deeper narratives. So we just did a project on the moral panic around Drag Queen Story Hour, and we compared the rhetoric around Drag Queen Story Hour to previous moral panics around, mostly, gay people. These are historical, so they're not using terms like LGBTQ. But we looked at, for example, Anita Bryant's 1977 Save Our Children campaign in Miami, which was campaigning against one of the first city ordinances to try to outlaw discrimination against homosexuals in housing and employment, and the rhetoric is virtually identical to the moral panic around Drag Queen Story Hour. So when you have these campaigns that tie into very deep-seated ideas that never really went away, they can be activated really quickly, and they can get people invested, because they play into their deep-seated prejudices and ideas.
Shawn:Confirmation bias plays a significant role in this process. People are naturally inclined to believe information that aligns with their existing views and to dismiss contradictory evidence. On social media, this tendency is amplified. Users are more likely to share content that confirms their beliefs, regardless of its accuracy, further propagating misinformation within their networks. The algorithms that power social media platforms also contribute to the spread of false information through what's known as algorithmic amplification. These systems are designed to prioritize engaging content, often favoring posts that generate high levels of interaction. Unfortunately, sensational or controversial content, which often includes misinformation, tends to be highly engaging, leading algorithms to promote it more widely. This results in false information receiving disproportionate visibility and reach.
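To make that amplification dynamic concrete, here is a minimal, purely illustrative Python sketch of an engagement-ranked feed. Everything in it, the posts, the weights, and the scoring function, is invented for illustration; no platform's real ranking system is this simple, and real systems learn their weights from user-behavior data rather than hard-coding them.

```python
# Toy sketch (not any platform's actual algorithm): rank a feed by a
# simple engagement score and watch the sensational post rise to the top.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int
    shares: int
    comments: int

def engagement_score(p: Post) -> float:
    # Hypothetical weights: shares and comments signal stronger reactions
    # than clicks, so they count for more here. These numbers are made up.
    return 1.0 * p.clicks + 3.0 * p.shares + 2.0 * p.comments

feed = [
    Post("Local council passes annual budget", clicks=120, shares=4, comments=9),
    Post("SHOCKING 'proof' the election was rigged!", clicks=90, shares=160, comments=210),
]

# Order the feed highest-score first, as an engagement-optimizing ranker would.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):6.0f}  {post.text}")

# The outrage post scores 990 versus 150 for the mundane one, so it is shown
# first and to more users, which earns it still more engagement: the feedback
# loop behind algorithmic amplification.
```

Because sensational posts reliably generate more shares and comments, any ranker that optimizes raw engagement will tend to surface them, which is the disproportionate visibility described above.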
Shawn:Another problem is the speed at which information spreads on social media. In the race for clicks and shares, misinformation can circulate rapidly across networks before fact-checkers have a chance to respond. Attempts at correcting or debunking false information are always playing defense and can never keep up, leaving many users exposed to false information without ever seeing a correction or clarification. Psychological factors also play a role in the spread of misinformation on social media. Sharing content can be a way for users to signal group belonging or gain social rewards in the form of likes and comments and shares. Some users may share false information simply because it's entertaining or aligns with their worldview, without taking the time to verify its accuracy. The desire for social validation can override critical thinking, leading to the inadvertent spread of misinformation. Finally, the phenomenon of super spreaders amplifies the impact of false information on social media. A small number of highly influential accounts, whether belonging to celebrities, politicians, or other public figures, can significantly boost the reach of misinformation. When these accounts share false or misleading content, their large follower base can quickly disseminate it to millions, giving it an air of credibility due to the perceived authority of the source. The ease of sharing, lack of oversight, echo chambers, emotional triggers, confirmation bias, algorithmic amplification, speed of spread, psychological motivations, and the influence of super spreaders all work together to create a perfect storm in which false information can thrive.
Shawn:The Big Lie, the false claim that the 2020 US presidential election was stolen from Donald Trump, is a great example of an ecosystem in which almost all of these components existed and in which misinformation, disinformation, and rumors spread rapidly. Social media's role in propagating the Big Lie was multifaceted. Platforms like Facebook, Twitter, and YouTube became conduits for false narratives, allowing unsubstantiated claims of voter fraud to reach millions of users within hours. The ease of sharing on these platforms, combined with algorithmic amplification that prioritizes engaging and often controversial content, not only allowed for the spread of election misinformation but actually created the environment in which it was possible. A key factor in the spread of the Big Lie was the emergence of a new class of social media influencers who gained significant followings by promoting election fraud claims. Kyle Becker, a former Fox News producer, is an example, rampantly and relentlessly pushing false narratives about the election that spread rapidly to a large audience, which then redistributed that information, creating echo chambers where the Big Lie was continuously reinforced and rarely challenged.
Shawn:The emotional nature of the content played a crucial role. Posts alleging fraud often tapped into feelings of anger and fear and betrayal among Trump supporters, making them more likely to be shared and believed. This emotional contagion helped the false narrative spread beyond its initial audience, reaching more moderate voters and sowing doubt about the election's integrity. The persistence of these false claims on social media platforms, despite some industry efforts at content moderation, led to a significant erosion of public trust in the electoral process. Surveys showed a sharp decline in confidence in the US election system during this time, with only 20% of respondents feeling very confident in its integrity. The January 6th insurrection illustrated how social media-fueled misinformation campaigns can extend beyond the digital realm, influencing real-world events and shaping political discourse, with deadly consequences. And the social media industry hasn't moved in meaningful ways to police itself and address the problem. Dr. Marwick explains why.
Dr. Marwick:I think social media has made two big mistakes. The first is that they consistently prioritize profit over user experience, to the detriment of everybody. This is sort of what Cory Doctorow calls the enshittification of the internet. And when you prioritize profit and growth over everything, it means that you prioritize things like controversial content, because it means that people get on the internet and they argue, and that gets you page views, and that's good. And the second thing is that they still aren't building products with bad actors in mind.
Dr. Marwick:Trust and safety at social platforms has gone from this, you know, almost forgotten, overlooked community-manager type of role to something pretty institutionalized. And I think most people who work in trust and safety at social platforms are very well intentioned and are working very hard to try to keep their platforms safe, but they're often under-resourced and they're often deprioritized. You know, we know that a lot of big tech platforms laid off a lot of trust and safety people, because they're not a growth area, right? They're not increasing profit for the product. In fact, in many cases, they probably are decreasing profit for the product. And so when you have social platforms whose priority is pretty much solely profit and nothing else, that means that you end up strategically overlooking harmful acts because addressing them might decrease the amount of money that you're making, and I think that that has been a huge failure.
Dr. Marwick:But I would also say I think the fact that these social platforms have run virtually unchecked is a failure of government.
Dr. Marwick:We have a complete lack of regulatory safeguards for technology. If you think about the lack of privacy, that's obviously the biggest one but also the fact that there's just basically no checks on what social platforms can do. There's very little limitations to their power, their actions. Partly this is because government just generally isn't very educated on technology, but partly, I think, it's just been this idea that, well, tech is a bright spot in the American economy and it's making money, and until fairly recently, there was this real lack of desire to regulate tech because it was so profitable. I think we're in a moment where tech regulation is probably more possible than it has been before, and what I find really disconcerting is that, rather than focusing on things like a comprehensive privacy law, instead we're focusing on like young people's social use of social media, which I think is completely the wrong tactic to take the fact that your video store records are protected, even though there's no video stores anymore, but your emails are not is a complete failure of government policy.
Shawn:And so this ecosystem hasn't gone away. It hasn't even been checked. In fact, in this election year, the systems, and the bad actors exploiting them and their vulnerabilities, are more sophisticated and more organized than they were in 2020. Look what happened then, and imagine the havoc they can wreak this election season and beyond. It is against this backdrop, the wild west of content moderation on social media platforms that allows the rapid and unchecked spread of bad information to proliferate, that another complicating and deliberate prong exists, exploiting, in fact contributing to, the chaos and distrust that this social media ecosystem creates. And that's the Republican Party. It not only has fully embraced bad information as a tool to subvert elections and chip away at public trust in institutions, but it has built a complex and concerted apparatus aimed at silencing, even criminalizing, people, organizations, and initiatives designed to generate and spread facts, actual facts.
Shawn:Republican officials have been weaponizing government power to advance partisan agendas, target perceived opponents, and stifle anyone who promotes truth. A prime example is the creation of the House Committee on the Weaponization of Government, which is ironically engaging in the very behavior it claims to investigate. The committee has broad authority to subpoena law enforcement and national security agencies, even regarding ongoing criminal investigations, and it has used this authority to harass federal agencies, intimidate researchers, and interfere with ongoing investigations, raising concerns about potential interference with legitimate investigations and the politicization of law enforcement. Democrats have labeled it the tinfoil hat committee because it's been used to promote conspiracy theories rather than conduct serious oversight. Accusations of harassment and intimidation have plagued the committee's work from its inception. One example is the committee's approach to whistleblowers, who've been used by Republicans on the committee to generate misinformation and confusion about investigations into Donald Trump and other Republicans. A report by subcommittee Democrats notes that some of the so-called whistleblowers did not meet the proper definition and had engaged in partisan conduct that called their credibility into question. There were also claims that these individuals received financial support from Trump loyalists, raising questions about the authenticity of their testimony.
Shawn:The committee has extended its scrutiny beyond traditional government agencies to include social media companies and their content moderation practices. One significant example is the committee's investigation into the Biden White House's alleged coercion of big tech companies like Facebook, Google, and Amazon to change their content moderation policies. According to the committee's report, the administration's engagement led these companies to suppress content that included true information, satire, and criticisms of the Biden administration. Recently, though, the Supreme Court ruled in Murthy v. Missouri that there was insufficient evidence that the committee's characterization of the facts is accurate, and that the administration could continue engaging with social media companies on issues related to misinformation.
Shawn:The committee has also targeted researchers of misinformation, disinformation and rumors and their impacts on elections and democracy, particularly those funded by the National Science Foundation, by accusing them of supporting online censorship. The committee has overwhelmed universities and other research organizations with subpoenas and demands for information, aggressively targeted and grilled researchers and low-level staff of research organizations, and issued reports that accused these researchers of being part of efforts to censor and silence right-wing voices. The committee's interim staff report from February 2024 alleged that NSF-funded researchers were complicit in online censorship activities, which has led to a climate of intimidation for the researchers involved, with some facing harassment and death threats from online mobs. These actions by the committee have raised significant concerns about the politicization of scientific research and the potential chilling effect on academic freedom and the pursuit of knowledge in areas critical to understanding and mitigating the impacts of misinformation and disinformation.
Shawn:Parallel to these developments, and emboldened by the committee, state legislatures have initiated a wave of attacks on higher education and academic freedom, often justified as efforts to combat perceived liberal bias and promote viewpoint diversity. State legislatures, particularly those controlled by Republicans, have introduced and passed bills targeting diversity programs and tenure systems in universities. For example, in Florida, legislation has been enacted that weakens tenure protections, allowing for post-tenure review and potential dismissal of faculty members deemed unsatisfactory. Texas has passed laws prohibiting diversity statements in hiring and restricting campus diversity training, both infringements on academic freedom and attempts to control the content of higher education. This is Dr. Tom Ginsburg, a distinguished professor of law at the University of Chicago and expert on issues of democracy, constitutionalism, and legal reform around the world, explaining the impact of legislation and tactics such as these.
Dr. Ginsburg:The same thing that is ailing us is ailing other societies, and it's the cause of polarization, many people think, which is the extensive development of social media, where, you know, now we have voters whose whole senses of selves were developed in a social media era. And in the social media era, you can like things really, really easily. There's a lot of virtue signaling going on. There's a lot of disinformation and misinformation, and this is obviously going to accelerate with the AI era. And so I think the vision of democracy that we learned in civics class, which is, you know, you have sort of nonpartisan, neutral information and then political parties disagree about the policies, about what to do with that objective knowledge, you know, that's long gone. Now the parties are creating their own knowledge. They're creating their own voters, in some sense, using these tools. So I think that's a huge challenge. The traditional image in some sense is inverted, and, you know, knowledge institutions like universities are thus much more important than they ever were.
Shawn:The impact of these legislative efforts extends beyond administrative changes and, much like the actions of the House Committee on the Weaponization of Government, creates a chilling effect on research and teaching, particularly in areas related to race, gender, and social justice. Professors may self-censor to avoid controversy or potential repercussions, leading to a narrowing of academic discourse and a potential decline in the quality and breadth of education. This is Dr. Starbird describing this.
Dr. Starbird:Well, we've certainly seen a chilling effect, in the sense that some of my colleagues have either lost their jobs or been nervous about doing some of the same kinds of work that we've done, based on a very strategic and very misleading framing of work to understand rumors, to understand disinformation, and to try to create healthier information environments, a strategic framing of that work as censorship by the folks who benefit most from these platforms being exploited to spread harmful content. My mood on that is actually very good, because on Wednesday the Supreme Court came out with the Murthy v. Missouri ruling, and one of our projects was actually mentioned in the hearing and then in the decision. It was a very good morning for us, because the Supreme Court essentially said that the factual claims around whether censorship was happening in these cases, cases in which they've accused me and some of my colleagues of censorship, were lacking, that the people who were claiming they were censored haven't proved that that has happened. And in fact, as a person who's been accused of being part of this, that's just not what was going on there. So we're feeling very uplifted by that outcome. We're facing a parallel lawsuit from the same plaintiffs, in the same Louisiana court in the Fifth Circuit that the case that ended up at the Supreme Court went through, and we're really hopeful that that's going to get thrown out. But we've actually seen a concerted effort, whether it's online harassment, congressional interviews and document requests, I've received 40-plus public records requests, and these lawsuits, just a multi-pronged effort to try to attack research into online disinformation and online rumors, with the goal, I imagine, of getting us to stop doing this research, the same way that climate change scientists have been attacked, the same way that there was an effort to get people to stop studying the impacts of gun violence.
Dr. Starbird:I think we're just the next iteration of this, and it has taken a toll on some of my colleagues around the country. I've been really fortunate to be at the University of Washington, where we've had great support internally throughout the university, which has given us the resources to fight it legally. We've had communication support in trying to navigate this. I've been really, really fortunate, and I have been trying to be a little bit of a lightning rod for my team. So our work continues. We haven't been slowed down, except for probably me. I've spent way more time fighting the trolls on multiple fronts than I wish I had the last couple of years, but the work goes on. And, you know, you understand its importance when you realize who's getting mad at it.
Shawn:State governments have also been targeting content moderation on social media platforms, and, as such, the landscape of online speech has become a battleground for legislative and legal conflicts. This has manifested in several key areas, each with significant implications for free speech and the future of online discourse. One of the most prominent developments has been the legislative attempts to limit content moderation on social media, particularly through laws passed, again, in Florida and Texas. These laws, enacted in 2021, aim to restrict the ability of large social media companies to remove or moderate certain types of content, especially content with political viewpoints. The Texas law, for instance, prohibits platforms with over 50 million monthly active users from censoring content based on viewpoint, while Florida's law imposes strict guidelines on how platforms can moderate content from political candidates and journalistic enterprises. These laws have sparked intense debate over their First Amendment implications. Proponents argue that they protect free speech by preventing platforms from silencing certain viewpoints, particularly conservative ones. Critics, though, contend that these laws actually violate the First Amendment by compelling private companies to host speech they may not agree with, infringing on their editorial discretion. Recently, the Supreme Court vacated lower court decisions that would allow these laws to take effect and remanded the cases with instructions to more closely consider the First Amendment implications. So for now, we wait.
Shawn:So what does all this mean for democracy? The proliferation of misinformation, disinformation and rumors in our political discourse has created ample space for the weaponization of government power, and the Republican Party has taken full advantage. This threatens American democracy by undermining free speech, eroding public trust in institutions and potentially causing long-term damage to democratic engagement. At the heart of this issue lies a paradoxical approach to free speech. While claiming to champion free expression, certain Republican-led initiatives have actually worked to stifle it. These actions contribute to a broader erosion of public trust in institutions. When government power is wielded to intimidate researchers, limit academic inquiry or interfere with private companies' content moderation practices, it undermines faith in the impartiality and integrity of both government and educational institutions. The accusations of censorship against tech companies, often amplified by right-wing political figures, further fuel this distrust, creating a climate where all information sources are viewed with skepticism.
Shawn:The potential long-term effects on democratic engagement are profound. As public trust in institutions wanes and the information landscape becomes increasingly fractured, citizens may become more disengaged from the democratic process. The rise of partisan echo chambers, exacerbated by both social media algorithms and government actions that limit diverse viewpoints, makes it harder for people to encounter and engage with differing perspectives. This polarization can lead to decreased willingness to compromise, a key component of democratic governance. Moreover, the persistent spread of misinformation, coupled with government actions that appear to validate certain false narratives, can lead to a distorted understanding of political realities. This can affect voting behavior, policy preferences, and overall civic participation. When large segments of the population operate on fundamentally different sets of facts, it becomes increasingly difficult to have productive political discourse or reach consensus on critical issues. Dr. Marwick describes this.
Dr. Marwick:I think that we sometimes overestimate the impact of disinformation on things like voting, right? These very direct, immediate effects. I think the larger effects of disinformation are much broader socially. They're about division, they're about a lack of commonality, they're about increased partisanship and polarization, and they're about people being very suspicious of the world in which they live, suspicious of who's telling them what, of whether they can rely on that information. And that decline in trust, I think, does have long-term democratic implications. I think that often, especially for social scientists, we're always trying to find the immediate impacts. But sometimes these impacts aren't immediate; sometimes they are long term. I was voting for Hillary Clinton regardless of who the Republicans put up against her, and I would vote for literally anybody over Donald Trump, right? Like, I don't care what information I consume. That's not going to change my mind.
Dr. Marwick:But, you know, I'm pretty set in my political views, and I think most people are; most people are not persuadable. But there's also always this kind of middle, right? These undecided voters, these people who probably are influenced by vibes, right, by just, oh, I've heard kind of bad things about this candidate or that candidate. And that's not necessarily because of disinformation. You know, studies have shown that the mainstream media in 2016 was far more critical of Hillary Clinton than they were of Donald Trump, which is kind of crazy to think about after the fact.
Shawn:And in this environment, authoritarians can thrive, exploiting the landscape around social media content moderation and misinformation to dismantle democracy and consolidate power in several ways. Here's Dr. Starbird again.
Dr. Starbird:I don't know that that's necessarily a goal, but it's certainly an outcome: people begin to just lose trust in information. And there has been sort of a theorized connection between that kind of loss of trust in information, and in other institutions that require a shared reality, and the shifts from democratic societies to more authoritarian societies, or the support of an existing authoritarian society through this kind of pervasive disinformation that has you lose trust in information. And the challenge, as you mentioned before, is that if you don't have trust in information, you kind of lose agency. You don't know that you can make a decision that's going to align with your goals, with how you know the world works. And so if you don't have trust in information, it's hard to be an informed citizen, and it's therefore hard to support democratic governance.
Shawn:As I tick through these ways in which authoritarians can exploit misinformation, disinformation, and rumors toward authoritarian ends, I wonder who might come to your mind. By constantly accusing legitimate institutions (media, universities, government agencies) of bias or censorship, authoritarians can create a climate of distrust, which makes it easier to dismiss factual information that challenges their narratives. Laws that force platforms to carry all political content equally can be exploited to amplify authoritarian messaging while drowning out opposing views. This creates an illusion of widespread support. The threat of government investigations or legal action against platforms, researchers, or critics who challenge authoritarian narratives can lead to self-censorship and reduced opposition. By encouraging partisan echo chambers and limiting exposure to diverse viewpoints, authoritarians can deepen societal divisions and present themselves as the only solution to chaos. Spreading doubt about electoral processes and results becomes easier when platforms are restricted in their ability to combat election-related misinformation. Paradoxically, laws ostensibly protecting free speech can be used to justify actual censorship of dissenting voices under the pretense of fairness or protecting democracy. And by flooding the information space with conflicting narratives and undermining education, authoritarians make it harder for citizens to discern truth from fiction, making them more susceptible to manipulation. These tactics collectively work to create an environment where democratic norms are eroded, public discourse is poisoned, and authoritarian control can gradually take hold under the facade of protecting freedoms. And so the dangers for American democracy are clear, and they are present, and they're barreling toward what very well could be our last free and fair election this November.
Shawn:As we wrap up this episode, it's clear that the landscape of American political discourse is at a critical juncture. The proliferation of misinformation, disinformation and rumors isn't just a matter of differing opinions or harmless political spin. It's a direct threat to the foundations of our democracy. We've seen how these tactics have been weaponized, particularly by certain Republican factions, to undermine trust in institutions, silence researchers and sow doubt in the minds of voters. As we approach the 2024 election, the stakes couldn't be higher. The spread of false narratives about election integrity, coupled with efforts to restrict voting access, creates a perfect storm that could fundamentally alter the democratic process.
Shawn:But perhaps most alarming is how this environment aids the rise of authoritarianism. When citizens can't agree on basic facts, when academic freedom is under attack and when government power is wielded to intimidate rather than inform, we create fertile ground for authoritarian figures to seize control. The degradation of trust in democratic institutions and the media leaves a vacuum that can be filled by those promising simple solutions to complex problems, even if those solutions threaten our fundamental rights and freedoms. However, this isn't a foregone conclusion.
Shawn:Each of us has a role to play in combating misinformation and protecting our democracy. It starts with being critical consumers of information, supporting quality journalism, and engaging in respectful dialogue with those who hold different views. We also have to remain vigilant against attempts to weaponize government power and stand up for academic freedom and free speech. So, as we head into this 2024 election and beyond, remember that democracy isn't a spectator sport. It requires active participation and protection. So stay informed and engaged, and commit yourself to truth. It's through that that we can ensure that the power of American democracy remains where it belongs: in the hands of the people. Check back next Sunday for another episode of After America.