American University communication experts are warning that incorrect information, much of it spread intentionally, is rampant in the run-up to Nov. 3.
“We tend to be in a vacuum with social media and at times we follow people that have similar beliefs and similar likes,” said Jason Mollica, a professorial lecturer in the School of Communication at American University.
A voter outside a Northern Virginia polling site agreed that information can be viewed in silos.
“I’m mostly on Facebook,” said Jessie Chirino, 34, at the Courthouse, Virginia, voting location. “I don’t see much of what I dislike, maybe because I cater it more towards what I do like to see.”
According to Techlash, a report by Gallup and the Knight Foundation on Americans’ concerns about technology companies, the majority of Americans surveyed believe the burden for the spread of misinformation falls on tech and social media companies. The issue was front and center again Oct. 28 on Capitol Hill, when a Senate hearing pressed the CEOs of Twitter, Google and Facebook on how they do, and don’t, moderate political speech on their platforms.
More than three-fourths of Americans polled believe that tech giants have too much power to sway public opinion. The report said 93% of Americans are concerned about misinformation online, and 71% say tech and internet companies are doing a poor job of preventing its spread.
To help voters sort through misinformation and disinformation about the election, AU experts focused on these warning signs:
1. Be wary of something that looks “too good to be true”
“What’s most helpful to look out for is something that confirms your beliefs a little too much,” said Ericka Menchen-Trevino, an AU assistant professor.
“For example, half the country, if they saw Donald Trump kicked a puppy, they would believe it. But if the other side saw Donald Trump ran into a building and saved a child, they would believe it,” she said.
Mollica agreed. “It comes to how much do you believe the messenger,” he said. “People will believe anything. If you start seeing posts on Facebook that really look too good to be true, be aware.”
2. Be skeptical of information not verified by a reliable source
“If you don’t see many other people in your network posting things like this, that’s something to be aware of,” Mollica said.
Outside the Courthouse voting station, Michael Angeloni, 32, a precinct captain for the Arlington County Democratic Committee, said he tries to do his research and verify sources.
“You should pay attention to what the link is,” Angeloni said. “Depending on who’s sharing it, that can be a good way to know. What kind of news source is it, what kind of news do they post. If it’s not an outlet you recognize, look into it.”
Chirino said she looks on Google to see if other sources corroborate what she saw.
“If I see it on multiple sources, then that probably confirms it happened,” she said. “Google is my best friend.”
3. Be alert if the account posting the information looks suspicious
Saif Shahin, an AU assistant professor, said voters should learn to spot which accounts are bots. Bots are automated, computer-generated accounts that post online, most often on Twitter.
“Bots often spread misinformation,” Shahin said. “They have very few followers but tend to follow a lot of people. They’re there to spread misinformation.”
Mollica said you can spot bots by looking at both the username and the content. If the account was created only recently, if there are strings of numbers in the username, or if the username and profile name don’t match, those are indicators.
“How are they posting?” he said of the content itself. “If the content in the post is very broken, or seems like it was generated by a computer, that’s something that would be a red flag.”
4. Avoid the “sock puppet” account
Aram Sinnreich, an AU professor and chair of the Communication Studies division, said a sock puppet account is one used solely for deception.
“For example, you might see a Facebook group of people claiming to be ‘LGBT Americans for Trump,’” he said. “But you might not recognize the names of the organizers because they’re not real people, they’re characters created for the purpose of spreading disinformation.”
5. Be vigilant about content that exploits mistrust in an institution
As an example, Mollica pointed to the QAnon conspiracy theory, which was spread by people who shared a distrust of the government and the mainstream media.
Sinnreich agreed. “Public trust levels in the media, government, scientific and academic institutions are at all-time lows right now,” he said. “It’s because the public sphere has been flooded with narratives that challenge the integrity and authority of these institutions often without basis.”
The best advice the experts had was to be vigilant, verify all sources and be thorough before reposting anything. Sinnreich recommended that users do this whether a post aligns with or goes against their own political beliefs.
“I think, for instance, that Donald Trump is a racist,” he said. “But if someone I’ve never heard of says, ‘I heard Donald Trump say the N-word,’ even if I believe Donald Trump is capable of that, I should not retweet or repost that claim until I verify its truth.”
Mollica said the days leading up to the election will bring a flood of information, true or not.
“In the coming days, people need to make sure they’re paying close attention to what’s being posted,” he said. “As we get closer and closer to the election, you need to have a clear mind.”