On Tuesday, 8 November 2022, American citizens will vote in the midterm elections, choosing their representatives in Congress.
Midterms usually draw fewer voters than presidential elections: turnout averages around 40% of eligible voters (as opposed to the last presidential election, which had a 67% turnout).
However, the last midterms, in 2018, saw the highest turnout since 1978: more than half of eligible Americans cast a vote, presumably as a result of growing polarization and political activity. Judging by the early voting, the 2022 midterms will achieve a similar turnout: last Friday, CNN reported that more than 5.8 million Americans had already cast their vote.
A Storming, Susceptible Social Sphere
The high turnout is actually no surprise to those following the discourse on social media. Republicans, Democrats, Pro-Trump, Pro-Biden, Kathy Griffin followers, civil war skeptics and WW3 prophets – all political parties, entities and voters seemed to find their way into social conversations. Including, of course, malicious figures: bots, trolls, avatars, and other disinformation spreaders pulling the strings, trying to affect election outcomes.
There are three common tactics used by those malicious players to affect social conversations:
1. Good Ol’ Disinformation Spreading
Disinformation spreaders thrive during election season, as they can easily attach themselves to the various trending topics to intentionally spread false information and mislead us. As those hot topics soar higher than ever, it’s easy to slip lies and fake information into the mix. With hundreds of posts, shares and replies every minute, fake messages might not be thoroughly checked and debunked, and the more people see and engage with them, the more likely people are to believe them. You might be more certain than ever about the safety of vaccinations, but the more you stumble upon fake claims that vaccinations cause autism, the more susceptible you become to that fake information.
Enticing unsuspecting authentic profiles to share disinformation is the main goal of disinformation spreaders. Even responding to those fake profiles and contradicting them raises their engagement rate and spreads their message. Whether you believe them or not, it’s a win-win situation for them. That is why fake profiles tend to show more hostility and negativity than real profiles: even our anger promotes their cause.
2. “Drowning the Noise”: Creating Chaos
This technique refers to the way in which malicious players sow confusion and diversion to distract authentic users from the facts, or to make them give up on understanding and forming an opinion altogether. It shares common ground with the debate tactic called “whataboutism”: imagine you’re discussing abortion rights, and somebody jumps in and asks, “But what about Ukraine? You’re completely ignoring it!” This declaration changes the focus of the conversation: you end up not discussing what you intended, and probably grow sick and tired of the whole debate.
Just like in this example, “Drowning the Noise” isn’t necessarily about disinformation. While disinformation can definitely be used to distract, it is only a means to an end: the real goal is distraction itself. Those “Noise Drowners” use our frustration (and sometimes our rage) to divert us from the topics that are important to us.
3. “Poisoning the Well”: Mal-information
Mal-information is information that is real, but is presented out of context. For example – sharing something a candidate said ten years ago as a recent quote, a photo from a war that was actually taken on a completely different continent, or something said as a joke presented in a serious context. Like the previous tactics, this one is meant to be shared: to distract, to enrage, to mislead.
Floating Voters Under Attack
Stopping this endless cycle is mostly about personal awareness and responsibility.
A recent Cyabra article, “Flattening the Disinformation Curve” (part of our “Intro to Disinformation” series), explains how to double check every piece of information you come across, what you should be on the lookout for, and how to avoid spreading disinformation yourself.
However, it’s also important to understand (both as individuals and as part of a community, state or nation) that disinformation is never equally distributed. Some people are targeted much more than others, based on their social media habits, the online communities they are part of, the profiles they follow, and even the state they live in. In swing states, disinformation spreaders stand a much better chance of getting their message across and affecting the outcome of the elections.
While disinformation spreaders might not even bother with states that are solidly red or blue, floating voters who express their hesitations and raise questions on social media are easy targets, as their opinion is perceived as open to being swayed one way or the other. Those voters might struggle with their vote until the very last minute, which makes them especially vulnerable to disinformation. It might be just a matter of finding the topic they’re most passionate about, or catching their eye as someone they follow shares or responds to a trending topic.
So, should we suspect everyone? Not necessarily. But it is important to check, double-check and triple-check every fact you come across and every user you interact with, especially in trending topics and during heated debates. Don’t waste your time on raving trolls and ranting bots. In the end, you get to decide what’s most important.
Fear You’re Interacting With a Fake Profile? Ask Cyabra
If you suspect an inauthentic profile is spreading disinformation on social media, tag us on Twitter or LinkedIn, and we’ll be happy to check and let you know if this profile is fake or real. Our mission is to uncover the good, bad and fake of social media, and we’re always delighted to help.