You're pointing at a thing our own politicians and billionaires are currently doing and going "What if Russia did it too".
Understanding that the media amplifies particular stories to promote a perspective that is in their interest and against your own doesn't require the addition of a foreign power; that just muddies the issue.
Don't know why you're getting downvoted. Bots and media manipulation are a thing, and Russia and many other governments are almost certainly doing it on different scales. But you make a good point that our own governments are doing it too, and even before social media, stories were promoted or hushed up for reasons other than newsworthiness or public interest. That's not a conspiracy theory, that's basic media history of the last century.
Yeah there's a gulf of difference between what U.S. politicians and billionaires are attempting to do vs what the Russian oligarchs/politicians have already done.
Man, the kids were really prophetic with their slang. I'm from Michigan, so I've always been biased against Ohio, but goddamn if they don't give reasons to be.
People are gullible, not just right-wingers. You’re just more likely to perceive the other side as gullible and not notice the blind spots of your own. And well, we are living in a moment in history of a surge in right wing populism, which puts that side’s gullibility in full frontal display.
Some quotes from the study:
"Accordingly, a surplus of pro-conservative misinformation may indicate, simply, that conservatives are more gullible. This logic is illustrated by the story of Macedonian teenagers who converged to producing false stories catering to Trump supporters, rather than Bernie Sanders supporters, because it worked better."
"...misinformation catered more to conservatives, and this contributes to them being on average more likely to believe false information."
It probably only takes a staff on the order of a thousand people to make things go viral on the internet.
If your job is just to sign up for social media accounts (fill in the captchas, type in a name, upload a few images), you could easily create at least a hundred per day. Multiply that by a thousand people and that's one hundred thousand accounts per day.
Of course you'd have to post some comments occasionally to make it look real. But that could just be rewording the text from other comments. And if someone were actually doing this, YouTube comments would look like, well... exactly like YouTube comments look right now.
So figure a hundred thousand accounts per week with comments to make them look legit; that's millions of accounts per year. You'd want to space it out a bit so it wouldn't look suspicious, and you'd need to route the traffic through a botnet so the IPs come from the same country each account claims to be from. But within a year you'd have millions of accounts that all appear legitimate to any automated system checking them.
So now you've got the accounts and you want something to go viral. Have your thousand people start logging into accounts through your botnet: run the video or whatever, click like, leave a comment, maybe even click an ad so the social media company makes a bit of money and isn't incentivized to look too closely. This probably takes only around 10 seconds per account. You could make anything you want hit at least a million likes and comparable engagement within a day, which is probably far more than the algorithms need before they start recommending the content to legitimate users. From there it's all automatic.
Sure, a few thousand people sounds like a lot. But not for the government of a country that wants to run a disinformation campaign.
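The back-of-envelope numbers above can be sanity-checked with a quick script. Every figure here is the commenter's hypothetical assumption (staff size, signups per person, seconds per engagement), not measured data:

```python
# Sanity check of the bot-farm arithmetic in the comment above.
# All constants are the commenter's assumptions, not real measurements.

STAFF = 1_000                        # assumed people on the payroll
SIGNUPS_PER_PERSON_PER_DAY = 100     # assumed account-creation rate

daily_capacity = STAFF * SIGNUPS_PER_PERSON_PER_DAY   # max signups/day
weekly_pace = 100_000                # throttled pace to avoid suspicion
yearly_accounts = weekly_pace * 52   # accounts banked after one year

SECONDS_PER_ENGAGEMENT = 10          # log in, like, comment, click ad
WORKDAY_SECONDS = 8 * 60 * 60        # one 8-hour shift per person
engagements_per_day = STAFF * WORKDAY_SECONDS // SECONDS_PER_ENGAGEMENT

print(f"Daily signup capacity:  {daily_capacity:,}")     # 100,000
print(f"Accounts after a year:  {yearly_accounts:,}")    # 5,200,000
print(f"Engagements in one day: {engagements_per_day:,}")  # 2,880,000
```

So even at the throttled weekly pace, the "millions of accounts per year" claim holds, and a single day's shift at 10 seconds per engagement comfortably clears the "million likes within a day" figure.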
It probably only takes a staff on the order of a thousand people to make things go viral on the internet.
Depending on the site, maybe less than that.
It wasn't all that long ago that Reddit had "power users": a small handful of people, sometimes just one person running an account, who consistently pushed content viral on the site.
In both cases, it wasn't the original message that kicked off the firestorm, it was a deliberate strategy put forward by billion-dollar presidential campaigns.
Nobody knew about the "eating my neighbor's cat" post even after the debate. It took weeks to track down what Laura Loomer had whispered into Trump's ear. Nobody considered the "Hillbilly Elegy had a chapter where Vance fucks a couch" tweet important until celebrities and politicians began retweeting it as a means of disgracing a weird conservative sex pest.
If there's a rumor started by a smear campaign run out of an office in Moscow (and they're even halfway competent in their execution), you're likely only going to hear about it once it becomes the focus of some rhetorical exchange of fire involving a top-tier domestic social media celebrity or a Senate debate. Even then, you won't get to hear where it originated until the polls have long since closed, in much the same way nobody got the details on the Comey investigation of Hillary or the Georgia election-steal attempt by Trump until it was too late.
It isn't "one person" starting a rumor. Its an industry that feeds on rumors and is constantly regurgitating them to get your attention.
I mean, they don't just retweet them. They twist the narrative, write legitimate-looking articles on legitimate-looking websites that people can quote, and subtly promote civil unrest, as that's their ultimate goal.
Imagine what an ex-KGB agent with unlimited resources can do.
Oh, there's no need to imagine: I'm on the internet right now. I'm probably staring at this kind of state-actor bullshit on a daily basis without even knowing it.
Get agitators into communities and stoke fears, so that the messages get posted by the people you've stoked and you stay removed from it as the actual source.
Just like everything in the Trump era, that KGB agent would fail miserably, because why would something so ridiculous work? The most significant lasting legacy of MAGA politics will be the death of comedy, because who would write something so extreme? No one would believe it.