Koyaanisqatsi
Veteran Member
This is from a different thread that I think warrants its own thread (in spite of the fact that I think I already started a thread along these lines). Regardless, the point here is to see how well we can now estimate the actual impact of the Russian influence on the 2016 election.
First, this from a clandestine study undertaken by Facebook that is strong evidence that what Russia did works:
Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others. Data from a large real-world social network, collected over a 20-y period suggests that longer-lasting moods (e.g., depression, happiness) can be transferred through networks [Fowler JH, Christakis NA (2008) BMJ 337:a2338], although the results are controversial. In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.
The study is also summarized here:
But now Facebook, the world's biggest social networking site, is facing a storm of protest after it revealed it had discovered how to make users feel happier or sadder with a few computer key strokes.
It has published details of a vast experiment in which it manipulated information posted on 689,000 users' home pages and found it could make people feel more positive or negative through a process of "emotional contagion".
In a study with academics from Cornell and the University of California, Facebook filtered users' news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content" and the opposite happened.
The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."
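Just to make the mechanics of that experiment concrete, here's a toy simulation of the same A/B design: withhold a fraction of the positive posts one group would otherwise see, then compare the two groups' positive-posting rates afterward. The baseline rate, contagion coefficient and feed size are invented for illustration; they aren't the study's actual parameters.

```python
import random

# Toy sketch of the experiment's design described above: hold back a fraction
# of the positive posts a "treatment" user would otherwise see, and measure
# how their own positive-posting rate responds. All numbers below are
# made up for illustration; they are NOT the study's parameters.

random.seed(0)

BASELINE_POSITIVE_RATE = 0.40   # assumed share of a user's own posts that are positive
CONTAGION_COEFF = 0.10          # assumed strength of the exposure effect
FEED_SIZE = 100                 # posts seen per user in the observation window

def simulate_user(positive_filter: float) -> float:
    """Return a user's positive-posting rate after filtering their feed.

    positive_filter is the fraction of positive friend posts withheld
    from the feed (0.0 = control group, e.g. 0.1 = treatment group).
    """
    feed = ["pos" if random.random() < 0.5 else "neg" for _ in range(FEED_SIZE)]
    shown = [p for p in feed if not (p == "pos" and random.random() < positive_filter)]
    exposure = shown.count("pos") / len(shown)   # share of positive posts actually seen
    # Simple linear contagion model: output rate drifts toward the exposure level.
    return BASELINE_POSITIVE_RATE + CONTAGION_COEFF * (exposure - 0.5)

control   = sum(simulate_user(0.0) for _ in range(10_000)) / 10_000
treatment = sum(simulate_user(0.1) for _ in range(10_000)) / 10_000
print(f"control: {control:.4f}  treatment: {treatment:.4f}")
# The treatment group's positive-posting rate comes out slightly lower,
# mirroring the direction (not the size) of the reported effect.
```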
Presciently:
Jim Sheridan, a member of the Commons media select committee, said the experiment was intrusive. "This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," he said. "They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it."
Most notably in regard to the timeline leading up to the 2016 election:
It was claimed that Facebook may have breached ethical and legal guidelines by not informing its users they were being manipulated in the experiment, which was carried out in 2012.
And then, finally, there is this (emphasis in original):
The effect the study documents is very small, as little as one-tenth of a percent of an observed change. That doesn’t mean it’s unimportant, though, as the authors add:
Given the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences. […] After all, an effect size of d = 0.001 at Facebook’s scale is not negligible: In early 2013, this would have corresponded to hundreds of thousands of emotion expressions in status updates per day.
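To see why that isn't hand-waving, here's the back-of-the-envelope arithmetic the authors are gesturing at. The daily post volume below is an assumed order of magnitude (not a figure from the paper), and d ≈ 0.001 is read loosely as a ~0.1% per-post shift, which is a simplification of what Cohen's d measures.

```python
# Back-of-envelope sketch of the "small effect, huge scale" point above.
# The per-day post volume is a made-up illustrative number, NOT a figure
# from the study.

status_updates_per_day = 500_000_000   # assumed order of magnitude for early 2013
effect_size_d = 0.001                  # standardized effect reported in the paper

# Reading d ~ 0.001 loosely as a ~0.1% shift in the rate of emotional
# expressions per post:
shifted_expressions_per_day = status_updates_per_day * effect_size_d

print(f"{shifted_expressions_per_day:,.0f} emotion expressions shifted per day")
# -> 500,000 -- i.e. "hundreds of thousands" per day, as the authors note
```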
That’s just from Facebook’s own study. Here is a more in-depth study, The IRA, Social Media and Political Polarization in the United States, 2012-2018, which is summarized here:
If you’ve only checked into this narrative occasionally during the last couple of years, the Comprop report is a great way to get a bird’s-eye view of the whole thing, with no “we take this very seriously” palaver interrupting the facts.
If you’ve been following the story closely, the value of the report is mostly in deriving specifics and some new statistics from the data, which Oxford researchers were provided some seven months ago for analysis. The numbers, predictably, all seem to be a bit higher or more damning than those provided by the companies themselves in their voluntary reports and carefully practiced testimony.
Previous estimates have focused on the rather nebulous metric of “encountering” or “seeing” IRA content put on these social metrics. This had the dual effect of increasing the affected number — to over 100 million on Facebook alone — but “seeing” could easily be downplayed in importance; after all, how many things do you “see” on the internet every day?
The Oxford researchers better quantify the engagement, on Facebook first, with more specific and consequential numbers. For instance, in 2016 and 2017, nearly 30 million people on Facebook actually shared Russian propaganda content, with similar numbers of likes garnered, and millions of comments generated.
Note that these aren’t ads that Russian shell companies were paying to shove into your timeline — these were pages and groups with thousands of users on board who actively engaged with and spread posts, memes and disinformation on captive news sites linked to by the propaganda accounts.
The content itself was, of course, carefully curated to touch on a number of divisive issues: immigration, gun control, race relations and so on. Many different groups (i.e. black Americans, conservatives, Muslims, LGBT communities) were targeted; all generated significant engagement.
Here’s their breakdown:
