The Computational Propaganda Project

Oxford Internet Institute, University of Oxford

How Do You Fix Someone Else’s Election?

Sam Woolley was interviewed on BBC radio to discuss the project’s research and the use of bots for election meddling. Smears, bots and bags of cash – we reveal some of the tricks used for fiddling elections around the world. German Chancellor Angela Merkel’s security chiefs say Russian intelligence is actively trying to influence next month’s…


Fake News Bots Are Here

Project member Sam Woolley was interviewed about bots and the 2016 US Election on NPR. How do you judge public opinion on any given issue? What others are thinking? Paying attention to? If social media play into your read, watch out. When it comes to politics in particular, social media can be overrun with, twisted…


Decades Later, Governments Still Wary of Social Media

Project member Gillian Bolsover contributed to this story published by the Voice of America. When the dust settled, Iran published social media photos of protesters on a pro-Ahmadinejad website and circled their faces in red “in an attempt to identify individuals who participated in the protests,” said researcher Gillian Bolsover of the Oxford Internet Institute at the…


What Facebook Knows

The project’s research and writing were discussed in Vice News. To answer these questions conclusively, academic researchers have said that Facebook could very easily clear the air by releasing more of its data. But just as the company keeps its algorithm under wraps, the company has thus far declined to share broad data about the…


Digging up facts about fake news: The Computational Propaganda Project

Our project work was covered by the OECD. The phenomenon of junk news and its dissemination over social media platforms have transformed (some say destroyed) political debates. The combination of automation and propaganda, also called computational propaganda, can shape public opinion. The trouble is, how can we tell the difference between fake facts and real…


We the…Bots & Trolls

Project member Nick Monaco was interviewed on PBS about his Taiwanese case study. The full video is available from PBS.

Spreading fake news becomes standard practice for governments across the world

The project’s research on government-sponsored social media manipulation was covered in the Washington Post. These propaganda efforts exploit every social media platform — Facebook, Twitter, Instagram and beyond — and rely on human users and computerized “bots” that can dramatically amplify the power of disinformation campaigns by automating the process of preparing and delivering posts….


Please Prove You’re Not a Robot

The project’s research was featured in a New York Times column written by Tim Wu. Robots posing as people have become a menace. For popular Broadway shows (need we say “Hamilton”?), it is actually bots, not humans, who do much and maybe most of the ticket buying. Shows sell out immediately, and the middlemen (quite…


Government ‘Cyber Troops’ Manipulate Facebook, Twitter, Study Says

The project’s recent paper on government efforts to manipulate public opinion online was covered by Bloomberg. Governments around the world are enlisting “cyber troops” who manipulate Facebook, Twitter and other social media outlets to steer public opinion, spread misinformation and undermine critics, according to a new report from the University of Oxford. Adding to growing evidence…


How Bots Win Friends and Influence People

Project PI Phil Howard was interviewed for a short article in IEEE Spectrum. Every now and then sociologist Phil Howard writes messages to social media accounts accusing them of being bots. It’s like a Turing test of the state of online political propaganda. “Once in a while a human will come out and say, ‘I’m not a…

