Philip N. Howard’s latest book, “Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives,” has been reviewed in the Washington Post:
All those not living in a cave know by now that social media platforms such as Facebook, Twitter, YouTube, Instagram and even Tinder have become vehicles for a veritable tsunami of mendacious and polarizing information. Debates rage about whether Russian troll farm efforts changed the outcome of the 2016 presidential election. (The best analysis suggests they did not.) The nonpartisan Brennan Center for Justice has already warned that Russian interventions in this year’s vote will be “more brazen” than in 2016.
But how much of this is really new? Has technology made the political lie any different from its Attic or Progressive-era precursor? Philip Howard, the author of “Lie Machines,” is unquestionably well-placed to illuminate this question. Director of the Oxford Internet Institute, Howard was asked in 2017 by the Senate Intelligence Committee to conduct a postmortem on the social media activities of the Russian Internet Research Agency. A seemingly modest operation run out of a nondescript St. Petersburg office with between 40 and 100 employees and a $10 million budget, the IRA — notice how even the name will baffle a standard search engine — has given President Vladimir Putin a large return on his small investment.
The work of Howard and his team was pivotal in clarifying the Russian strategy in 2016. They showed, for instance, how the IRA targeted Americans at the poles of the political spectrum, exacerbating their divisions, and flooded swing districts with misleading or inflammatory advertisements. Tens of millions of users viewed IRA ads.
Howard cautions against optimism that the quality of online political discourse will improve soon. To the contrary, a new tool kit for Web-based public lies has been tested by Russia and China, for use first at home and then against foreign foes. It is diffusing quickly to more nations. Howard counts five other countries — India, Iran, Pakistan, Saudi Arabia and Venezuela — that are using the same tool kit against overseas democratic publics. In 2020, there were “organized social media misinformation teams” working for parties and governments in some 70 countries. Howard spoke to firms in Poland and Brazil that are helping in those efforts and found robust competition among producers of the mediated lie. In this market, his analysis suggests, the incentives driving supply are unlikely to abate.