Political bots

Project on Algorithms, Computational Propaganda, and Digital Politics

Auditing for Transparency in Content Personalization Systems

Do we have a right to transparency when we use content personalization systems? Building on prior work in discrimination detection in data mining, I propose algorithm auditing as a compatible ethical duty for providers of content personalization systems to maintain the transparency of political discourse. I explore barriers to auditing that reveal the practical limits of the ethical duties of service providers. Content personalization systems can function opaquely and resist auditing. However, the belief that highly complex algorithms, such as bots using machine learning, are incomprehensible to human users should not be an excuse to surrender high-quality political discourse. Auditing is recommended as a way to map and redress algorithmic political exclusion in practice, even as the opacity of algorithmic decision making remains a significant challenge to its implementation.

Download here.

Mittelstadt, B. (2016). Automation, Algorithms, and Politics | Auditing for Transparency in Content Personalization Systems. International Journal of Communication, 10, 12. Retrieved from http://ijoc.org/index.php/ijoc/article/view/6267

algorithms, Automation, Bots, Brent Mittelstadt, ethics, information ethics, OII, personalization, Politics, recommendation systems, transparency

Phil Howard • 15th October 2016
