For Society

Because it can create content quickly, cheaply, at large scale, and potentially tailored to the needs of individual readers, automated journalism is expected to substantially increase the amount of available news. While this development might help meet people’s demand for information, it could also make it harder for them to find the content that is most relevant to them. As news consumers cope with the resulting information overload, the importance of search engines and personalized news aggregators, such as Google News, is likely to increase further.

Search engine providers claim to analyze individual user data (e.g., location and historical search behavior) to provide news consumers with the content that most interests them. As a result, different news consumers might receive different results for the same keyword search, which carries the risk of partial information blindness, as posited by the so-called “filter bubble” hypothesis.65 According to this idea, personalization leads individuals to consume more and more of the same information, as algorithms serve only content that users like to read or agree with. Consequently, people would become less likely to encounter information that challenges their views or contradicts their interests, which could pose risks for the formation of public opinion in a democratic society.
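To make the hypothesized mechanism concrete, the toy simulation below sketches a deliberately naive recommender that ranks topics solely by a user’s past clicks. It is not the personalization logic of any actual search engine or news aggregator; the topic list, ranking rule, and click model are invented for illustration. The point it demonstrates is simply that, once recommendations are driven only by prior clicks, topics the user has never clicked stop being shown at all.

```python
import random
from collections import Counter

# Illustrative sketch only: a naive, click-count-based recommender,
# not any real search engine's personalization algorithm.

TOPICS = ["politics", "sports", "science", "culture", "economy"]

def recommend(click_history, k=3):
    """Rank topics by how often the user clicked them before; return the top k."""
    counts = Counter(click_history)
    return sorted(TOPICS, key=lambda t: counts[t], reverse=True)[:k]

def simulate(rounds=50, seed=1):
    """Simulate a user who only ever clicks among the topics shown to them."""
    random.seed(seed)
    history = [random.choice(TOPICS)]      # a single initial click
    for _ in range(rounds):
        shown = recommend(history)         # personalized selection
        history.append(random.choice(shown))  # click happens within what is shown
    return Counter(history)

if __name__ == "__main__":
    # Exposure concentrates on the few topics clicked early on;
    # the remaining topics are never recommended again.
    print(simulate())
```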

The filter bubble hypothesis has become widely popular among academics, as well as the general public. Eli Pariser’s 2011 book, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think,66 has not only become a New York Times bestseller but also attracted more than 1,000 citations on Google Scholar through October 2015. However, despite the theory’s popularity and appeal, the empirical evidence available to date does not support the existence of the filter bubble: most studies find either no, or only very small, effects of personalization on search results.67 Of course, this may change as the amount of available content—and thus the need for personalization—increases and algorithms for personalizing content continue to improve. The study of potential effects of personalization, whether positive or negative, remains an important area of research.

More generally, a further increase in, and more sophisticated use of, automated journalism will eventually raise broader questions that future research must address. If algorithms are employed for public interest journalism, questions will arise as to whether we can and should trust them as a mechanism for providing checks and balances, identifying important issues, and establishing a common agenda for the democratic process of public opinion formation. Furthermore, future research will need to study the implications for democracy if algorithms take over journalism’s role as a watchdog of government.