The paper “Designing What’s News – An Ethnography of a Personalization Algorithm and the Data-Driven (Re)Assembling of the News” is a great illustration of Kranzberg’s First Law of Technology, which states that “Technology is neither good nor bad, nor is it neutral.”
In this paper, published in Digital Journalism, Anna Schjøtt Hansen and Jannie Møller Hartley report on the development of a personalisation algorithm, and its deployment at a large news organisation in Denmark.
The algorithm was introduced to personalise the news displayed to each reader visiting the newspaper's website. Yet, as Hansen and Hartley show, the introduction of the algorithm doesn't just change how the front page looks for different readers. Rather, in the process of quantifying what is newsworthy, and of committing the editors' decision-making to a formula that is binary in its workings and can be applied across all articles, the algorithm changes the very essence of what counts as timely and local news and, therefore, of what is newsworthy or not.
Here is my summary of some of the changes reported in Hansen and Hartley’s paper:

I saw a similar effect in the use of algorithms to classify who might be a good customer, or to detect money laundering. Whenever the data scientists tried to capture human behaviour in a formula, or to convert decision-making into a binary choice, a new phenomenon was constructed which, intrinsically, was neither good nor bad, but which was, nonetheless, consequential.
Have you come across other good examples of how algorithms are developed, and how they shape behaviour?