Monday, 29 July 2019

JOURNALISM AS ALGORITHM


Automating the News: How Algorithms Are Rewriting the Media. By Nicholas Diakopoulos

ISBN 9780674976986; Harvard University Press, 2019


At the end of a training course in standard data journalism, focused on Excel, scraping and visualizing data, a question came up: how should data skills be integrated into journalism education? The discussion ended in two opposing opinions. The first held that it was cheap and easy to integrate data journalism skills into the standard curriculum as a kind of specialization: instead of TV, radio, magazines or newspapers, another specialization would be created, highlighting the wide range of data journalism skills. The opposition took a longer view and defended the idea of creating a new curriculum that could be called computational journalism, in which computer science, and therefore coding in a language such as R, would be an important element. At the time I was not convinced by the arguments of the second group and believed that borrowing elements of computer science for data journalism would be the right approach.
Now I am less sure, and believe there is a demand for a separate programme in computational journalism. There are interesting developments in that direction. First, there is the growing interest in the use of R at leading news media.

Automating News

The Economist recently decided to publish its data analyses and the corresponding visualizations, created in R, on GitHub. This makes it easy to download the data and the code and try it yourself.
The Big Mac Index is an interesting example. The analysis and visualization in R is a small program: code typed in at the command prompt. To share this piece of code with others, The Economist uses the Jupyter notebook format. In a data journalism training course, the notebook can be used to follow the steps for calculating the Big Mac Index.
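To give a flavour of what such a notebook walks through, here is a minimal sketch of the raw index calculation in R. The data frame, prices and column names below are invented for illustration and are not taken from The Economist's published code.

library(dplyr)

big_mac <- data.frame(
  country     = c("United States", "Euro area", "Japan"),
  local_price = c(5.58, 4.05, 390),   # price of a Big Mac in local currency (illustrative)
  dollar_ex   = c(1, 0.88, 108)       # units of local currency per US dollar (illustrative)
)

us_price <- big_mac$local_price[big_mac$country == "United States"]

big_mac <- big_mac %>%
  mutate(dollar_price = local_price / dollar_ex,      # Big Mac price converted to US dollars
         usd_raw      = dollar_price / us_price - 1)  # raw over/under-valuation against the dollar

big_mac

A few lines like these, plus a chart, are essentially the whole exercise, which is why the notebook works well as training material.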
The Economist is not alone: the BBC also decided to give priority to R. Data journalists at the BBC use ggplot2, a visualization library in R, for their graphs and charts.
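As a minimal ggplot2 sketch of the kind of line chart such a workflow produces; the figures below are made up for illustration, and the styling is generic rather than the BBC's own theme.

library(ggplot2)

unemployment <- data.frame(
  year = 2014:2018,
  rate = c(7.4, 6.9, 6.0, 4.9, 3.8)   # illustrative figures, not real data
)

ggplot(unemployment, aes(x = year, y = rate)) +
  geom_line(colour = "#1380A1") +                     # draw the trend line
  labs(title = "Unemployment keeps falling",
       subtitle = "Unemployment rate, %",
       x = NULL, y = NULL) +
  theme_minimal()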
In the area of journalism education, Paul Bradshaw started an MA in Data Journalism at the Birmingham School of Media a year ago. "Coding and computational thinking being applied journalistically (I cover using JavaScript, R, and Python, command line, SQL and Regex to pursue stories)" is one of the elements of this new MA, Bradshaw writes on his blog.
Another example comes from the field of training. The yearly IRE conference has a growing number of workshops dedicated to R and machine learning, for example 'Making the leap from Excel to R'.
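That leap is smaller than it sounds: a pivot-table-style summary in Excel maps onto a few lines of dplyr. The data frame and column names below are invented purely for illustration.

library(dplyr)

grants <- data.frame(
  region = c("North", "North", "South", "South", "East"),
  amount = c(120000, 80000, 50000, 150000, 90000)   # invented figures
)

grants %>%
  group_by(region) %>%                               # what a pivot table would group on
  summarise(total = sum(amount), grants = n()) %>%   # what a pivot table would show
  arrange(desc(total))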
In his new book 'Automating the News: How Algorithms Are Rewriting the Media', Nicholas Diakopoulos argues that the use of algorithms in journalism is the new key concept. He concludes:
'It is my contention that a separate graduate degree in Computational and Data Journalism is needed in order to teach a new breed of educator-practitioner-scholar.'

Quakebot
Computational journalism, according to Diakopoulos, is the widespread use of algorithms in journalism practice. This practice, however, covers a wide range. The robo-journalist was one of the first algorithms to be developed: already five years ago a quake bot wrote standard news stories based on earthquake data. Now automated news writing is a standard procedure in financial reporting at Reuters and Bloomberg, which 'both sell specialized information terminals to stock traders, automation parses text documents, such as earning releases, and almost instantly generates and publishes a headline to a terminal interface that reflects whether the company beat or missed earnings expectations', writes Diakopoulos. This gives reporters more time to dig deeper than just reporting the numbers, and the automated news production is much faster.
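The template logic behind such a headline is simple. The function below is a toy illustration of the idea in R, not Reuters' or Bloomberg's actual system; the company name and figures are invented.

earnings_headline <- function(company, eps_actual, eps_expected) {
  verdict <- if (eps_actual >= eps_expected) "beats" else "misses"   # beat or miss?
  sprintf("%s %s expectations with EPS of $%.2f (forecast: $%.2f)",
          company, verdict, eps_actual, eps_expected)
}

earnings_headline("Acme Corp", 1.32, 1.25)
# "Acme Corp beats expectations with EPS of $1.32 (forecast: $1.25)"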
More interesting is the use of algorithms in data journalism: writing code to scrape the latest data, or using a script that sends a message when certain numbers are spiking and are therefore newsworthy, such as fast-rising unemployment or a growing influx of refugees.
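Such an alert script can be very small. The sketch below flags a series when the latest value jumps a chosen percentage above the recent average; the data and the threshold are assumptions for illustration.

check_spike <- function(values, threshold = 0.2) {
  latest   <- tail(values, 1)            # most recent figure
  baseline <- mean(head(values, -1))     # average of the earlier figures
  if (latest > baseline * (1 + threshold)) {
    message(sprintf("Alert: latest value %.0f is %.0f%% above the recent average",
                    latest, 100 * (latest / baseline - 1)))
  }
}

unemployment_claims <- c(21000, 20500, 21200, 20800, 27500)  # invented figures
check_spike(unemployment_claims)

In a newsroom the same idea would run on a schedule against freshly scraped data and post the alert to e-mail or Slack instead of the console.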

Prediction
Prediction is an interesting use of algorithms, already common in sports reporting in the USA. New for Europe was a story on R-Bloggers showing that the Netherlands had a 5% chance of winning the FIFA Women's World Cup. Politics, elections and public administration are another area. It is, for example, not so difficult to predict the nomination of a mayor (age, gender and political party) based on social characteristics of cities (number of inhabitants, average income, unemployment, etc.).
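A hypothetical sketch of that kind of model in R: a simple linear regression estimating one characteristic of the likely nominee (age) from city characteristics. All data and variable names are invented for illustration; a real analysis would need a proper dataset and far more care.

cities <- data.frame(
  inhabitants  = c(25000, 80000, 150000, 40000, 300000, 60000),
  avg_income   = c(28000, 31000, 34000, 27000, 36000, 30000),   # invented figures
  unemployment = c(5.1, 4.3, 6.8, 5.9, 7.2, 5.0),
  mayor_age    = c(52, 48, 57, 50, 61, 49)
)

model <- lm(mayor_age ~ inhabitants + avg_income + unemployment, data = cities)

# predicted age of the likely nominee for a new, hypothetical city
predict(model, newdata = data.frame(inhabitants  = 100000,
                                    avg_income   = 32000,
                                    unemployment = 6.0))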
Of course, predictions are never 100% correct, and that makes their use in journalism tricky and perhaps dangerous. On top of this, there is bias, because the prediction model is trained on a fixed data set.
News bots were a hot issue a year ago; all media wanted to have one. The BBC, for example, has a range of different bots to disseminate news, and The Washington Post created a chat bot on Facebook Messenger. And let's not forget the algorithms of Google and Facebook that bring you the latest. Bots make the spreading of news faster and more difficult to check; they could easily be used to produce a stream of fake news.

Human Centered
Diakopoulos uses a lot of interesting American examples, and his analysis of the use of algorithms in journalism is profound; he shows that journalism will definitely be changed by the widespread use of algorithms. At the same time he states that human insight in reporting and editing is still needed and important to control the news flow: 'I have stressed in this book that as algorithms grow in their capacities of data mining, automated content production, and curation, journalists and society must not forget the role people will play in the future of algorithmic media... we have agency in how these systems ultimately operate and influence the media.'
