Data journalism is already more than fifty years old. It started in the 1960s as 'precision journalism' with Phil Meyer, became computer-assisted research and reporting (CARR), and is now called data journalism. The shortest definition of data journalism is 'social science done on deadline' (Steve Doig). We incorporate the tools of the social sciences to analyze data and include the results in our storytelling.
When the field emerged, some 10-15 years ago, practicing data journalism required extra skills and training: scraping data, cleaning and analyzing it in Excel, making graphs and maps, getting the data into the story. All of this required training beyond regular journalism, and data journalism therefore became a specialization.
The field is changing fast, and data journalism is becoming a do-it-yourself toolkit that everybody can use with a minimum of skills and understanding. Take a tool like Flourish (https://app.flourish.studio/), for example: put the data in, push a button, and get a graph or a map. Or the latest: Workbench, a project from the Columbia Journalism School in New York that lets you clean, scrape, analyze and visualize data without coding. Sign up and get started: http://workbenchdata.com/. All the data journalism tools integrated in one package.
Reflecting on data journalism on his Online Journalism Blog, Paul Bradshaw distinguishes two approaches to data journalism training: teaching it slow or teaching it fast. Teaching data journalism fast works as follows: “For many years I began my introductory data journalism classes with basic spreadsheet techniques, followed by visualization sessions to show them how to bring some of the results to life. In 2016, however, I decided to try something different: what if, instead of taking students through the process chronologically, we started at the end — and worked backwards from there? The class worked like this: students were given a spreadsheet of several tables already ready to be turned into a chart”. The new tools just mentioned not only make data journalism easy, but also clear the way for thinking about the story to be produced, rather than about the technology and number crunching behind it.
If you ask journalists why they chose this profession, a likely answer is that they hated mathematics. What a pity: the data, the numbers, are back! All kinds of organizations are collecting data, from governments and NGOs to private firms and companies. Some of these data are 'open source' or 'open data' and can be used in our reporting. Take for example stories in the Economist about GDP and sovereign debt in Sub-Saharan Africa, or about dirty cooking fuels. These data and the related stories can be produced easily with tools from the fast track. However, knowledge from the slow track is indispensable. Bradshaw: “If the challenge in teaching ‘fast’ data journalism is how to boil it down to the essentials and motivate word-oriented students, teaching ‘slow’ data journalism brings a very different challenge: how to do justice to the vast diversity of the field”. The hard core of data journalism entails more than pushing a button to create a graph: it takes some basic statistical knowledge to calculate, for example, key figures, relationships between variables, and confidence intervals. Deepening the statistical understanding of data is, I believe, an important element of the slow training. Possibilities include an introduction to the R project, scraping data, going deeper into design, and introducing D3 graphics (based on JavaScript) and Plotly (https://plot.ly/).
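The statistics meant here are modest. As a minimal sketch (in Python rather than R, and using made-up illustrative numbers, not real figures), computing a key figure with a confidence interval and a correlation between two variables might look like this:

```python
import math
import statistics

# Hypothetical example data (illustrative only, not real figures):
# GDP growth (%) and debt-to-GDP ratio (%) for eight countries.
growth = [3.1, 2.4, 5.0, 4.2, 1.8, 3.7, 2.9, 4.5]
debt = [55.0, 70.0, 40.0, 48.0, 82.0, 50.0, 61.0, 44.0]

# A key figure: the mean, with a 95% confidence interval
# (normal approximation, z = 1.96 -- fine for a quick newsroom check).
mean = statistics.mean(growth)
sem = statistics.stdev(growth) / math.sqrt(len(growth))
ci = (mean - 1.96 * sem, mean + 1.96 * sem)

# A relationship between variables: Pearson correlation coefficient,
# computed by hand so it runs on any Python 3 version.
mg, md = statistics.mean(growth), statistics.mean(debt)
cov = sum((g - mg) * (d - md) for g, d in zip(growth, debt))
r = cov / math.sqrt(
    sum((g - mg) ** 2 for g in growth) * sum((d - md) ** 2 for d in debt)
)

print(f"mean growth: {mean:.2f}%, 95% CI: ({ci[0]:.2f}, {ci[1]:.2f})")
print(f"growth vs debt correlation: {r:.2f}")
```

A negative correlation here would be the start of a question for a story, not the story itself; the slow-track training is what teaches a journalist to check such a figure before publishing it.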
Reflecting on my training in Africa, I think that the fast approach to teaching data journalism has been the most successful (see also: https://d3-media.blogspot.com/2015/01/10-lessons-from-data-journalism-training.html). Following Bradshaw: first, because the fast track starts with the context of the participants, and second, because it is a problem-solving activity. The easy data journalism tools make it possible to work through an all-in-one example, resulting, after some puzzling with the tools, in a graph and a story: that is, in journalism.