While this is not a historiographical topic per se, the issue recently resurfaced with newly published papers, and it is one of those exemplary Big Data cases that generate both publicity and prototypical arguments. The story itself is, as always, complicated, which makes it a good example for the historiography of science.
By 2009, some of Google's researchers had turned their Friday project into a service that used MapReduce over localized search queries to predict flu outbreaks. Their paper was published in Nature (a reprint is found here) and generated enormous interest, because they used a variety of clever techniques to keep the theoretical assumptions to a minimum.
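Under the hood, the published model was remarkably simple: a univariate linear fit between the log-odds of the flu-related query fraction and the log-odds of the CDC-reported rate of influenza-like-illness (ILI) physician visits. The heavy MapReduce lifting in the real system went into selecting which of the millions of candidate queries to count; what follows is only a minimal sketch of the regression step, with invented toy weekly numbers (not real CDC or Google data):

```python
import numpy as np

def logit(p):
    """Log-odds transform used in the 2009 Google Flu Trends model."""
    return np.log(p / (1.0 - p))

def fit_flu_model(query_fraction, ili_rate):
    """Fit logit(ILI) = beta0 + beta1 * logit(Q) by ordinary least squares.

    query_fraction: share of searches matching flu-related queries (0..1)
    ili_rate: ILI physician-visit rate reported by the CDC (0..1)
    """
    x = logit(np.asarray(query_fraction))
    y = logit(np.asarray(ili_rate))
    beta1, beta0 = np.polyfit(x, y, 1)  # polyfit returns (slope, intercept)
    return beta0, beta1

def predict_ili(beta0, beta1, query_fraction):
    """Invert the logit to recover a predicted ILI rate from query data."""
    z = beta0 + beta1 * logit(np.asarray(query_fraction))
    return 1.0 / (1.0 + np.exp(-z))

# Toy illustration: made-up weekly query fractions and ILI rates.
weeks_q = [0.002, 0.004, 0.009, 0.015, 0.011, 0.005]
weeks_ili = [0.010, 0.018, 0.035, 0.055, 0.042, 0.021]
b0, b1 = fit_flu_model(weeks_q, weeks_ili)
print(predict_ili(b0, b1, 0.012))  # predicted ILI rate for a new week
```

The appeal of this design, and the later point of attack, is that nothing in it encodes any epidemiological theory: the model stands or falls with the stability of the correlation between search behavior and illness.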
However, the system performed badly during the 2009 H1N1 influenza pandemic, mostly because the outbreak's non-seasonal timing and the heightened media awareness required a very different set of queries to detect what was happening, as Google researchers analyzed in 2011.
In March 2013, Nature counted Google Flu Trends among the victims of that season's flu. Google researchers subsequently published yet another analysis and modification of the algorithm at a conference on neglected tropical diseases.
In March 2014, another round of papers was released---here and here---by empirical social science researchers from Northeastern University and Harvard, pointing out the statistical problems of approaches such as Google Flu Trends and mostly attacking their exemplary status and their theory-reduced character.
The release of these papers echoed through the blogosphere and sparked more general discussions about the achievements and limitations of Big Data, whether in the Financial Times or in the New York Times (Bits Blog).