How Data Will Transform Science

2014 July 27

In the late 17th century a middle-aged draper named Antonie van Leeuwenhoek became interested in the magnifying glasses he used to inspect fabric.  He started experimenting with making his own and ended up creating one of the world’s first microscopes.

His work caught the attention of the Royal Society in London, which encouraged him to continue his research. Eventually he got around to examining a drop of water under his new device, which led to his discovery of an entirely new realm of microscopic organisms.

Technology and science have always been inextricably linked.  Watson and Crick drew on Rosalind Franklin’s x-ray diffraction images to work out the structure of DNA.  Modern particle physics would not be possible without particle accelerators.  Strangely, though, new data techniques have so far had little effect on how science is done.  That’s about to change.

The Costs Of Data Failure

In January 2010, two highly respected economists at Harvard, Carmen Reinhart and Kenneth Rogoff, published a working paper that warned of grave consequences for economies whose sovereign debt rose above 90% of GDP.  The conclusion quickly became the centerpiece of a fierce political debate about public debt in the US.

Yet all was not as it seemed.  When other economists analyzed the same data, they saw no such correlation.  In fact, they found that the two esteemed economists had made a simple coding error in Excel.  A scandal ensued.  Was it an honest mistake, or a cynical political gambit designed to further the careers of two academics?
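The kind of spreadsheet mistake at issue is easy to reproduce.  As a purely hypothetical illustration (the numbers below are invented, not Reinhart and Rogoff’s data), a formula range that silently drops the last few rows can even flip the sign of an average:

```python
# Invented growth rates (percent) for five high-debt countries --
# illustration only, not actual Reinhart-Rogoff data.
growth = [-0.1, 0.2, -0.3, 2.9, 3.1]

# Intended calculation: average over all five rows.
correct_mean = sum(growth) / len(growth)

# The Excel-style slip: a range that omits the final two rows.
truncated = growth[:-2]
buggy_mean = sum(truncated) / len(truncated)

print(correct_mean > 0)  # full average is positive
print(buggy_mean < 0)    # truncated average turns negative
```

With these made-up figures, the complete data show modest positive growth while the truncated range suggests contraction — the same conclusion-reversing effect the replication debate turned on.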

In actuality, the problem was not so much with Reinhart and Rogoff, or even with divisive politics, but with science itself.  The journal Nature reported that even in cancer research, where life and death truly hang in the balance, the overwhelming majority of studies can’t be replicated, and The Economist asserts that such errors are nearly unavoidable.

Clearly something has to change.  While the rest of the world has become a highly connected place, driven by new approaches to data and technology, science remains, to an alarming degree, stuck in the last century.

The New Scientific Revolutions

Timo Hannay, Managing Director of Digital Science, says that “science has been a cottage industry, but needs to move to an industrial scale where we’re integrating more skills such as gathering, analyzing and interpreting data.”  He argues that three fundamental transformations need to take place.

First, he points to mass collaboration, which is still in its infancy.  Sometimes this involves amateurs, as in the case of Galaxy Zoo, which enlists the public’s help to classify distant galaxies.  Often though, as in the case of the NIH’s genome and protein databases, it is masses of professionals who are contributing to a common project.

The second area of transformation is publishing.  As I noted above, scientific publishing is in a state of crisis.  Peer reviewed journals take too long to publish and even when they do, the results are questionable, at best.  All too often, it’s hard to discern junk science from the real thing.

The third is the rise of artificial intelligence as a viable way of doing science.  Humans are, in fact, very poor data processors and there are a variety of machine learning techniques that can tremendously aid researchers in doing their jobs.  The mathematician Samuel Arbesman even foresees computers making discoveries scientists don’t understand.
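One simple example of the kind of pattern-finding machines do well is clustering: separating measurements into natural groups without being told where the boundary lies.  The sketch below is a toy one-dimensional k-means on invented readings — an illustration of the general technique, not any particular research tool:

```python
# Toy one-dimensional k-means: split numeric readings into two
# groups by repeatedly assigning each value to its nearest centroid.
def kmeans_1d(values, iters=20):
    """Partition a list of numbers into two clusters."""
    lo, hi = min(values), max(values)  # seed centroids at the extremes
    for _ in range(iters):
        a = [v for v in values if abs(v - lo) <= abs(v - hi)]
        b = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo = sum(a) / len(a)  # recompute centroid of the low group
        hi = sum(b) / len(b)  # recompute centroid of the high group
    return sorted(a), sorted(b)

# Invented "measurements" with two obvious regimes.
readings = [1.1, 0.9, 1.3, 5.2, 4.8, 5.0, 1.0, 5.1]
low, high = kmeans_1d(readings)
print(low)   # the cluster of small readings
print(high)  # the cluster of large readings
```

A researcher staring at thousands of such readings might miss the split; the algorithm finds it mechanically, which is the modest sense in which machines out-process humans on data.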

These are all exciting developments, but as Hannay points out, many in the scientific community are not only slow to adopt them, but don’t even regard them as “real” science. He says of the new techniques, “we need to recognize and reward these contributions.”

The Rise of Cognitive Collaboration

Much like Leeuwenhoek, Ignaz Semmelweis made a transformative discovery.  Through studying childbed fever, he realized that hand washing could significantly reduce the mortality rate in hospitals.  Unfortunately, the Viennese medical establishment, which believed that disease spread through bad air, balked, and Semmelweis became an outcast.

Eventually, students of the doctor published reports of his findings in other countries and the idea gained traction elsewhere, but not in his home country.  Over the years, Semmelweis became increasingly distraught, was committed to a mental institution and died, ironically, of an infection he contracted there.

It is a tragic story.  But thankfully, one that is considerably less likely these days.  We increasingly live in an age of cognitive collaboration, in which humans use machines to more effectively work with other humans.  Today, Semmelweis could publish his findings online, share his data and would not be so constrained by localized sentiment.

We are, as many have noted, in the midst of a new industrial revolution in which we are beginning to use machines not just to augment our physical capabilities, but our mental ones as well.  Most, if not all, aspects of human experience are being rapidly transformed. Business life has already been significantly altered, but science still needs to catch up.

The New Industrialization of Science

Over the past century, industry and science have formed a productive partnership.  From the first corporate lab at GE to countless others like Bell Labs, PARC and IBM Research, modern management techniques have helped to propel the process of discovery forward, creating major breakthroughs and multi-billion dollar industries.

Yet, when it comes to management innovations, science often lags behind and that is very much the situation today.  As Dr. Hannay puts it, “most scientists have better technology at home than they do in the lab.”  He believes that many of the practices that help businesses compete can also bring science more fully into the digital age.

For example, Labguru takes project management techniques common in industry and applies them to scientific research.  Figshare allows researchers to publish not only their formal conclusions, but also their data sets, negative results and informal findings.

Bernie Meyerson, the Chief Innovation Officer at IBM, also believes that there is a revolution taking place.  Rather than having to invest in massive labs to pursue a new area, he says that now, “you can stand up a world class laboratory much faster and you can also shut it down and move on when it doesn’t work out.  You can just be more agile.”

He believes that digital technology will reduce the time to outcome for research to pay off by “several orders of magnitude,” massively improving scientific productivity.  It is as if we are back in the days of Leeuwenhoek, staring into a microscope for the first time and just beginning to understand the possibilities.

– Greg
