Cameron Neylon
ISIS, Rutherford Appleton Laboratory
Oxford, United Kingdom

Speaker at Workshop 3

Will talk about: Data Deluge: Huge opportunity or damp squib?

Bio sketch:

Cameron Neylon is a biophysicist who has always worked in interdisciplinary areas and is an advocate of open research practice and improved data management. He currently works as Senior Scientist in Biomolecular Sciences at the ISIS Neutron Scattering facility at the Science and Technology Facilities Council (STFC). Alongside his work in structural biology and biophysics, his research and writing focus on the interface of web technology with science and the successful (and unsuccessful) application of generic and specially designed tools in the academic research environment. He is a co-author of the Panton Principles for Open Data in Science, founding Editor-in-Chief of Open Research Computation, and writes regularly on the social, technical, and policy issues of open research at his blog, Science in the Open.

Talk abstract:

The web is just the latest example of a network that has qualitatively changed what human society is capable of with a limited set of resources. Before the web, networks of mobile phones, and before that fixed telephony, made qualitatively different forms of social interaction possible. Twenty years ago an impromptu meet-up between local friends and someone visiting for a day would have been nearly impossible. Today it is trivial.

We are only just beginning to see what network-enabled research might make possible. Tim Gowers, one of the world's great mathematicians, described the experience of the PolyMath project, compared with his normal approach to mathematics, as like driving is to pushing a car. Such examples can be multiplied, but they remain isolated cases. The question must be how we can best exploit the capacity of networks across our research effort. The path remains obscure at best, but an emerging understanding of how networks function can help to guide the way. The key aspects of an effective network are threefold:

• The larger and more connected the better: Networks thrive on connectivity. The larger the network and the more connected it is, the greater the opportunity for critical information to reach the right person.
• The lower the friction the better: Transfer of non-rivalrous resources at speed and with low friction is the most important capacity of a network. Artificially introducing friction, or failing to act to reduce it, effectively breaks connections within the network, reducing its capacity.
• High information flow requires effective demand-side filtering: Filtering at source creates friction, so high information flow instead requires flexible, configurable filters that modulate resource flow on the user (demand) side (see the sketch after this list).
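
The third point lends itself to a small illustration. Below is a minimal sketch in Python of what demand-side filtering might look like: publishers push an unfiltered stream, and each consumer composes their own predicates over it. The `Item` and `DemandSideFilter` names, fields, and scoring are hypothetical, invented for this sketch rather than drawn from any existing tool.

```
from dataclasses import dataclass, field
from typing import Callable, Iterable, Iterator

@dataclass
class Item:
    """A unit of shared research content flowing through the network (hypothetical)."""
    source: str
    tags: set[str]
    score: float  # e.g. an assumed relevance or trust estimate

@dataclass
class DemandSideFilter:
    """A user-configured filter applied at the point of consumption.

    Nothing is filtered at source (no friction there); each consumer
    composes predicates to modulate what actually reaches them.
    """
    predicates: list[Callable[[Item], bool]] = field(default_factory=list)

    def require(self, predicate: Callable[[Item], bool]) -> "DemandSideFilter":
        # Add one more condition to this consumer's personal filter.
        self.predicates.append(predicate)
        return self

    def apply(self, stream: Iterable[Item]) -> Iterator[Item]:
        # Pass through only items satisfying every predicate.
        return (item for item in stream if all(p(item) for p in self.predicates))

# One researcher's configuration; another user would compose entirely
# different predicates over the very same unfiltered stream.
my_filter = (
    DemandSideFilter()
    .require(lambda i: "neutron-scattering" in i.tags)
    .require(lambda i: i.score > 0.7)
)

stream = [
    Item("arxiv", {"neutron-scattering", "biophysics"}, 0.9),
    Item("blog", {"open-data"}, 0.8),
]
for item in my_filter.apply(stream):
    print(item.source)  # prints "arxiv" only
```

The design point the sketch makes is that filtering at the point of consumption leaves the shared stream intact: adding or removing a predicate changes only one consumer's view, not the network's connectivity or capacity.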

In an ideal world we would exploit the near-zero cost of dissemination to enlarge the scale and connectivity of our research network by making content free. We would actively reduce the friction of sharing research resources by focusing business models on the generation of "web-ready" content, charging the first-copy costs up front and competing on the basis of the service offering. In this world there are many services that do not yet exist but look quite similar to those provided by many traditional players. The question is how to get there from here.