In our series “Data! Data! Data! – Cures for a General Counsel’s ESI Nightmares,” we have been interviewing thought leaders in the ESI management and e-discovery universe who can help us navigate “the perfect storm”: ever-increasing data volumes, more litigation and government inquiries, and skyrocketing e-discovery costs. That series will continue for several more weeks with interviews that include Deborah Baron of Autonomy, Brandon Daniels of CPA Global, the EDRM godfather George Socha, and many more.
But everywhere you look, the quantity of information in the world is soaring. For instance, during 2009, American drone aircraft flying over Iraq and Afghanistan sent back around 24 years’ worth of video footage. New models being deployed this year will produce ten times as many data streams as their predecessors, and those in 2011 will produce 30 times as many.
According to one estimate, mankind created 150 exabytes (billion gigabytes) of data in 2005. This year, it will create 1,200 exabytes. Merely keeping up with this flood, and storing the bits that might be useful, is difficult enough. Analysing it, to spot patterns and extract useful information, is harder still. Even so, the data deluge is already starting to transform business, government, science and everyday life.
This week’s Economist has published a special report that takes a close look at this data deluge, covering issues such as how information has become superabundant, how it is changing business, how internet companies profit from online data, and new ways of visualizing data.
To access the Economist special report, click here.