Big Data - will it make any sense?
Tuesday, 28th June 2011
Tim Buckley Owen
Small wonder that LexisNexis has chosen to badge its HPCC Systems platform as a Big Data solution. To judge from the sudden surge of interest, Big Data is set to become the next Big Thing – and just recently big names like IBM, McKinsey and Gartner have all offered their own takes on its future development.
For McKinsey, analysing large datasets will become a key basis of competition so long as the right policies and enablers are in place. Big Data analysis can help segment populations to tailor services to customer needs, improve human decision-making and innovate new business models, products and services.
But there are caveats – issues around privacy and security, for example, alongside the difficulty of integrating data from incompatible legacy systems. Above all, though, there’s a big potential skill shortage – of 140,000 to 190,000 analysts in the United States alone, plus 1.5 million managers with the skills to understand the implications of the data once it’s been crunched.
Gartner adds further warnings. The very term "Big Data" puts too much emphasis on the volume of information that can be handled, and it's heavily weighted towards current issues, which could lead to short-sighted decisions that hamper its future development.
Like McKinsey, Gartner warns that today's information management disciplines and technologies are simply not up to the task of handling all the dynamics that deployment of Big Data engenders. "The real issue is making sense of big data and finding patterns in it that help organisations make better business decisions," it concludes.
But perhaps the most far-reaching implication is that it's not just for the global enterprise. According to an IBM survey of chief information officers at mid-market companies, extracting actionable insights from Big Data is a top priority for them too.
They're looking to mine data for analysis from videos, blogs and tweets as well as from more structured sources. And the growth in the use of mobile technology means that almost three-quarters plan to invest in mobility solutions.
So LexisNexis's decision to roll out its own Big Data solution seems timely. It has been providing large-volume data analysis capabilities to its own customers for over 10 years, using its High Performance Computing Cluster (HPCC) technology; now it's offering HPCC as an open source solution to the wider community.
Significantly, LexisNexis has no intention of releasing any of its data sources, data products or data linking technology. “These assets will remain proprietary,” it uncompromisingly states.
From info pros’ point of view, that could be a good thing. As these studies indicate, it’s not just what you can do with the data, it’s knowing what’s worth adding to the mix.
About this item:
By Tim Buckley Owen
Tim is an information skills trainer and writer on the information industry with over 40 years' experience in the profession. His career has encompassed information management, writing, editing, training, government policy advice and corporate media & marketing.
Besides writing for FreePint, Tim runs courses for training providers and private clients on enquiry handling, abstracting & summarising, information packaging & presentation and information management. The sixth edition of his classic handbook Successful Enquiry Answering Every Time is published by Facet Publishing. You can find details of Tim's training services at www.buckleyowen.com.
Tim can also be reached at firstname.lastname@example.org