
8 big data predictions for 2015

By Bernard Marr

For this post, I’ll be gazing into my crystal ball to predict the moves that big data will make in the coming year. As with all predictions, these should be taken with caution, since some may not come true. And real game-changing innovation often comes out of left field and takes even the most vigilant of seers by surprise. So if something earth-shattering happens in the coming year that shakes our conception of what can be done with data to the core, and I missed it here, blame the crystal ball.
The value of the big data economy will reach $125 billion
This is the revenue taken in by vendors selling the software, hardware and services that allow other businesses to implement big data strategies. The figure comes from research by the market research firm IDC (International Data Corporation).
The Internet of Things will go mainstream
A plethora of wearable and data-enabled devices is on the market now. Some are great; some are clearly faddish and lacking in practical use. 2015 could be the year they break out of the gadget-geek and early-adopter markets as people’s need to be connected at all times continues to grow. Expect to see (and perhaps be crashed into by) your first person wearing smart glasses in the street pretty soon.
Machines will get better at making decisions
At the moment, big data generally acts as guidance for decisions that are mostly still made by people. Expect this to change in the near future, as advances in machine learning bring us closer to the point where data-gorged machines are capable of making more accurate and reliable decisions than people (I know, it’s scary!).
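To make the idea concrete, here is a deliberately tiny sketch of a machine “deciding” from past data. It is not any real system: the 1-nearest-neighbour rule, the feature values and the decision labels are all invented for illustration.

```python
# Toy illustration only: a 1-nearest-neighbour "decision maker".
# All data points and labels below are invented for the example.

def nearest_neighbour_decision(history, new_point):
    """Return the decision attached to the past point closest to new_point."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    closest = min(history, key=lambda record: distance(record[0], new_point))
    return closest[1]

# Past observations: (feature vector, decision that was taken)
history = [
    ((0.9, 0.1), "approve"),
    ((0.2, 0.8), "reject"),
    ((0.8, 0.3), "approve"),
]

print(nearest_neighbour_decision(history, (0.85, 0.2)))  # approve
```

Real machine-learning systems are vastly more sophisticated, but the principle is the same: the more relevant historical data the machine has, the better its decisions can become.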
Textual analysis will become more widely used
Increasingly, much of the data we store for analysis is unstructured. Textual analysis has become increasingly sophisticated in the last few years, and that trend will continue. Computers will become more proficient at “reading” a piece of text (or voice converted to text) and spotting its themes and sentiments – meaning unstructured text can be classified and analyzed in much the same way as structured data.
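The simplest form of sentiment spotting can be sketched in a few lines. This is a crude keyword-counting illustration, not how production textual-analysis tools work (they use statistical and machine-learning models); the word lists here are invented for the example.

```python
# A deliberately simple sketch of keyword-based sentiment spotting.
# Real textual-analysis systems use far richer models; these word
# lists are invented for illustration.

POSITIVE = {"great", "love", "excellent", "happy"}
NEGATIVE = {"poor", "hate", "terrible", "unhappy"}

def sentiment(text):
    """Classify text as positive, negative or neutral by keyword counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("the service was excellent and I love the product"))  # positive
```

Once text is reduced to labels like these, it can sit in a database column next to structured fields and be queried and aggregated in the usual ways.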
Data visualization tools will dominate the market
Specialized software designed to create visualizations from data, making it easier for us to spot patterns and links between cause and effect, will become increasingly sophisticated and widely used. This market is expected to grow 2.5 times more quickly than that for other business intelligence software products.
There will be a big scare over privacy
Big security breaches, like those suffered by users of Apple, Sony and Snapchat services in recent years, haven’t scared the general public enough to stop them sharing details of their private lives on social media and other web services. In fact, more people than ever seem to believe that giving corporations our personal information is a small price to pay for the convenience and utility offered by new technology. But we could be headed for a “perfect storm”: hackers have shown they can compromise even the most secure systems, while governments and law enforcement agencies have been slow to clear the hurdles that keep the perpetrators of many breaches from being brought to justice. A devastating hack or information leak might be enough to start changing people’s attitudes and restoring a bit more common sense over how we take responsibility for our own personal data.
Companies and organizations will struggle to find data talent
There are expected to be 4.4 million people employed worldwide in positions directly involved with big data analysis by next year. But this won’t be enough. By next year, 70% of US businesses will either have a data strategy in place or will be planning one for the near future, according to one survey. The number of colleges offering courses related to big data analysis continues to grow rapidly, but there will continue to be a shortage of workers trained in the necessary skills for the foreseeable future.
Big data will provide the key to the mysteries of the universe
The Large Hadron Collider is currently undergoing an upgrade and will resume operation early next year. It currently collects around 30 Pb of information each year from the high-speed proton collisions which take place 600 million times each second in its machinery. This information is analyzed over a network spanning 170 computing facilities in 36 countries making it by far the largest scientific big data experiment ever undertaken. It has already succeeded in identifying a particle which matches the theoretical Higgs boson – a discovery which many have taken to mean we are heading in the right direction in our attempt to understand how the universe works and came to be. When it spins up again, twice as powerful as before, who knows what else it will find?
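To put that yearly data volume in perspective, a back-of-the-envelope calculation (my own, assuming the 30-petabyte figure and naively spreading it evenly across a calendar year, which real detector run schedules don’t do) gives the average data rate:

```python
# Back-of-the-envelope arithmetic on the yearly figure quoted above.
# Assumes data is spread evenly over the year, which is a simplification.
PETABYTE = 10 ** 15  # bytes (decimal convention)
SECONDS_PER_YEAR = 365 * 24 * 3600

yearly_bytes = 30 * PETABYTE
average_rate = yearly_bytes / SECONDS_PER_YEAR  # bytes per second

print(f"{average_rate / 10**9:.2f} GB/s average")  # roughly 0.95 GB/s
```

In other words, on average the experiment produces close to a gigabyte of analyzable data every second, around the clock – which is why the analysis has to be spread across that 170-facility network.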
This article is published in collaboration with Smart Data Collective. Publication does not imply endorsement of views by the World Economic Forum.
Author: Bernard Marr is a globally recognized expert in strategy, performance management, analytics, KPIs and big data.
Image: Internet LAN cables are pictured in this photo illustration taken in Sydney June 23, 2011. Australia cleared a key hurdle on Thursday in setting up a $38 billion high-speed broadband system after phone operator Telstra agreed to rent out its network for the nation’s biggest infrastructure project in decades. REUTERS/Tim Wimborne.
