I have talked a lot about Big Data on this blog. It is now becoming normal and accepted in the enterprise, largely because of two factors:
1. The Internet of Things (IoT) means that every electronic device is becoming connected. Even light bulbs can now be assigned an IP address so you can connect them to a home control system. All these connected items generate vast amounts of data…
2. Consumers’ behaviour and their relationship to brands have been entirely reversed in the past five years: where brands once offered a single way to get in touch, consumers now define exactly how they want to review or criticise products. Brands need to seek out comment and engage wherever their customers are.
There are many more factors, but I believe that these two broad changes are responsible for creating enormous amounts of data – amounts that would have seemed unfeasible a decade ago.
The industry analysts support this view. Ovum recently published research predicting that the Big Data market will grow by 50% each year until 2019. Compounded annually, this means that by 2019 the market for Big Data software and expertise will be roughly six times bigger than it is today.
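As a back-of-the-envelope check, 50% annual growth compounds quickly. A quick sketch (the baseline year and the number of years are my assumptions, not figures from Ovum's report):

```python
# Back-of-the-envelope compounding check; the year counts below are
# illustrative assumptions, not taken from Ovum's report.
def compound_growth(rate: float, years: int) -> float:
    """Return the total growth multiple after `years` of `rate` annual growth."""
    return (1 + rate) ** years

# 50% annual growth over four years (e.g. 2015 -> 2019):
print(round(compound_growth(0.50, 4), 2))  # -> 5.06
# Over five years the multiple rises past seven:
print(round(compound_growth(0.50, 5), 2))  # -> 7.59
```

Depending on whether you count four or five years of growth, the multiple lands somewhere between five and seven and a half, so "six times" sits comfortably in that range.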
Six times. That’s a lot of market growth. The Ovum Big Data Practice Leader and co-author of the report, Tom Pringle, said: “The experimental era of big data is coming to an end, organizations are formalizing their use of big data technology to realize the business value they expect to find.”
The important point to note here is that Ovum is suggesting that the time for experimenting with Big Data is over. Many companies have tried it, toyed with open source software and systems, and experimented with the insights to be gained from Big Data analysis; the value of those insights is now proven, and companies are formalising their use of them.