My views may not be the most exciting news for tech consumers, but they touch upon one of the most pertinent issues for large companies today – namely, managing the volume of big data that already exists and planning for future growth. A recent infographic released by Big Data solutions provider Aureus Analytics says that there is a “40% growth in data per year” and predicts that data “will grow 50 times by 2020.” It goes on to say that 80% of data within a company is unstructured. A recent talk by MarketShare CEO Wes Nichols at the DLD conference in Munich highlighted the extent of this unstructured data growth.
While this is great news for big tech companies and start-ups alike – it means that users are sharing information with the apps and software designed for this purpose – it is quite daunting for IT managers tackling the issues surrounding the management and analysis of this huge influx of data.
Saving all that data – internal vs. external storage
Companies in the data storage and search fields will continue to thrive, and new solutions for analytics and infrastructure will gain more ground. After all, what use is all of that data if you don’t know what it contains or how it can help your business?
There will be significant advances in duplicate and old data removal technologies. The amount of data that is being added to company servers each year is unsustainable – only the biggest companies could possibly build infrastructure fast enough to keep up with the task. SMBs will be left trying to optimize existing infrastructures as efficiently and cost-effectively as possible, which means spending more time on removing old data and reducing data duplication.
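To make the deduplication idea concrete, here is a minimal sketch of content-hash duplicate detection in Python. The directory path and chunk size are illustrative assumptions, and production deduplication tools typically work at the block or object level rather than on whole files, so treat this as a conceptual outline rather than a ready-made solution.

```python
import hashlib
import os
from collections import defaultdict

def find_duplicate_files(root_dir):
    """Group files under root_dir by content hash; any group with more
    than one path is a set of byte-identical duplicates."""
    by_hash = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in filenames:
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                # Hash in 1 MB chunks so large files don't need to fit in memory.
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    digest.update(chunk)
            by_hash[digest.hexdigest()].append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    # "/data/archive" is a placeholder path for this sketch.
    for h, paths in find_duplicate_files("/data/archive").items():
        print(h[:12], paths)
```

A report like this can feed a retention policy: duplicates beyond the first copy, or files untouched for a set number of years, become candidates for archiving or deletion.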
Search as a solution
Building stronger search systems to trawl through this flood of data could act as a temporary solution for making better use of this new information. In order to use the 80% of company data that is unstructured, companies will have to find a way to structure, search, and analyze it, and learn to use the results of those analyses appropriately.
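As a rough illustration of what “structuring and searching” unstructured text involves, the sketch below builds a toy inverted index in Python. The sample documents and the naive tokenizer are assumptions for illustration; real search engines add proper tokenization, ranking, and distribution across servers.

```python
import re
from collections import defaultdict

def tokenize(text):
    """Very naive tokenizer: lowercase alphanumeric words only."""
    return re.findall(r"[a-z0-9]+", text.lower())

def build_index(docs):
    """Map each term to the set of document ids that contain it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in tokenize(text):
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every query term (AND search)."""
    terms = tokenize(query)
    if not terms:
        return set()
    results = set(index.get(terms[0], set()))
    for term in terms[1:]:
        results &= index.get(term, set())
    return results

# Hypothetical sample documents standing in for unstructured company text.
docs = {
    1: "Quarterly report: unstructured data grew 40% year over year.",
    2: "Customer emails and support tickets remain unstructured.",
    3: "Structured sales figures for Q4.",
}
index = build_index(docs)
print(search(index, "unstructured data"))   # {1}
```

The point of the toy example is the structure, not the scale: once text is reduced to an index like this, questions such as “which documents mention X” become cheap to answer, which is exactly the kind of leverage companies need over their unstructured 80%.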
Cloud computing – here to save the day?
Another big trend in tech today is cloud computing, generally understood as “the storing of company data in off-premise servers.” It sounds like a great idea in theory – you can send all of your heavy data to a storage farm and free up resources in your own on-premises infrastructure. The problem with this general definition is that it glosses over the many different options out there. Is it still cloud storage when only the applications live in the cloud, but all of the data remains on the company’s own servers?
SMBs seem to be a perfect fit for cloud computing because they can increase storage capacity at a fraction of the cost of traditional infrastructure. What is still unknown is how many Fortune 500 companies are going to move their data to the cloud. Intuitively, the vast majority would rather keep their data on their own servers, safely under their own supervision. This means that for the world’s biggest companies, pure cloud computing is a non-starter. We may see hybrid-style clouds in which basic, non-sensitive information is stored off-site, but adoption doesn’t seem to be expanding much beyond that.
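To illustrate the kind of hybrid split described above, here is a hypothetical sketch of a routing rule that keeps sensitive records on-premises and sends everything else to cloud storage. The sensitivity labels and the two storage targets are assumptions made for this sketch, not a reference to any particular provider’s API.

```python
from dataclasses import dataclass

@dataclass
class Record:
    record_id: str
    payload: bytes
    sensitivity: str  # e.g. "public", "internal", "confidential"

# Assumed policy for this sketch: anything non-public stays on company servers.
ON_PREM_ONLY = {"internal", "confidential"}

def storage_target(record: Record) -> str:
    """Decide where a record should live under a simple hybrid-cloud policy:
    sensitive data stays on the company's own servers, the rest can be
    offloaded to cheaper cloud storage."""
    if record.sensitivity in ON_PREM_ONLY:
        return "on-prem"
    return "cloud"

print(storage_target(Record("inv-001", b"...", "confidential")))  # on-prem
print(storage_target(Record("press-01", b"...", "public")))       # cloud
```

Even in this simplified form, the hard part is visible: the value of a hybrid setup depends entirely on how reliably data gets classified in the first place.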
Enterprise endgame
The speed of adoption is very different between companies and consumers. Companies tend to take a longer view on solutions; planning and implementing software upgrades or new products can take years. A consumer, on the other hand, isn’t bogged down by huge rollouts and, to a certain extent, is not as concerned with security issues, which helps them move more quickly with new trends. Companies need to realize that while a long-term view is good for the stability of their business, it might not be the best approach when choosing business solutions. This is especially true given the amount of data that already exists and will continue to grow well into the future.