“Super size me”: The Big Data effect

Big Data makes everything bigger: security threats, planning potential, economies of scale, information to analyse. You get the super pack, but you also have to super-invest in smart technologies and people. And then answer one specific question: what does Big Data mean to you? WAI had a look around to figure out how “Big” innovation can get in this new environment.

As a major technological trend for 2014, Big Data inspires service providers and infrastructure owners as much as it frightens them. Because it’s “Big”. According to “Le Cercle – Les Echos”, security will be among the main concerns for IT managers seeking to secure the autonomy and protection of the company. Although the concept of “Cloud” is now commonly used (and misused), the data-analytics side of it remains an issue, partly because the way information is currently structured doesn’t fit the “anytime anywhere” promise of the cloud, or not fast enough to deliver the user experience expected. And you can add as much bandwidth as you wish: if the data is too big (and presumably “Big Data” is not small), it will overflow your capacity. Hence the initiative of the Indian company CloudByte, presented in an article from lemagit: its software technology, called Elastistor, lets companies flexibly manage data storage in a hybrid format mixing blocks and files, sharing ‘bits’ of information more effectively across servers, virtual machines and applications. And they do have a point: being flexible makes Big Data a bit easier to swallow.
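To make the block-sharing idea concrete, here is a minimal sketch (not CloudByte’s actual technology, whose internals the sources do not describe): files are chunked into fixed-size blocks addressed by their content hash, so identical blocks written by different files or virtual machines are stored only once.

```python
import hashlib

class BlockStore:
    """Toy content-addressed store: files are split into fixed-size
    blocks, and identical blocks are kept only once, so several files
    (or VM images) share the same underlying 'bits'."""

    BLOCK_SIZE = 4  # tiny for demonstration; real systems use 4 KB or more

    def __init__(self):
        self.blocks = {}  # block hash -> block bytes (deduplicated)
        self.files = {}   # file name  -> ordered list of block hashes

    def put_file(self, name, data):
        hashes = []
        for i in range(0, len(data), self.BLOCK_SIZE):
            block = data[i:i + self.BLOCK_SIZE]
            h = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(h, block)  # store each unique block once
            hashes.append(h)
        self.files[name] = hashes

    def get_file(self, name):
        # Reassemble the file from its block list
        return b"".join(self.blocks[h] for h in self.files[name])

store = BlockStore()
store.put_file("vm1.img", b"ABCDABCDXYZ")
store.put_file("vm2.img", b"ABCDABCD")  # reuses vm1.img's blocks
assert store.get_file("vm1.img") == b"ABCDABCDXYZ"
print(len(store.blocks))  # prints 2: only two unique blocks stored
```

The design choice this illustrates is exactly the “flexible” one above: because storage is addressed at block level but exposed at file level, the same capacity serves many servers and applications at once.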

Of course investment is part of the story, and there are many ways to spend money; the holiday season is the perfect time to make wishlists and break open the piggy bank. Start with acquisitions. As lemondeinformatique highlights, many major companies have acquired security and data-analytics start-ups in recent months, showing great interest in the people and technology behind Big Data. Monsanto spent USD 930 million to acquire Climate Corporation, leveraging its expertise in weather forecasting and Big Data analytics to help farmers plan crops ahead (source: The Verge). On top of technology, there is a clear investment in knowledge. As John Michelsen says, “Big Data requires a new profile of data analysts (…) Experts estimate that the 30 most promising jobs in the next 10 years will be given to profiles mixing science, technology, engineering and mathematics (STEM)”. Engineering schools got the message: Ensimag and Emsi Grenoble will be delivering a Big Data-specific course as of next school year (source: lemondeinformatique).

Another group got the same message: the dark side, the malware writers, the hackers, the big threat to big security. Hence the emergence of new ways to think about security, so it can cover all aspects of data analysis in its wide “cloud” dimension. In this area, Algorithms, Visualization, Context, Automation, or AVCA, as put forward by Jon Oltsik on Networkworld, is one of the innovation patterns security professionals can use to make sure their resources do not get overwhelmed by security data analysis and solution development. The value lies in a never-ending “self-learning” approach which will evaluate, calculate, correct and defend systems against external attacks through automated analysis of past, current and new threats. The American vendor Blue Coat has launched Advanced Threat Protection, which aims to combine the analysis of known and unknown threats, looking at all threat types regardless of technology (source: silicon.fr). This global approach highlights the need for companies willing to embrace Big Data and security to think in broader terms, beyond acronyms and known solutions, and to constantly learn from change and failures.
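The evaluate-correct-defend loop can be sketched in a few lines. This is an illustrative toy in the spirit of the AVCA pattern, not any vendor’s product: all names, scores and the learning rate below are assumptions for demonstration.

```python
# Learned threat scores in [0, 1]; unknown events start at a neutral 0.5.
known_threats = {"worm.exe": 0.9, "trojan.dll": 0.8}

def score_event(name, context_weight=1.0):
    """Evaluate: score an event from past knowledge, weighted by context."""
    base = known_threats.get(name, 0.5)  # unknown threat -> neutral prior
    return min(1.0, base * context_weight)

def learn(name, was_malicious, rate=0.3):
    """Correct: nudge the stored score toward the observed outcome."""
    target = 1.0 if was_malicious else 0.0
    old = known_threats.get(name, 0.5)
    known_threats[name] = old + rate * (target - old)

# Automation: loop over past, current and new events, scoring each one
# and feeding the outcome back in so tomorrow's defence learns from today.
events = [("worm.exe", True), ("report.pdf", False), ("report.pdf", False)]
for name, outcome in events:
    if score_event(name) > 0.7:
        pass  # defend: block or quarantine would happen here
    learn(name, outcome)

# "report.pdf" drifts below 0.5 after two benign sightings;
# "worm.exe" rises above 0.9 after being confirmed malicious.
```

The point of the pattern is the feedback edge: every analysed event, known or unknown, updates the model that scores the next one, so analysts’ resources are not consumed re-triaging the same threats.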

All of these changes will not happen without a goal. If Big Data makes everything bigger, there is one last but not least link in the chain that needs to grow as well. What will companies do with the Big Customer? Now that they safely store and use information on end-users, service providers have to think, on top of privacy and protection issues, about the value this information may have in the eyes of their customers. With the recent NSA scandals over privacy that vividly shook the physical world, that value may well be negative. A big effort is needed to bring more sensitivity to Big Data deployment and to reassure end-users. As InternetActu reminds us, it is all about trust and building relationships with customers: on top of explaining what the data is used for, there may be a point in asking them what they think we should do with it. Focusing on customer needs is something that worked well in the old “physical machines” world. As the article points out, moving to the cloud should not virtualise trust and conversation with the customer; it should actually give more time and physical space for it. A number of initiatives have already been launched to figure out what customers would do with all the data we keep on their activity and usage. Although they struggle at this early stage, there is a clear intent to bring physical trust and relations into the picture and give a concrete sense to Big Data and virtualization.

IDC experts estimate that the “Big Data technology and services market will grow at a 27% compound annual growth rate through 2017 to reach $32.4 billion” worldwide. In order to maximise the direct value of Big Data technology, companies need to invest in flexible tools that adapt to the hybrid physical/virtual servers and software they already use, and make sure they have the right competencies and “self-learning” attitude to bring out the gold nugget: a safe, trusted relationship with customers who understand and positively value the use of Big Data. And eventually give it a bigger sense.
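As a back-of-the-envelope check of that projection: assuming the 27% compound rate applies over the four years from 2013 (the base year is our assumption, not stated in the quote), the implied starting market size works out as follows.

```python
cagr = 0.27        # 27% compound annual growth rate, per the IDC quote
market_2017 = 32.4  # projected market in $bn
years = 4           # assumed span: 2013 -> 2017

# Compound growth: market_2017 = base * (1 + cagr) ** years
implied_2013 = market_2017 / (1 + cagr) ** years
print(f"Implied 2013 market: ${implied_2013:.1f}bn")  # roughly $12.5bn
```

In other words, the forecast amounts to the market a little more than doubling and then some: a factor of about 2.6 in four years.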
