Big data: From 'shiny objects' to fundamental value
The biggest change in enterprise computing in decades is underway, with big data to the fore in the year ahead, according to MapR boss John Schroeder.
Schroeder, chief executive and co-founder of the converged data platform provider, says big data had a profound impact on 2015, and will increasingly do so in the future.
He says how data is stored, analysed and processed is transforming businesses and he sees an acceleration in big data deployments.
“An organisation’s competitive stance now relies on the ability to leverage data to drive business results,” Schroeder says.
He is predicting five key trends for 2016: converged approaches becoming mainstream; a move from centralised to distributed; storage – particularly flash – becoming ‘an extremely abundant resource’; an ‘increased focus on fundamental value’; and a ‘flight to quality’.
Converged approaches become mainstream
For the last few decades, accepted best practice has been to keep operational and analytic systems separate, in order to prevent analytic workloads from disrupting operational processing. In 2016, converged approaches will become mainstream as leading companies reap the benefits of combining production workloads with analytics to adjust quickly to changing customer preferences, competitive pressures and business conditions.
Schroeder says this convergence speeds the ‘data to action’ cycle for organisations and removes the time lag between analytics and business impact.
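The convergence idea can be illustrated with a minimal sketch: a single write path serves the operational record and keeps an analytic aggregate current at the same time, so there is no separate batch-export step between transaction and insight. The class and field names here are invented for illustration, not taken from any MapR product.

```python
from collections import defaultdict

class ConvergedStore:
    """Hypothetical sketch: the same write path stores the operational
    record *and* updates an analytic aggregate, removing the time lag
    a separate analytics system would introduce."""

    def __init__(self):
        self.orders = []                              # operational records
        self.revenue_by_product = defaultdict(float)  # live analytic view

    def record_order(self, product, amount):
        # Operational write
        self.orders.append({"product": product, "amount": amount})
        # Analytic aggregate updated in the same path, not via nightly ETL
        self.revenue_by_product[product] += amount

store = ConvergedStore()
store.record_order("widget", 10.0)
store.record_order("widget", 5.0)
# The analytic view reflects both writes immediately.
```

In a separated architecture the `revenue_by_product` view would only refresh after a batch export; here it is current the moment the order lands, which is the ‘data to action’ speed-up Schroeder describes.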
The pendulum swings from centralised to distributed
Schroeder says big data solutions initially focused on centralised data lakes that reduced data duplication, simplified management and supported a variety of applications, including customer 360 analysis.
“However, in 2016, large organisations will increasingly move to distributed processing for big data to address the challenges of managing multiple devices, multiple data centres, multiple global use cases and changing overseas data security rules (safe harbor),” Schroeder says.
He says the continued growth of internet of things (IoT), cheap IoT sensors, fast networks, and edge processing will further dictate the deployment of distributed processing frameworks.
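One reason edge processing pushes deployments toward distributed frameworks is bandwidth: raw sensor streams are summarised near the device, and only compact aggregates cross the network to a central system. The sketch below is a hypothetical illustration of that pattern; the class name and summary fields are invented, not from any specific IoT framework.

```python
from statistics import mean

class EdgeAggregator:
    """Hypothetical sketch: buffer raw sensor readings at the edge and
    forward only compact summaries upstream, reducing the volume of
    data that must travel to a central data centre."""

    def __init__(self, window_size):
        self.window_size = window_size  # raw readings per summary
        self.buffer = []
        self.summaries = []             # stands in for "publish upstream"

    def ingest(self, reading):
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        # One small summary replaces a whole window of raw readings
        self.summaries.append({
            "count": len(self.buffer),
            "min": min(self.buffer),
            "max": max(self.buffer),
            "mean": mean(self.buffer),
        })
        self.buffer = []

agg = EdgeAggregator(window_size=4)
for reading in [21.0, 21.5, 22.0, 21.5]:
    agg.ingest(reading)
# Four raw readings collapse into a single summary record.
```

Scaled across thousands of cheap sensors, this kind of local aggregation is what makes a distributed topology cheaper than shipping every raw reading to one central lake.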
Storage, particularly flash, becomes an extremely abundant resource
Next-generation, software-based storage technology is enabling multi-temperature (fast and dense) solutions. Flash memory is a key technology that will enable new product designs in the consumer, computer and enterprise markets.
Schroeder says consumer demand for flash will continue to drive down its cost, and flash will increasingly appear in big data deployments. The optimal solution will combine flash and disk to support both fast and dense configurations. In 2016, this new generation of software-based storage that enables multi-temperature solutions will proliferate, so organisations will not have to choose between fast and dense – they will be able to get both.
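A multi-temperature solution typically means a placement policy: recently accessed (‘hot’) data lives on the fast flash tier, while cold data is demoted to dense disk. The sketch below is a deliberately simplified, hypothetical policy with invented tier names and thresholds; real software-based storage systems make these decisions at the block or file level with far more sophistication.

```python
class TieredStore:
    """Hypothetical sketch of a multi-temperature placement policy:
    objects accessed within `hot_ttl` seconds stay on the fast (flash)
    tier; anything colder is demoted to the dense (disk) tier."""

    def __init__(self, hot_ttl_seconds):
        self.hot_ttl = hot_ttl_seconds
        self.last_access = {}  # key -> timestamp of last read/write
        self.flash = {}        # fast tier
        self.disk = {}         # dense tier

    def put(self, key, value, now):
        self.last_access[key] = now
        self.flash[key] = value            # new writes land on flash

    def get(self, key, now):
        self.last_access[key] = now
        if key in self.disk:               # promote cold data on access
            self.flash[key] = self.disk.pop(key)
        return self.flash[key]

    def demote_cold(self, now):
        # Move anything not touched within hot_ttl down to disk
        for key in list(self.flash):
            if now - self.last_access[key] > self.hot_ttl:
                self.disk[key] = self.flash.pop(key)

store = TieredStore(hot_ttl_seconds=60)
store.put("a", "frequently read", now=0)
store.put("b", "rarely read", now=0)
store.get("a", now=100)    # refreshes "a", keeping it hot
store.demote_cold(now=100)  # "b" drops to the dense tier
```

The point of the combined design is exactly what Schroeder describes: the working set gets flash speed while the bulk of the data sits on dense, cheap disk, so fast and dense are no longer a trade-off.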
‘Shiny object syndrome’ gives way to increased focus on fundamental value
“In 2016, the market will focus much less on the latest and greatest ‘shiny object’ software downloads, and more on proven technologies that provide fundamental business value,” Schroeder says.
Companies will increasingly recognise the attraction of software that results in business impact, rather than focusing on raw big data technologies.
Markets experience a flight to quality
In terms of big data technology companies, Schroeder says investors and organisations will turn away from ‘volatile’ companies that have frequently pivoted in their business models.
“Instead, they will turn to focus on more secure options – those companies that have both a proven business model and technology innovations that enable improved business outcomes and operational efficiencies,” he says.