Our voracious appetite for digital data continues to present serious challenges for storage, backup, and the timely management and retrieval of critical information assets.
Data volumes continue to double year on year, taking us from gigabytes to terabytes and petabytes and placing considerable pressure on businesses and on older storage technology.
While small businesses may be relieved at the lower cost of disk space, those savings haven’t been passed on to high-end storage systems as quickly, presenting both a budgetary and a data management dilemma.
Hitachi Data Systems (HDS) chief technology officer Adrian De Luca says the data deluge has happened ‘so quickly and so vigorously’ that it has been difficult for many IT organisations to have any real and meaningful strategy around managing that growth.
He joins industry analysts in warning that the default position of buying more cheap storage, rather than investing in the tools to manage data growth, is a recipe for data centre chaos.
That presents a challenge for resellers and systems integrators: getting a better grip on the big-picture storage needs of customers who want to manage growth and migrate to newer storage technologies while leveraging existing investments.
Avoiding point products
Gartner’s research vice president of data centre infrastructure, Phil Sargeant, says that rather than simply adding more terabytes of storage capacity, resellers need to give thought to backup and recovery, to where it might be advisable to use tape, and to where the client company will need to migrate in the future.
"Smart resellers look at solutions rather than point products and should take up the offer of training sessions with vendors around storage, archiving, and back-up and recovery at every opportunity they can,” says Sargeant.
Mergers and acquisitions have reshaped the storage market in recent years with smaller vendors gobbled up for their innovative technologies, often enabling enterprise players to rescale for the small to medium business sector.
The big names in the enterprise, however, remain intact: IBM, HP, Hitachi Data Systems, EMC and NetApp.
Sargeant says the New Zealand market for intelligent enterprise disk systems was $51 million in 2010; up significantly from $34 million in 2009, which he expects to see matched this year. At least 65% is sold indirectly through resellers or partners.
"There are a lot more people storing a lot more on disk rather than tape, including a lot more internet, video, graphics. Increasingly CCTV content is also being stored.”
NAS and SAN co-existing
There are two dominant types of enterprise storage: network attached storage (NAS), which is predominantly file based, and storage area networks (SANs), which store data in blocks.
In the past SANs were associated with high performance requirements such as heavy databases while NAS was more suited to unstructured data such as audio, video and other files with less demanding delivery requirements. Many organisations now have both.
Banks, airlines, telecommunications carriers and major retail chains typically back up data internally as well as having a secondary data centre off site, where it is stored and backed up again.
Most other enterprises outsource this process to third party data centres, ISPs or integrators, although the cost of remote data centre duplication is often beyond the budgets of many small to medium sized businesses.
Where possible, Sargeant says companies need to be looking at newer storage approaches with lower cost disk capacity and data reduction technologies that are more energy efficient and cost less to run.
That might mean prioritising ‘hot’ or critical data and storing or archiving ‘cold’ or secondary data on less expensive disks or even tape storage.
"A lot of people archive data on cheaper high capacity Serial ATA (SATA) disks and while these 2-3Tb single high capacity drives might not perform at same speed, they are economical and secure,” says Sargeant.
Increasingly the storage challenge is about optimising data so it takes up less space and moves more quickly over networks. Compression has long been a standard space saver; now deduplication can eliminate redundant copies of data, storing each unique block only once and recording just the changes between versions.
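Block-level deduplication works by hashing fixed-size chunks of data and keeping each unique chunk only once; a stream full of repeats then costs little more than one copy plus a list of references. A toy illustration in Python (SHA-256 hashing over 4KB blocks is an assumption for the sketch, not any particular vendor's implementation):

```python
import hashlib

def dedupe_blocks(data, block_size=4096):
    """Split data into fixed-size blocks and store each unique block once.

    Returns (store, recipe): store maps a SHA-256 digest to the block's
    bytes; recipe is the ordered list of digests needed to rebuild data.
    """
    store, recipe = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # a duplicate block is stored only once
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe):
    """Reassemble the original data from the block store and the recipe."""
    return b"".join(store[d] for d in recipe)
```

Production systems refine this with variable-size chunking and collision handling, but the space saving comes from the same idea: identical blocks share one stored copy.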
Some systems three to four years old will not support deduplication, although add-on appliances may buy time. Meanwhile, the reality is that companies that have fallen too far behind the curve may be in dire need of a technology refresh.
Disk durability doubted
A number of larger businesses have migrated entirely to magnetic disk, although Sargeant suggests there are still serious concerns about disk durability. "You can’t really afford to do long-term archival of 5–50 years on disk; the technology changes rapidly, disks fail, and they do not have the longevity of tape."
What’s more, he says those who don’t stay on the continual media-migration path risk not being able to read archived disk or tape data. "The Linear Tape-Open (LTO) format, for example, is now up to version six, and many people on earlier versions will become stranded in five years."
HDS’ De Luca says a range of pressing challenges are forcing organisations to take another look at more scalable storage and protection technology with built-in deduplication.
He recommends storage services such as virtualisation, replication and deduplication be introduced as close as possible to the data, even embedded in core devices, as adding different appliances into the middle of a SAN can add complexity and create bottlenecks.
De Luca says the overwhelming complexity of a lot of IT infrastructure has also contributed to the difficulty of finding where information assets are located. What’s needed, he says, is clever searching and tool sets that can index content across all different storage and back-up repositories including tape.
Cloud cover increasing
While there’s a lot of talk of the cloud changing the economics of storage, there’s been little deployment to date. "Companies might archive off to the cloud or back up or do a third copy to the cloud but not for mission critical data,” says Sargeant.
For the moment it’s still seen as an avenue for hosting low-performance data, but it’s certainly a space to watch, as it will eventually provide direct competition to current reseller models.
As the data deluge continues, customers will want to deal with vendors and resellers who can not only provide storage capacity and management systems but also outsource, manage and host storage in private, hybrid (a mix of both) and, increasingly, public clouds.
"If the companies they’re doing business with begin using the cloud, resellers will have to change their business model to provide storage technology to the cloud providers. You need to be asking who your future market, and begin targeting that,” says Sargeant.
Symantec vice president of product management Brian Dye says the cloud will greatly change the way services are delivered with more enterprises leveraging public and private clouds as they become highly available. "Enterprises will require the ability to manage storage resources whether they’re local, campus-wide, multi-campus, global or in the cloud.”
Dye says a hybrid cloud archiving model allows enterprises to use hosted messaging services while keeping their archives on-premises. "This will help drive cost out of the discovery process, maintain strict access to data, and define who is searching it and where they are sending requests.”
Virtual storage options
In the early days of server virtualisation, storage requirements went through the roof, forcing new approaches to virtual storage that are now appearing in a number of leading-edge solutions.
Faster network infrastructure, including a growing migration to fibre optics, is opening the way for wider deployment of Fibre Channel and high-speed Ethernet-based storage solutions. "The advent of 10Gb networking puts a different complexion on the performance of NAS and IP-based storage and the way this is connected," says Sargeant.
If a data centre becomes too difficult to manage there’s often a case for consolidation, bringing different storage from multiple vendors under one umbrella. That, says HDS’ De Luca, is the ideal opportunity for businesses to move into storage virtualisation.
"They need to determine whether the procurement decisions made over the past couple of years meet current business needs and provide the level of agility being asked for in today’s organisations.”
De Luca says virtual technologies can help repurpose assets: by placing a layer of storage services over existing systems, you can introduce ‘thin provisioning’ and data replication into a multi-vendor environment.
This reduces duplication and makes it simpler to do a data refresh. "Gone are the days when you could take down your data centre for a weekend to migrate information from one platform to another. Virtual technology enables you to do that quickly and without any downtime.”
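Thin provisioning, which De Luca mentions, means advertising a large logical volume while allocating physical capacity only when a block is actually written. A simplified sketch of the idea (the class, block size and behaviour are illustrative assumptions, not a real storage array's API):

```python
class ThinVolume:
    """A thin-provisioned volume: a large logical size is advertised, but
    physical capacity is consumed only when a logical block is first written."""

    def __init__(self, logical_blocks, block_size=4096):
        self.logical_blocks = logical_blocks
        self.block_size = block_size
        self.allocated = {}  # logical block number -> bytes actually stored

    def write(self, block_no, data):
        if not 0 <= block_no < self.logical_blocks:
            raise IndexError("write beyond advertised logical size")
        self.allocated[block_no] = data[:self.block_size]

    def read(self, block_no):
        # Unwritten blocks read back as zeros, as on a typical thin LUN.
        return self.allocated.get(block_no, b"\x00" * self.block_size)

    @property
    def physical_bytes(self):
        """Physical capacity actually consumed, regardless of logical size."""
        return len(self.allocated) * self.block_size
```

The gap between the advertised logical size and `physical_bytes` is exactly the over-commitment that lets a multi-vendor pool defer capacity purchases, at the cost of having to monitor when the physical pool really fills up.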
Virtualisation, however, requires careful management. Symantec warns that a non-standard, fragmented approach to creating a virtual infrastructure can expose gaps in security and backup procedures.
In the Symantec Storage Trends for 2011-2012 report, Symantec’s Dye warns up to 60% of data stored in virtual environments may not be recoverable in a disaster because enterprises have failed to implement data protection technologies.
He says enterprises need to re-evaluate their information management strategies and disaster recovery technologies to ensure mission-critical data is protected from everyday business risks and disasters.
Storage is no longer a static business or an afterthought. It is an essential, often complex and expensive part of efficient business systems, and without the right tools and processes, organisations of all kinds risk losing control of their data assets under the growing weight of information overload.