
Why bother managing storage?

01 Aug 11

Storage on a per-gigabyte basis is relatively inexpensive, and almost all storage systems, including entry-level systems, make it relatively simple to expand capacity. It seems easier and less time consuming to just keep adding disk drives to existing storage systems than to go through the work of actually managing the data on those systems. Within the constraints of the IT realities of time versus tasks to accomplish, administrators may be left with the attitude of ‘why bother managing storage at all? Just keep buying capacity’.
The problem is that every time capacity is added to the storage infrastructure it has a negative impact on almost every other aspect of the data centre. Simply put, even if storage were free, adding capacity also adds cost and without effective storage management the situation can quickly get out of hand.
The problems with just adding capacity
The choice to add storage comes down to one of two options. The first and least expensive is to add storage to an existing system. The good news about this method is that no additional points of management have been added.
The problem, however, is that for most systems adding storage makes performance suffer, since the storage controller has more hard drive spindles to support. In other words, the storage system’s best performance day is its first, and from that point forward every upgrade steals a little performance from the system.
The second choice is to add another storage system, either because the current systems can’t support additional capacity, or because they can’t support that capacity and continue to deliver the performance those applications and users have come to count on. In this case, a new storage system means another point of management, and if a different vendor is selected for this system, a new interface and terminology have to be learned as well. This adds to the time required to manage the storage environment. Every time a new server is needed, a decision has to be made about which storage system it should be attached to. This means the systems have to be manually inspected to see which has the available capacity and performance to handle the new server’s demands.
Both of these additions in capacity impact the backup process, as more data needs to be protected and new volumes need to be tracked down and added to the data protection scheme. Without proper storage management, the backup administrator may have no idea that new data stores are coming online.
The biggest problem with both of these methods is that they consume the data centre’s most finite resources: budget, space and power. These are resources data centres simply cannot afford to waste and using them is often not required if a good storage management policy is put in place.
In reality most storage systems have plenty of capacity available if they are managed effectively. If additional capacity or even an additional storage system is needed, this should be predicted so that it can be properly budgeted and planned. The ability to use capacity effectively, and add it efficiently, can lead to significant reductions in hard costs, space requirements and power consumption, while improving storage administrators’ efficiency. And, knowing that all storage is being utilised efficiently, knowing when the next storage upgrade will occur, and knowing that data is appropriately protected could even lower stress levels in the environment. So in some ways, effective storage management could increase job satisfaction and personal productivity.
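Predicting the next capacity upgrade, as described above, usually amounts to extrapolating recent growth against the system’s usable capacity. The sketch below shows the idea with a simple linear fit between two utilisation samples; the sample dates and figures are hypothetical, and a real storage management tool would of course fit against many more data points.

```python
from datetime import date

def days_until_full(samples, total_gb):
    """Estimate days until a volume fills, from (date, used_gb) samples.

    Uses the linear growth rate between the first and last sample;
    returns None if usage is flat or shrinking.
    """
    (d0, u0), (d1, u1) = samples[0], samples[-1]
    elapsed_days = (d1 - d0).days
    growth_per_day = (u1 - u0) / elapsed_days  # GB per day
    if growth_per_day <= 0:
        return None
    return (total_gb - u1) / growth_per_day

# Hypothetical readings for a 10 TB volume, two months apart
samples = [(date(2011, 5, 1), 6200), (date(2011, 7, 1), 7400)]
remaining = days_until_full(samples, total_gb=10000)
print(f"Projected full in ~{remaining:.0f} days")
```

Even a rough projection like this turns the upgrade from a surprise into a line item that can be budgeted and planned.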
Overcoming storage management resistance
If a well-managed storage environment brings so many advantages, why do so many administrators resist it? The primary cause of resistance is the tools, or lack of them, that administrators feel obliged to use.
Most of the time a collection of free utilities that came with the storage system and the operating system, plus spreadsheets, are combined to create a storage management ‘solution’. The problem is that these techniques require manual inspection and then manual updating of a spreadsheet. The storage administrator is supposed to update them when he or she has free time, an even more elusive resource than space, power and cooling. The modern storage environment, with shared storage area networks (SANs) and network attached storage (NAS) plus the addition of server virtualisation, is simply too dynamic to be managed on a manual, periodic basis. Too much can change in too short a time, and as a result, over-provisioning of storage seems like the easy way out – except for the problems outlined above.
The first step in implementing a storage management solution is to forget the built-in utilities that come with storage systems and operating systems. These tools may be fine in a much simpler environment. However, as a data centre grows and becomes more diverse, these built-in tools are quickly found lacking and become part of the reason that storage administrators frequently adopt a ‘just throw capacity at it’ mentality, one that storage vendors are not motivated to contradict. So every time capacity is added to a data centre it has a negative impact on manageability. When capacity does need to be added, one should know that it’s truly needed and exactly how it should be used.
The answer is to move to an affordable third party storage management software application that can manage storage across a variety of vendors’ hardware. Plus it should be able to manage storage from the view of the server (physical or virtual). This gives an end-to-end view of storage that the built-in tools simply cannot provide. With a tool like this in place, storage can be managed by the system, but also by the way in which applications or users are accessing that storage. The ability to balance the load, in both capacity and performance, becomes a reality.
The final requirement for a modern storage management tool should be real-time analytics. Management’s goal is to utilise all aspects of the storage environment more effectively, which means running the infrastructure close to its limits more often.
In this scenario, the storage administrator needs an ‘early warning system’ that will show when it’s finally time to expand the environment. This also means that the tool chosen can’t make the administrator wait for days while the environment is scanned. A real-time capability is required. One way to accomplish this is to use a storage profiler that places a real-time agent on each attached server and uses storage manufacturers’ API sets to communicate directly with those systems.
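The core of such an ‘early warning system’ is a threshold check over the utilisation figures the agents and APIs report. The sketch below assumes that figure-gathering has already happened; the system names, thresholds, and snapshot data are all hypothetical stand-ins, not any particular vendor’s API.

```python
# Warn when a system passes 80% of usable capacity, escalate at 90%.
# These thresholds are illustrative; each site would tune its own.
WARN_AT = 0.80
CRIT_AT = 0.90

def check_systems(usage_by_system):
    """Classify each storage system's utilisation as OK, WARNING or CRITICAL.

    usage_by_system maps a system name to a (used_gb, total_gb) pair,
    as collected by agents or vendor APIs.
    """
    alerts = {}
    for system, (used_gb, total_gb) in usage_by_system.items():
        ratio = used_gb / total_gb
        if ratio >= CRIT_AT:
            alerts[system] = "CRITICAL"
        elif ratio >= WARN_AT:
            alerts[system] = "WARNING"
        else:
            alerts[system] = "OK"
    return alerts

# Hypothetical snapshot of three arrays
snapshot = {
    "san-a": (7300, 10000),   # 73% utilised
    "san-b": (8600, 10000),   # 86% utilised
    "nas-1": (9500, 10000),   # 95% utilised
}
print(check_systems(snapshot))
```

Run continuously against live data rather than a periodic crawl, a check like this is what lets the administrator run the infrastructure close to its limits with confidence.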
With the right tool in place, the storage management process functions like a heads-up display that constantly monitors the storage environment and provides multiple views into it. In real-time implementations, data about the storage environment is instantly available and trouble spots can be addressed immediately, instead of waiting for the infrastructure to be crawled for information. Most administrators and IT managers think of storage management as a tiering exercise and see the next step as starting to delete or move old files. While the archiving of older data may be valuable in the future, a real-time tool can resolve immediate problems long before files need to be deleted or moved.
What’s in it for the reseller?
In some cases, a storage reseller may feel it is in their best interest simply to give in to the customer’s capacity demands without at least broaching the subject of better storage management. In reality, this is not as safe as it seems, since there is almost always competition for the business, and in some cases this approach might open the account up to a complete storage swap-out.
Often, the right thing to do for the customer – helping them gain control of their storage assets – is also the right thing to do for the storage reseller, because it can help move the reseller into the role of trusted advisor instead of just another vendor. 
Storage resellers will find that when they advise a customer on how to manage their storage better, there will still be plenty of opportunity for more business. However, instead of capacity add-ons, the new business may come in the form of technical services to diagnose the storage problems and design a new solution. Even when the customer does need additional storage, the reseller will have clear-cut answers to which, and how much, storage the customer will need. 
