Several years ago I was in a small lecture theatre delivering a presentation for a new data centre build. Of the 15 people present, about half sat on either side of the central aisle. I was intrigued enough to ask why there seemed to be a divide in the room and was told: “We are facilities, those guys are IT and we don’t talk to them.”
The Channel, September 2009: Be our guest

Since then I have seen this scenario time and time again, much like two countries building a bridge across the water to meet in the middle, only for the two halves not to quite line up for the cars to cross.
Historically, the facilities department designed the physical space and layout for a data centre, and provided the power and cooling while the IT department filled it with computer equipment but was often excluded from the planning stages. Working as two separate units resulted in data centres which were not physically suitable for the IT requirements and failed to meet the business needs.
A data centre is a physical structure, so it’s foolish of an IT department to think that it will be able to physically build a data centre without the facilities guys’ help. Likewise, it’s equally foolish for the facilities department to think it can deliver a data centre which meets the business and IT needs without working with the IT people.
For IT and facilities to work together successfully, both must understand the changes in each other's areas of expertise over the last decade and, more importantly, be aware of future trends. The following are some key changes that need to be taken into consideration:
Power and cooling: IT equipment has changed significantly. Smaller, faster servers require less space but generate a substantially higher heat output. Power densities of 5-30kW per rack quickly expose the inadequacies of historical data centres designed for an average of 2kW per rack. IT professional bodies and IT manufacturers are taking a leading role in understanding the difficulties, with bodies such as ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) playing quick catch-up.
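The scale of that gap is easy to see with some rough arithmetic. The sketch below uses the 2kW historical average and a mid-range modern density from the figures above; the ten-rack row is a hypothetical example, not from the article.

```python
# Illustrative only: compare a legacy cooling design to a modern heat load.
# The 2kW and 5-30kW per-rack figures come from the text; the row of ten
# racks and the 20kW mid-range density are assumptions for the example.

RACKS_PER_ROW = 10          # hypothetical row of ten racks
LEGACY_KW_PER_RACK = 2.0    # historical design average (from the text)
MODERN_KW_PER_RACK = 20.0   # mid-range modern density (text cites 5-30kW)

legacy_row_load = RACKS_PER_ROW * LEGACY_KW_PER_RACK   # what the room was built to cool
modern_row_load = RACKS_PER_ROW * MODERN_KW_PER_RACK   # what it is now asked to cool
shortfall = modern_row_load - legacy_row_load

print(f"Legacy cooling provision: {legacy_row_load:.0f} kW per row")
print(f"Modern heat load:         {modern_row_load:.0f} kW per row")
print(f"Shortfall:                {shortfall:.0f} kW "
      f"({modern_row_load / legacy_row_load:.0f}x the design load)")
```

A row that once dissipated 20kW now has to shed ten times that, which is why retrofitting cooling rarely works and the two departments need to plan the space together from the start.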
Green data centres: Traditional raised-floor layouts provide limited cooling. Advances in air conditioning, such as in-row cooling and hot-aisle containment systems, deliver the cooling required while being more energy-efficient and therefore more environmentally friendly.
Resilience: Keeping the data centre cool is critical. If the air conditioning fails in an office, people get hot and sweaty; if it fails in a computer room, the servers go down and the business falls over.
Management: This is traditionally done using a building management system in the facilities department. As the IT department now needs to know what’s going on with support infrastructure, these management systems must be bilingual and understandable by both the IT and facilities department staff.
Total cost of ownership: One rack of computing hardware can cost $500,000 and generate 20kW of heat. In a poorly designed facility, that same rack could cost a further $500,000 in electricity alone over 10 years. Whilst the cost of hardware is declining, the operating costs are increasing.
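A rough sanity check shows how that second $500,000 arises. The 20kW rack load comes from the figures above; the electricity tariff and the PUE (power usage effectiveness, the ratio of total facility power to IT power) values are illustrative assumptions, not from the article.

```python
# Rough ten-year electricity cost for one 20kW rack.
# Assumptions (not from the text): $0.10/kWh tariff; PUE of 1.5 for a
# well-designed facility versus 2.8 for a poorly designed one.

RACK_LOAD_KW = 20.0       # continuous IT load cited in the text
HOURS_PER_YEAR = 8760
YEARS = 10
TARIFF_PER_KWH = 0.10     # assumed flat tariff in $/kWh

def ten_year_electricity_cost(pue: float) -> float:
    """Total facility electricity cost: IT energy scaled up by PUE."""
    it_energy_kwh = RACK_LOAD_KW * HOURS_PER_YEAR * YEARS
    return it_energy_kwh * pue * TARIFF_PER_KWH

print(f"Good design (PUE 1.5): ${ten_year_electricity_cost(1.5):,.0f}")
print(f"Poor design (PUE 2.8): ${ten_year_electricity_cost(2.8):,.0f}")
```

Under these assumptions a poorly designed room costs roughly $490,000 in electricity over the decade, close to the purchase price of the hardware it houses, while a well-designed one costs nearly half that. The exact numbers depend on the tariff and PUE, but the conclusion does not.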
To avoid a data centre disaster, follow this basic three-step process with your customers:
- Define business and IT requirements. Get managers from the facilities, finance and IT departments together to define the key project objectives and create a cross-discipline project team to manage the process.
- Develop a concept design and establish project budgets for both capital and operational expenditure. Ensure that the concept design meets the objectives established in step one.
- Create a detailed design ensuring adequate budget is allocated to engineering design and project management components, then set a clear time line for the project and ensure that each part of the team understands the time constraints.