I, robot rules the data centre

If you want to pick a phrase to describe the data centre of the future, try “less is more.” Less space, less power consumption, less complexity. Fewer and smaller machines, fewer people, fewer sites. All driven in part by the fact there will continue to be less money, while at the same time the demands on IT will be the one area where words such as less and fewer definitely do not apply.

“The cost-performance ratio is improving all the time, but you’re expected to do more and more and more with it,” says Martin Walker, director of IT infrastructure at Pacific Blue Cross in Burnaby, B.C. “The demand for all things IT is increasing, but the money is not.”

“There is such a pressure on cost savings and continuous benchmarking and moving our costs lower continuously,” agrees Eva Maglis, senior vice-president of integrated technology management with Montréal-based computer services firm CGI Group Inc.

So data centres have to become more efficient. Avenues they’re taking to get there include:

• replacing numerous small servers with fewer, larger ones, helped along by virtualization;

• exploring utility computing to make the best use of computing power;

• increasing use of open-source technology for the cost savings they hope it will provide; and,

• relying more on interchangeable, reusable components in hardware and software.

“We see a much simplified data centre in the future,” predicts Al Zollar, general manager of sales for IBM’s eServer iSeries — thanks both to the physical simplification that comes from smaller, standardized packages such as blades, and to the logical simplification made possible by the ability to run multiple operating systems on the same hardware through virtualization.

For Maglis, the number-one trend is Linux. Software is a major part of the cost of mainframe systems, she says, and given heavy cost pressure more and more organizations are looking at the open-source operating system. “It’s already running on a lot of the mid-range,” says Maglis. “Where you don’t see it a lot right now is the big mainframe, but it can run on the mainframe.”

Four servers consolidated to one

Asked to picture what his organization’s data centre might look like five years from now, Walker says it will take up less space. Pacific Blue Cross’s two mainframes — down from four not long ago — will be consolidated to one. Smaller servers will continue to be consolidated, and peripheral devices will get smaller. Blades, which put server functionality onto thin cards that slide neatly into racks, will proliferate.

Across the country in Florenceville, N.B., Anil Rastogi says data-centre centralization has been the trend for a decade or so now and will continue. The chief information officer at McCain Foods Ltd. says his company’s data-centre operations are already heavily centralized.

But Maglis sees a continued balancing act between centralizing and decentralizing. The days of a single “glass house” are gone in most organizations, she says, yet CGI and others are trying to avoid spreading IT facilities too thinly.

“You’re trying to create a balance between how much you’re going to centralize and how much you’re going to decentralize.”

A tendency to centralize goes hand-in-hand with the push to reduce the number of small servers dedicated to specific tasks, replacing them with larger machines that do more. Virtualization helps that trend by allowing a single large server to be set up as if it were multiple smaller servers, each running a different application and in some cases even a completely different operating system. This has been a staple of mainframe computing for decades, but in recent years it has made its way into smaller machines, such as Windows servers.
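To make the idea concrete, here is a minimal sketch in Python of the partitioning principle behind that kind of consolidation: one large host carved into several guests, each with its own operating system and workload. The Host and Guest classes, their fields and the capacity figures are hypothetical illustrations, not any vendor’s product or API.

    # Toy model of carving one large server into virtual partitions.
    from dataclasses import dataclass, field

    @dataclass
    class Guest:
        name: str
        os: str          # guests need not share an operating system
        cpus: int
        memory_gb: int

    @dataclass
    class Host:
        cpus: int
        memory_gb: int
        guests: list = field(default_factory=list)

        def provision(self, guest: Guest) -> None:
            # Refuse a guest that would oversubscribe the physical box.
            if (sum(g.cpus for g in self.guests) + guest.cpus > self.cpus or
                    sum(g.memory_gb for g in self.guests) + guest.memory_gb > self.memory_gb):
                raise RuntimeError(f"host cannot fit {guest.name}")
            self.guests.append(guest)

    # One physical box standing in for four task-specific servers.
    host = Host(cpus=32, memory_gb=128)
    for g in (Guest("web", "Linux", 8, 16), Guest("db", "Linux", 8, 64),
              Guest("legacy-app", "Windows", 8, 32), Guest("test", "Linux", 8, 16)):
        host.provision(g)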

Virtualization goes hand-in-hand with utility computing to let data centres shift resources to where they are needed, and even acquire computing power as a commodity the way we’re used to acquiring electricity. Joe Hogan, worldwide vice-president of managed services marketing, strategy and alliances at Hewlett-Packard, predicts this will grow.

Current software licensing models are obstacles to utility computing, Hogan admits, but “I think eventually the marketplace will find a solution.” As for resistance to relying on outsiders to run mission-critical systems, Hogan suggests that Henry Ford might well have told Thomas Edison a century ago that he could not rely on another company to generate electricity for his plants because “that is so critical to my business.”
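The electricity analogy maps naturally onto metered billing. The toy Python meter below accumulates CPU-hours and invoices them at a flat unit rate; the rate and the billing model are illustrative assumptions, not any provider’s actual terms.

    # Toy compute meter: usage accumulates like kilowatt-hours on a
    # power meter and is billed at a flat, assumed unit rate.
    class ComputeMeter:
        def __init__(self, rate_per_cpu_hour: float):
            self.rate = rate_per_cpu_hour
            self.cpu_hours = 0.0

        def record(self, cpus: int, hours: float) -> None:
            self.cpu_hours += cpus * hours   # only what was actually drawn

        def invoice(self) -> float:
            return self.cpu_hours * self.rate

    meter = ComputeMeter(rate_per_cpu_hour=0.10)   # assumed rate
    meter.record(cpus=16, hours=8)    # daytime peak load
    meter.record(cpus=2, hours=16)    # overnight baseline
    print(f"charge: ${meter.invoice():.2f}")       # charge: $16.00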

For the Ontario government’s Smart Systems for Health Agency, set up early last year to provide data centre services to the province’s health-care community, virtualization means separate systems 150 kilometres apart look like a single system. “From their perspective (they are) adjacent in the same cabinet,” says Mike Monteith, Smart Systems for Health’s chief architect.

A virtual abstraction

Monteith says the agency’s facilities were designed from the ground up on the principle of abstraction and virtualization.

“It’s about virtualization for availability as well as capacity on demand,” he says. According to Monteith, it is so important to keep systems up and running in the health-care field that the traditional notion of disaster recovery is no longer good enough. A separate backup site distant from the primary data centre may not be able to take over fast enough to avoid huge costs and loss of life. “A couple of minutes becomes a big problem in health care,” says Linda Weaver, chief technology officer at Smart Systems for Health, “and not having access to your data becomes a big problem.”

“What we decided was to abandon the concept of disaster recovery and shoot for a principle of continuous availability,” says Monteith. That means physically separated data centres that look like a single one, complete with a fully synchronous, mirrored storage-area network designed to avoid data loss in an emergency, and a virtual operations centre.
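The key property of synchronous mirroring is that a write is not acknowledged until both sites hold it, so an acknowledged write can never be lost to a single-site failure. The sketch below illustrates that principle with in-memory dictionaries standing in for the two storage arrays; it is a conceptual sketch, not the agency’s actual implementation.

    # Minimal sketch of synchronous mirroring: the write to the remote
    # copy completes before the caller ever sees an acknowledgement.
    class MirroredStore:
        def __init__(self):
            self.primary = {}   # stand-in for the primary site's array
            self.remote = {}    # stand-in for the distant mirror

        def write(self, key: str, value: bytes) -> None:
            self.primary[key] = value
            self.remote[key] = value   # synchronous: both sites, then ack

        def read(self, key: str, primary_up: bool = True) -> bytes:
            # Either site can serve reads; the copies are always identical.
            return self.primary[key] if primary_up else self.remote[key]

    store = MirroredStore()
    store.write("patient:123", b"lab results")
    # Even with the primary down, no acknowledged data is missing.
    assert store.read("patient:123", primary_up=False) == b"lab results"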

Another fundamental principle in Smart Systems for Health’s design is componentization. Rather than duplicating the same function in different places, Monteith says, the goal has been to build it once and re-use it. “We’ve tried to box them in a way that promotes convergence of functionality and re-use across the data centre.” Software designers have been talking about this for at least 20 years, but as a new agency Smart Systems for Health had the opportunity to implement it more thoroughly than many data centres can.

“Some of the big leaps we were able to make are going to take other people a little longer,” Monteith says.
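Build-once, re-use is easiest to see in code. In the hypothetical sketch below, two unrelated services share a single audit-logging component instead of each carrying its own copy; the service names and the audit function are illustrative inventions, not Smart Systems for Health’s actual systems.

    # Toy build-once, re-use example: one shared component, two services.
    import time

    def audit(event: str, source: str) -> str:
        """The single, shared audit component every service calls."""
        return f"{time.strftime('%Y-%m-%dT%H:%M:%S')} [{source}] {event}"

    class LabResultsService:
        def store(self, record: str) -> str:
            return audit(f"stored {record}", source="lab-results")

    class PharmacyService:
        def dispense(self, rx: str) -> str:
            return audit(f"dispensed {rx}", source="pharmacy")

    print(LabResultsService().store("patient:123"))
    print(PharmacyService().dispense("rx:456"))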
