
Powering up the utility pump

All for one and one for all. From each according to his ability, to each according to his need. Imagine all the people. Utility computing reminds you of slogans and aphorisms such as those. It’s about all your computing resources working together as one, ready to step in wherever they are needed, rather than a cluster of underutilized servers running financial applications sitting idly by while peak demand brings your e-commerce site to its knees. It’s about extra computing resources coming online either automatically or at the click of a mouse when you need them. Sounds great, doesn’t it?

The only trouble is that like all such grand visions, it’s not always easy to see how we get there from here.

Many paths leading to utility nirvana

But take heart. The paths to utility computing – and there are several – are not well marked, but they exist. It may not be simple, but this vision is not as hard to attain as world peace.

What it has in common with many ideals is that everyone has a different idea of how to get there. But the goal is clear: flexibility.

“That’s the concept people are trying to get to, where I can treat all my resources as if they’re one big pool,” says Jasmine Noel, principal analyst with consulting firm Ptak Noel & Associates in Boston and a frequent commentator on utility computing. That in turn “allows you to ebb and flow with the business,” says Lynn Anderson, vice-president of enterprise marketing at Hewlett-Packard (Canada) Ltd., which touts utility computing as an enabler for its “adaptive enterprise” vision.

IBM, which prefers the catchphrase “on-demand computing,” likewise links the idea to business imperatives. “An on-demand business is an enterprise whose business processes are integrated end-to-end across the company and with key partners, suppliers and customers,” said Ed Kilroy, president of IBM Canada Ltd., in a recent speech in Toronto. Al Zollar, general manager of sales for IBM’s eServer iSeries, says utility computing is part, but not all, of his company’s on-demand vision.

Like turning on a light

Many draw an analogy to utilities such as electricity. We don’t buy power generators, we buy kilowatt hours. Ideally, we would not think of an assortment of servers running applications, but of a supply of computing power to be used as needed. Or at least that is “the purest sort of end game of utility computing,” says Gordon Haff, senior analyst at research firm Illuminata Inc. in Nashua, N.H.

This analogy leads naturally to outsourcing – since power utilities produce the power and ship it to us over wires, wouldn’t utility computing mean the same thing? Maybe. An outside service provider could be part of a utility computing strategy, Zollar says, but so could your own systems.

Noel outlines four paths to utility computing. A given organization might use one, more than one or conceivably all of them.

The first is a provisioning system that can move work among multiple servers as required. Pick servers, storage devices and so on from your data centre, decide what applications to put on them, and the system sets it up for you. “If my organization needs another Web server I just hit the button,” Noel says.
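
In rough pseudocode terms, the idea looks something like the sketch below. The ServerPool class and its provision method are invented for illustration only and do not correspond to any vendor’s actual interface; they simply show a pool of idle machines and a single “hit the button” call that claims one and gives it a role.

```python
# Hypothetical sketch of push-button provisioning: a pool of data-centre
# servers, each either idle or assigned a role, and one call that claims
# an idle machine and configures it as another Web server.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Server:
    name: str
    role: Optional[str] = None  # None means the server sits idle in the pool

@dataclass
class ServerPool:
    servers: list = field(default_factory=list)

    def provision(self, role: str) -> Server:
        """Claim the first idle server and assign it the requested role."""
        for server in self.servers:
            if server.role is None:
                server.role = role
                print(f"{server.name} provisioned as {role}")
                return server
        raise RuntimeError("no idle servers left in the pool")

pool = ServerPool([Server("blade-01"), Server("blade-02"), Server("blade-03")])
pool.provision("web-server")  # the "just hit the button" step
```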

HP’s Utility Data Center includes provisioning software, which lets IT staff assign resources to different jobs through a visual interface, and operations software, which monitors performance and alerts staff when an application needs more resources. Several large customers – among them Procter & Gamble Co. and LM Ericsson – have installed it, says Bill Dupley, infrastructure solutions manager at HP Canada.

Sun Microsystems Inc.’s N1 system distributes software to processors as required. Gordon Sissons, vice-president of products and technology at Sun Microsystems of Canada Ltd. in Markham, Ont., says N1 is popular with service providers, who may use it to offer on-demand services to customers.

Meanwhile, Veritas Software Corp. offers a tool for provisioning blade servers on demand. Veritas got the OpForce software in its acquisition of Jareva Technologies, a move Fred Dimson, general manager of Veritas Canada Software Inc., describes as a key step in its utility strategy.

Hardware with extra capacity – built in, but not turned on – lets that capacity come online at short notice. The ability to do quick upgrades without actually ordering more hardware is proving popular, Haff says. But while some vendors, including HP, Sun and IBM, offer the ability to turn capacity on and off and pay for it only while it is in use, Haff says customers have been slow to take up this option.
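
A minimal sketch of how that pay-as-you-go metering might work is shown below; the class, the hourly rate and the processor counts are all invented for illustration, not any vendor’s billing scheme.

```python
# Hypothetical capacity-on-demand metering: a box ships with more processors
# than the customer has paid for, and the spares can be switched on for short
# bursts, with usage metered only while they are active.
class OnDemandBox:
    def __init__(self, base_cpus: int, spare_cpus: int, rate_per_cpu_hour: float):
        self.base_cpus = base_cpus          # capacity already paid for
        self.spare_cpus = spare_cpus        # installed but switched off
        self.active_spares = 0
        self.rate = rate_per_cpu_hour
        self.metered_cpu_hours = 0.0

    def activate(self, n: int) -> None:
        """Turn on some of the pre-installed but dormant processors."""
        if n > self.spare_cpus - self.active_spares:
            raise ValueError("not enough inactive capacity installed")
        self.active_spares += n

    def run_for(self, hours: float) -> None:
        # Only the switched-on spares accrue usage charges.
        self.metered_cpu_hours += self.active_spares * hours

    def bill(self) -> float:
        return self.metered_cpu_hours * self.rate

box = OnDemandBox(base_cpus=8, spare_cpus=8, rate_per_cpu_hour=2.0)
box.activate(4)       # quick upgrade, no hardware ordered
box.run_for(72.0)     # three days of peak load
print(box.bill())     # 4 CPUs * 72 hours * $2.00 = $576.00
```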

Which approach works best?

Noel’s second path is virtualization. A virtualization layer sits on top of multiple systems, making them look like a single pool of computing power.

“Point and click and it looks like you’re adding a CPU to your software,” Noel says. However, she warns, “the solution is not fully cooked and ready to be implemented in a couple of weeks.”
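
The sketch below illustrates the concept only; the VirtualPool class is a made-up abstraction, not any vendor’s virtualization layer. The point is that several physical hosts are presented to applications as one pool of processors, so giving an application more power becomes a bookkeeping change rather than a hardware move.

```python
# Hypothetical virtualization layer: multiple hosts aggregated into one pool
# of CPUs, with allocations to applications adjusted by a single call.
class VirtualPool:
    def __init__(self, host_cpus: dict):
        self.total_cpus = sum(host_cpus.values())  # the "one big pool"
        self.allocations = {}                      # application -> CPUs granted

    def free_cpus(self) -> int:
        return self.total_cpus - sum(self.allocations.values())

    def add_cpu(self, app: str, count: int = 1) -> None:
        """The point-and-click step: the application sees extra CPUs appear."""
        if count > self.free_cpus():
            raise RuntimeError("pool exhausted")
        self.allocations[app] = self.allocations.get(app, 0) + count

pool = VirtualPool({"hostA": 8, "hostB": 8, "hostC": 16})
pool.add_cpu("order-entry", 4)  # to the application, it looks like CPUs were added
```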

Then there’s outsourcing. An example is the way Gridway Computing Corp., a Kanata, Ont., service provider, helps Ottawa-based Tundra Semiconductor Corp. handle peaks in its semiconductor design operation’s computing needs. At certain points in the design process “”they tend to never have enough software licences for the applications they’re running and they tend to never have enough computing power,”” says Chris Kramer, co-founder and chief technology officer at Gridway. It could cost Tundra more than $100,000 to add the capacity to handle those peaks, and it would sit idle much of the time. Instead, Tundra can rent processors by the week from Gridway for $150 per processor.
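
Using the article’s figures and some assumed numbers (the processor count and the number of peak weeks per year are guesses purely for illustration), the economics look roughly like this:

```python
# Rough cost comparison; the rental rate and in-house figure come from the
# article, while the processor count and peak weeks are assumptions.
RENTAL_RATE = 150          # dollars per processor per week (from the article)
IN_HOUSE_COST = 100_000    # rough cost to buy the peak capacity outright

processors_needed = 40     # assumed size of a design-stage peak
peak_weeks_per_year = 6    # assumed number of peak weeks in a year

annual_rental = RENTAL_RATE * processors_needed * peak_weeks_per_year
print(f"Renting: ${annual_rental:,} per year vs. ${IN_HOUSE_COST:,} up front")
# Renting: $36,000 per year vs. $100,000 up front (and the owned gear sits idle)
```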

Haff says this approach makes sense when you have short-term needs for capacity that would be expensive to keep in-house; capacity you will use frequently is generally more economical to own, unless it is very expensive.

Noel offers one final path to utility computing: Buy one big box, partition it to run different applications and different operating systems if necessary, and change partitions on the fly.
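
A toy sketch of that idea follows, with an invented PartitionedBox class standing in for a real partitionable server; the application names and CPU counts are illustrative only.

```python
# Hypothetical single-big-box partitioning: one large machine carved into
# partitions, each running its own application (and possibly its own OS),
# with CPUs shifted between partitions on the fly.
class PartitionedBox:
    def __init__(self, total_cpus: int):
        self.total_cpus = total_cpus
        self.partitions = {}  # partition name -> CPUs assigned

    def unassigned(self) -> int:
        return self.total_cpus - sum(self.partitions.values())

    def create(self, name: str, cpus: int) -> None:
        if cpus > self.unassigned():
            raise RuntimeError("not enough CPUs left in the box")
        self.partitions[name] = cpus

    def shift(self, donor: str, receiver: str, cpus: int) -> None:
        """Move CPUs between partitions on the fly, without re-racking anything."""
        if self.partitions[donor] < cpus:
            raise RuntimeError("donor partition too small")
        self.partitions[donor] -= cpus
        self.partitions[receiver] += cpus

box = PartitionedBox(total_cpus=32)
box.create("erp", 20)               # ERP on its own partition
box.create("ecommerce", 8)          # e-commerce site on another
box.shift("erp", "ecommerce", 6)    # peak season: shift power to the Web store
```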

Not all these approaches are workable in every situation. Provisioning won’t work well for a large enterprise resource planning (ERP) system, Noel says, because it will not run on a single server blade. If your existing hardware is more or less monolithic, partitioning may make most sense. Virtualization best suits large, technically sophisticated organizations, while outsourcing could be an option for almost any size of business if handled carefully, Noel says.
