Pay as you go

The concept of utility computing is still somewhat nebulous, but there is a general suggestion of saving money, and that’s usually enough to get IT managers seriously interested.
“There’s certainly a lot of interest driven from the financial end of it,” says Scott Collinson, director of business development for the managed services unit of Hewlett-Packard (Canada) Co., in Mississauga, Ont.
To some customers, utility computing is a way to eliminate a capital investment in equipment and “buy back computing by the glass or by the drink,” Collinson explains.
The term covers a range of options, brought together by the general idea of buying computing power as you need it.
The name utility computing comes from an analogy to utilities like electricity and water. When IBM first began talking about the idea, executives pointed out that in the early days of electricity, factories usually owned their own generators. Eventually electrical utilities became commonplace and most businesses, as well as homes, came to rely on electricity delivered over a wire from a remote, third-party generating plant.
Call it outsourcing, or call it, if you don’t mind dating yourself, the service bureau concept. It’s essentially the same — someone else runs the computers on your behalf.
For this to qualify as utility computing, though, an organization should be paying for only the computing power it needs, not simply paying an outside agency to run its computers.
Toronto’s University Health Network, a group of hospitals, has such a deal with HP Canada. David Stankiewicz, manager of technical services for the operator of three major hospitals, explains that UHN made no up-front investment in hardware or software. HP operates the systems on its own premises, and UHN pays according to usage — for instance, so much per e-mail account per month.

pay as you go

Ottawa-based GridWay Computing Corp. supplies extra computing capacity on its computing grid to customers who include a number of semiconductor design companies. Chris Kramer, GridWay’s founder, says these companies periodically need masses of extra power for verification runs, and acquiring in-house capacity that would sit idle the rest of the time wouldn’t be practical.
Nor do the servers need to be located off site or operated by someone else to deliver utility computing. Several computer vendors now offer customers the option of installing a server with as much capacity as they will need at peak periods, but turning on, using and paying for some of that power only when they need it.
IBM began by offering the ability to order a system with extra processors in reserve and turn them on as the customer’s needs grew. In 2003 it added the ability to switch processors on and off to meet demand fluctuations, says Barry Pow, product manager and advocate for IBM’s iSeries, one of the product lines offering the feature. The capability is best suited to customers whose demand often spikes.
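To make the pattern concrete, here is a minimal Python sketch of the threshold logic a capacity-on-demand arrangement implies. The class, thresholds and method names are invented for illustration; vendors’ actual activation mechanisms are proprietary and this is not IBM’s interface.

```python
# Hypothetical sketch of capacity on demand: reserve processors are
# activated when utilization spikes and released when demand falls back.
# Thresholds and names are illustrative assumptions.

ACTIVATE_ABOVE = 0.85   # switch on a reserve processor past 85% utilization
RELEASE_BELOW = 0.40    # switch one off again once load drops below 40%

class OnDemandServer:
    def __init__(self, base_cpus, reserve_cpus):
        self.base = base_cpus          # capacity the customer owns outright
        self.active = base_cpus
        self.reserve = reserve_cpus    # installed but dormant processors

    def rebalance(self, utilization):
        """Activate or release reserve processors as demand fluctuates."""
        if utilization > ACTIVATE_ABOVE and self.reserve > 0:
            self.reserve -= 1
            self.active += 1           # billed only while switched on
        elif utilization < RELEASE_BELOW and self.active > self.base:
            self.active -= 1
            self.reserve += 1
        return self.active

server = OnDemandServer(base_cpus=4, reserve_cpus=4)
for load in (0.30, 0.90, 0.95, 0.35):   # a demand spike arriving and passing
    print(f"{load:.0%} busy -> {server.rebalance(load)} processors active")
```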
David Senf, program manager for software research at research firm IDC Canada, says the bulk of utility computing today happens “behind the firewall” or within the enterprise rather than involving an outsourcer.
Utility computing — or on-demand computing, as some prefer to call it — can also refer to the ability to shift work around to the machines that currently have the capacity available.
New York-based DataSynapse Inc. sells software that lets applications run across heterogeneous grids of computers, using computer power wherever it’s available. Peter Lee, DataSynapse’s chief executive, offers a helpful definition of utility computing: the ability to monitor and meter use of computer power, possibly charging according to how much is used. An outsourcer can meter the cycles used by a customer, a system vendor can install a machine on customer premises and charge by how much of its power is used, or an IT department can meter the use of in-house capacity and charge end-user departments accordingly.
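Lee’s definition lends itself to a simple sketch. The Python fragment below is a hypothetical illustration of metered charge-back, with the UsageMeter class, the rate and the department names all invented for the example; it is not any vendor’s actual billing code.

```python
# A minimal sketch of metering: record how much compute each department
# (or customer) consumes and bill accordingly. Rate and names are
# illustrative assumptions.

from collections import defaultdict

RATE_PER_CPU_HOUR = 0.12   # hypothetical charge-back rate

class UsageMeter:
    def __init__(self):
        self.cpu_hours = defaultdict(float)

    def record(self, department, cpu_hours):
        """Meter a completed job against the department that ran it."""
        self.cpu_hours[department] += cpu_hours

    def invoice(self):
        """Charge each department only for the capacity it actually used."""
        return {dept: round(hours * RATE_PER_CPU_HOUR, 2)
                for dept, hours in self.cpu_hours.items()}

meter = UsageMeter()
meter.record("claims", 120.0)     # e.g. an overnight batch run
meter.record("actuarial", 35.5)
print(meter.invoice())            # {'claims': 14.4, 'actuarial': 4.26}
```

The same metering step works whether the party doing the charging is an outsourcer, a system vendor or an internal IT department.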
Whichever approach you take, “utility computing is about being able to get more out of the metal,” says Senf.
Getting more out of the processors may mean demanding more of the network. In the outsourced model, reliance on remote systems makes the link to those machines vital. A network is also critical when distributing work across multiple machines. And the ability to turn on additional processor power as needed may affect network bandwidth demands.
That could mean increased network costs, but that’s not necessarily bad news. “My telecommunications budget is going to go up, but my IT budget is going to go down because of centralization,” says Collinson.
If utility computing means delivering computing services remotely, the network certainly must be reliable, and bandwidth demands may increase. “Network performance can be an issue for remote locations,” says IDC’s Senf. “The latency can be problematic — however, there is software that can deal with those latencies.” Faster switching technology for wide-area networks is also helping reduce the headaches, Senf adds.
University Health Network is linked to Hewlett-Packard by a Synchronous Optical Network (SONET) ring with two core switches providing redundancy, Stankiewicz says. Operated by another contractor, the network is subject to service-level agreements and UHN does regular performance reviews and capacity planning to ensure the network keeps up with its needs.
“We collect a whole lot of data to tell us what’s going on with our network,” says Stankiewicz, adding that the hospital was very careful to select an established and reliable network operator.
GridWay has a partnership with Telecom Ottawa, an affiliate of the local electrical utility that operates a 10-gigabit-per-second (Gbps) all-fibre network. GridWay can set up virtual local-area networks for its customers on the fly, giving them 100 megabits per second or even 1 Gbps of capacity between its systems and their own, Kramer says.

time to look at your overall network architecture

Lee claims DataSynapse has given more thought than some of its rivals to the networking implications of its grid approach, and therefore can run transactional, data-intensive applications across multiple machines. DataSynapse’s strategy includes elements like data caching and transferring data directly from peer to peer rather than funneling everything through a central, controlling server, he says.
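A rough sketch of that data-locality idea might look like the following Python fragment; the Node class, dataset name and dispatch logic are illustrative assumptions, not DataSynapse’s actual software.

```python
# Hypothetical sketch: prefer a node that already caches a task's data,
# so datasets are not funnelled through the central scheduler.

class Node:
    def __init__(self, name):
        self.name = name
        self.cache = set()    # datasets already held on this node

def dispatch(dataset, nodes):
    """Pick a node for a task, favouring one that already has the data."""
    for node in nodes:
        if dataset in node.cache:
            return node        # no data needs to move at all
    # Cache miss: any node will do; it pulls the data from a peer that has
    # it rather than through the scheduler, then keeps a copy for later.
    target = nodes[0]
    target.cache.add(dataset)
    return target

blade1, blade2 = Node("blade-1"), Node("blade-2")
blade2.cache.add("risk-model-inputs")
print(dispatch("risk-model-inputs", [blade1, blade2]).name)   # blade-2
```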
When an organization starts moving to utility computing, Collinson advises, “it’s a really great time to look at your network and the overall architecture.” When clients contemplate the utility model, “we look at their network right away, and we usually have some very quick, high-level ideas for re-architecture,” Collinson says. “A lot of times, networks tend to grow and there’s a lot that’s sort of built on ad hoc, and we tend to look at simplifying a network.”

utility computing not limited to servers

IBM’s Pow says a spike in demand that leads to extra processors being activated may well lead to a spike in network traffic, and “there obviously would be a need … to say if we are getting that kind of demand for volume, what does that mean to our communications environment?”
At Shell Canada Ltd. in Calgary, though, turning on added capacity on an IBM i890 server doesn’t have a noticeable effect on the network. Peak utilization of Shell’s network rarely exceeds 20 per cent, says John Thiers, senior staff systems analyst.
Of course, Collinson notes, the utility concept isn’t necessarily limited just to servers — it can also be applied to the network itself, with a third-party provider offering bandwidth on demand. The growing viability of and interest in Internet Protocol (IP) telephony is helping promote that idea, he says. “I see a lot of businesses considering that — not quite the uptake yet.”
For that matter, the utility model for processing power is still nascent.
“Utility or on-demand computing is taking hold at a much slower pace than the hype would have indicated,” Senf says. He believes that’s largely because larger businesses, the market at which vendors have so far aimed most of their utility computing pitches, are accustomed to deploying applications on their own infrastructure.
“We’re at a time and place now where you can really move this along,” Collinson says. But he adds that many organizations have legacy applications that can get in the way of a utility model.
Senf suggests that vendors might find more fertile ground in small to mid-sized businesses, which often lack the expertise to run their own computing infrastructure. They are likely to be attracted to software as a service, an approach that frees them from much of the work.
If the major roadblock to utility computing is the shift in thinking it requires, then it is likely to gain ground in time. Certainly its advocates think so.
“We are in the early stages of a massive generational paradigm shift in how IT is being fashioned,” Lee says. Big words, but at least part of the statement is true: However significant utility computing is, we are in the early stages.
