ITBusiness.ca

Adoption of utility computing may be delayed

SAN FRANCISCO – Confusion over pricing models and a lack of industry-wide standards may delay the adoption of utility computing among some CIOs, vendors admitted at the recent Intel Developer Forum.

This year’s conference highlighted a number of technologies, including virtualization software and dual-core processors, which could help organizations create an IT infrastructure in which computing resources are activated only when needed. That form of resource allocation — which some vendors call utility computing and others call on-demand or organic — has been aggressively marketed over the past two years by vendors including IBM, HP and Veritas. But in a panel discussion, several manufacturers said they had yet to come up with a software licensing scheme that satisfied the bulk of their customers.

“It seems easy to create a data centre — you buy a lot of different bits and pieces, put it all in there and it’s done,” said Vadim Rosenberg, director of technical marketing at BEA. “Things get expensive when you start paying for a licence.”

The ideal, Rosenberg said, would be for software vendors to take a lesson from cellular carriers, which charge a flat fee for basic services plus per-use fees for specific additional services. “If we could offer that kind of pricing, that would be great,” he said.
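As a rough illustration of that pricing model, the Python sketch below computes a monthly bill from a flat base fee plus metered charges for optional services; the fee, rates and service names are invented for the example and are not drawn from any vendor’s actual price list.

```python
# Hypothetical utility-style pricing sketch: a flat base fee plus per-use
# charges for optional services, loosely modelled on the cellular-carrier
# billing described above. All rates and service names are invented.

BASE_FEE = 500.00                    # flat monthly fee for core services

PER_USE_RATES = {                    # per-unit charges for optional services
    "extra_cpu_hour": 0.12,
    "backup_gb": 0.05,
    "report_run": 1.50,
}


def monthly_bill(usage):
    """Return the total monthly charge for the given metered usage counts."""
    metered = sum(PER_USE_RATES[service] * quantity
                  for service, quantity in usage.items())
    return BASE_FEE + metered


if __name__ == "__main__":
    # Example month: 2,000 extra CPU-hours, 400 GB of backups, 30 report runs
    print(monthly_bill({"extra_cpu_hour": 2000, "backup_gb": 400, "report_run": 30}))
```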

Not all organizations see it that way, however. Nick van der Zweep, director of virtualization at HP, pointed out that the monitoring and metering needed to offer utility computing would require enterprises to transmit information about what they’re using, something government customers in particular have been reluctant to do.

“Our customers couldn’t handle the variability (in per-use pricing) because they didn’t want a sudden spike in their cost,” he added.

Pricing may have to vary from customer to customer, but unless the industry can agree on common ways to enable utility computing, it will be hard to see real benefits, said Roger Reich, senior technical director at Veritas. “De facto standards emerge among companies with the closest business relationship,” he said. “We have to figure out to what degree we can develop interoperable standards in advance of private industry swaps.”

One problem with utility computing is that enterprise IT managers sometimes don’t know what to ask for, said Tom Kucharvy, president of research firm Summit Strategies. That lack of vocabulary remains a key issue. “(Vendors) are using different terms to say the same thing,” he said. “But that’s because many companies want to avoid comparisons with their competitors.”

Rosenberg said it’s important that vendors committed to utility computing separate the reality from the fluff. “You can drive along the 101 and see all these billboards advertising grid computing or virtualization,” he said. “When you start digging, you see that’s all it is — just billboards.”

Some enterprises have already started on the path to utility models whether they recognize it or not, added van der Zweep. “Our server consolidation customers are clearly virtualizing,” he said. “They’ve stepped into that realm and they’re calling it consolidation, but they’re incorporating virtualization into that.”
