Getting more out of servers

The lures of cost savings and better server management drive today’s demand for server virtualization. A 2004 Gartner report claims that organizations with more than 200 servers waste up to $720,000 annually supporting underutilized application/server combinations. Eighty per cent of virtualization users polled by IDC consolidate servers to cut this waste.

How much capacity can virtualization squeeze out of a server? “It depends” is the most common reply, followed by a rough consensus of 80 per cent utilization, up from the typical five to 15 per cent. Nick van der Zweep, Hewlett-Packard director of virtualization and Integrity server software, claims a minimum of double the non-virtual utilization rate.
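The back-of-the-envelope arithmetic behind those consolidation claims can be sketched as follows. The function and the figures are illustrative only, using the utilization numbers quoted above:

```python
def consolidation_ratio(current_util_pct: int, target_util_pct: int) -> int:
    """Roughly how many lightly loaded dedicated servers, each running at
    current_util_pct, one virtualized host can absorb while staying at or
    below target_util_pct. Integer percentages avoid float rounding."""
    return target_util_pct // current_util_pct

# At the article's figures: servers idling at 10% consolidated onto a
# host targeting 80% utilization.
print(consolidation_ratio(10, 80))  # 8 old servers per virtualized host
```

At the low end of the quoted range (five per cent utilization), the same arithmetic yields a 16-to-1 consolidation, which is why the savings estimates run so high.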

Virtualized applications do not need dedicated servers. Instead, each application resides in its own “soft partition” with a dedicated operating system. The two together make up one virtual machine (VM), and a virtualization layer manages two, three or more VMs on one physical server.

When usage spikes or declines, the hypervisor automatically reallocates resources among VMs. Virtualization suites also include migration tools that move applications from physical to virtual machines, and virtual machine management (VMM) tools that let users set resource usage rules for VMs.
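The reallocation idea can be sketched as a toy policy, hypothetical and far simpler than any vendor’s actual scheduler: if aggregate demand fits the host, every VM gets what it asks for; if not, capacity is split in proportion to demand.

```python
def reallocate(demands: dict[str, float], capacity: float) -> dict[str, float]:
    """Toy resource-allocation policy for VMs sharing one host.

    demands maps VM name to requested capacity (e.g. CPU cores).
    If total demand fits, grant every request; otherwise divide the
    host's capacity proportionally to each VM's demand.
    """
    total = sum(demands.values())
    if total <= capacity:
        return dict(demands)
    return {vm: capacity * d / total for vm, d in demands.items()}

# Uncontended: both VMs get their full request.
print(reallocate({"mail": 2.0, "web": 2.0}, capacity=8.0))
# Contended: 8 cores demanded on a 4-core host, split 3:1.
print(reallocate({"mail": 6.0, "web": 2.0}, capacity=4.0))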

In a hosted scenario, the hypervisor runs on top of a conventional operating system, sitting between it and the VMs. In the bare-metal alternative, the hypervisor is the sole host, running directly on the hardware.

With applications in VMs, users can more easily deploy, back up and recover them. VMs can be copied from one physical server to another with no downtime, eliminating application shutdowns for maintenance or disaster recovery.

“Virtual machines are just files,” said Raghu Raghuman, vice-president of platform products for Palo Alto, Calif.-based virtual machine technology provider VMWare.
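Because a VM is just a set of files, typically a disk image plus a configuration file, moving it between hosts reduces to a file copy. A minimal sketch, with file names and layout that are illustrative rather than any vendor’s actual on-disk format:

```python
import shutil
from pathlib import Path

def copy_vm(vm_dir: Path, dest_storage: Path) -> Path:
    """Copy a VM's backing files onto another host's storage.

    vm_dir is the directory holding the VM's files (disk image, config);
    dest_storage is a mount point for the destination host. The VM is
    "just files", so the whole machine moves as one directory copy.
    """
    target = dest_storage / vm_dir.name
    shutil.copytree(vm_dir, target)
    return target
```

In production, suites stage the copy while the VM keeps running, which is what makes the no-downtime migration described above possible; the sketch shows only the underlying file operation.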

Cost savings take several forms. Diminished hardware requirements lower both capital and related operating expenses. Management expenses, which Raghuman claims are up to 70 per cent of the cost of maintaining servers, also diminish as more automation lessens the IT workload.

For example, Raghuman said provisioning time is reduced from weeks to minutes, since companies no longer need to procure new physical servers for each application.

John Humphreys, program director for enterprise virtualization software at IDC, said virtualization changes server provisioning best practices, especially when people use virtual machine golden images to quickly replicate applications.

However, “it’s important for IT to recognize that because it’s so easy, these things can multiply like rabbits,” Humphreys said.
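Golden-image provisioning, and the sprawl risk Humphreys describes, can both be seen in a small sketch. The registry here is a hypothetical addition, not part of any vendor’s product, included to show why inventory tracking matters once cloning is this cheap:

```python
import shutil
import uuid
from pathlib import Path

# Hypothetical inventory; without something like it, clones "multiply
# like rabbits" with no record of what exists.
REGISTRY: list[str] = []

def provision_from_golden(golden: Path, pool: Path) -> Path:
    """Stamp out a new VM by cloning a pre-built 'golden image' directory.

    Provisioning becomes a file copy plus a unique name, which is how
    weeks of server procurement shrink to minutes.
    """
    clone = pool / f"vm-{uuid.uuid4().hex[:8]}"
    shutil.copytree(golden, clone)
    REGISTRY.append(clone.name)
    return clone
```

Each call yields a fully configured machine in seconds, which is exactly why governance has to keep pace with provisioning.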

Because virtualization abstracts applications from the hardware, physical servers no longer need to conform to a data centre’s standard build.

“Virtualization lets customers choose the right server at the right price,” said Debora Jensen, vice-president of Dell Canada’s Advanced System Group.

Not all software vendors have determined how they will license in a virtual world. HP’s van der Zweep speaks of negotiations between users and software vendors to define licensing for VMs. However, IDC’s Humphreys said, “It tends to be straightforward when people go through the process.”

Some licence savings are possible. In HP’s case, 50 physical BEA WebLogic servers shrank to a cluster of five, with a commensurate decrease in licences. Buyers of Microsoft’s Windows Server Enterprise Edition will be able to run five virtual servers with one licence.
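The Microsoft arrangement described above reduces to simple ceiling arithmetic; the function below is an illustration of that maths, not a licensing calculator:

```python
def licences_needed(vm_count: int, vms_per_licence: int = 5) -> int:
    """Licences required when one licence covers up to five virtual
    servers, as the article describes for Windows Server Enterprise
    Edition. Uses ceiling division: a partial group still needs a
    full licence."""
    return -(-vm_count // vms_per_licence)

print(licences_needed(50))  # 10 licences for 50 VMs
print(licences_needed(12))  # 3 licences: two full groups plus a partial one
```

By the same logic, HP’s 50-to-5 WebLogic consolidation cut that licence count tenfold, since those licences were tied to physical servers.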

Computer makers now offer virtualization as a preinstalled option on their servers. “For our high-end customers, 50 to 60 per cent of our servers go out running more than one OS on a box at the same time,” said van der Zweep. “All you do is plug it in, turn it on, and add your applications.”

While higher utilization rates are desirable, intensive application workloads and spikes can cause resource overloads. Various solutions are coming to market to help solve this issue.

Dell’s Jensen said customers map ongoing and day versus night usage as well as load rates at mid-quarter and quarter-end, then set up business rules accordingly. HP provides capacity planning software that performs historical workload analysis for specific applications and allows for what-if analysis.
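The what-if analysis both vendors describe amounts to stacking historical workload traces and checking the combined peak against a host’s capacity. A minimal sketch, assuming equal-length traces sampled at the same intervals and a made-up 20 per cent headroom reserve:

```python
def fits(workloads: list[list[float]], capacity: float,
         headroom: float = 0.2) -> bool:
    """What-if check: would these historical workload traces, stacked
    onto one host, stay under capacity with some headroom held back
    for spikes?

    Each inner list is one application's utilization over time; traces
    are summed sample-by-sample and the combined peak is compared
    against the usable capacity.
    """
    combined = [sum(samples) for samples in zip(*workloads)]
    return max(combined) <= capacity * (1 - headroom)

# Two light workloads fit comfortably on a 10-unit host...
print(fits([[1.0, 2.0], [2.0, 3.0]], capacity=10.0))   # True
# ...but two heavy ones whose peaks coincide do not.
print(fits([[5.0, 5.0], [4.0, 5.0]], capacity=10.0))   # False
```

Commercial tools go further, modelling day-versus-night cycles and quarter-end load rates as Jensen describes, but the core test is this peak comparison.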

Chip makers are rolling out processors such as Intel’s Woodcrest, geared to provide the extra performance headroom virtualized servers need.

Humphreys said the physical server’s resources and reliability, availability and serviceability take on greater importance under virtualization. “RAS features are very important to a customer who puts several eggs in a basket.”

In a quest for a best-practices guide, Intel and VMWare launched Virtualize ASAP in June. According to Diane Bryant, vice-president of Intel’s Digital Enterprise Group, ISVs virtualize their applications on Woodcrest-based machines in Virtualize ASAP labs to determine memory and input/output needs, expected performance and other specifications.

Brian Bourne, president of Toronto-based CMS Consulting Inc., said there are two particular VM security concerns. First, since VMs are individual files, they are easy to copy. But data thieves have another, more subtle option: rootkits.

“If successfully installed, rootkits give an attacker control of a system,” Bourne said. Often difficult to spot, rootkits may be undetectable if installed on a host operating system or hypervisor “underneath” VMs. “Securing guest servers is still important,” said Bourne. “Securing the host is critical.”

Hilary Wittman, product manager for Windows Server at Microsoft Canada, added that legacy applications on older operating systems such as Windows NT may run more securely as virtual machines on up-to-date servers.

VMWare is looking over its shoulder at newer commercial offerings and the open-source option Xen. Info-Tech Research Group senior research analyst John Sloan pointed to one key differentiator in a rapidly maturing market. “By next year, you’ll have out-of-the-box virtualization,” Sloan said.

“Future success depends on the management layer. In a non-virtual environment, if you’re not sure about a connection, you can follow a blue cable to an actual physical server. Now IT managers have to get good management software to manage their virtual infrastructure.”

VMWare’s Raghuman paints a typical virtualization path into a data centre. Early adopters, such as financial institutions and governments, introduce virtualization via structured applications such as file servers and mail servers. Mid-tier, home-grown applications follow suit, with Web servers not far behind. “Virtualization is a horizontal application,” Raghuman said. “Customers run all types of applications on it.”

Experts warn that application owners must test their applications in a virtual environment before committing them to production.

“Slowly but surely, software vendors are being pushed to support virtual machines,” Humphreys said.

According to statistics from Info-Tech, organizations of 1,000 to 5,000 employees have embraced virtualization more readily than smaller firms since they stand to save more money from server consolidation, although smaller firms are increasingly experimenting with development VMs.

HP’s van der Zweep said his internal customers were not ready for the switch, at least at the beginning.

To entice customers, HP priced virtual servers at $5,000 per year, rather than the $10,000 it charged for dedicated servers. Customers balked, so HP lowered the price to $0 – to no avail.

“(Users) would rather have a traditional server that they could point at and say, ‘This is my business’s server,’” van der Zweep said.

Once IT set up its own sales team, it found it easier to sell to the people signing the cheques than to those who ran the servers. When 50 virtual servers were running, word got out and virtualization sold itself.

“The whole concept of virtualization is becoming more of a reality,” Jensen said.
