A single pane of glass

It’s pretty much a given that acquiring new customers is the name of the game in business. But when that acquisition comes at a heavy cost, it’s time to take a look at your infrastructure.

That’s what drove KnowledgeBase Marketing, a database marketing service provider that processes more than 60 billion transactions per year, to look for ways it could keep storage costs under control.

KnowledgeBase stores more than 150 terabytes of data, a figure that is growing at 35 per cent a year.

“We had a tiered environment where we bought different levels of storage depending on workload,” says Brian Camp, senior vice-president of infrastructure at the Richardson, Tex.-based firm. “It was a more economical way to go, but you have to have some pretty good forecasting of what your workload of that storage is going to be. You had to get it right.”

And they did, but only about 80 per cent of the time. “The 20 per cent of time we weren’t correct indicates we were buying too expensive storage,” says Camp.

To get a grip on those costs, KnowledgeBase consolidated its storage platforms and implemented an Information Lifecycle Management (ILM) and storage virtualization strategy based on Sun technology.


Sun recently introduced its StorageTek 9990 system, which features the vendor’s Dynamic Provisioning software.

Graham Wilson, group manager and business lead, enterprise storage at Sun, says virtualization lets storage administrators manage heterogeneous storage systems behind “a single pane of glass.”

The 9990’s dynamic provisioning feature, also known as thin provisioning, lets the user decouple the logical view of storage presented to applications from the physical capacity actually installed, he says. Say, for example, you know you will need an additional 50 TB over the next year. You might not need the extra storage right away, but you don’t want to run out either.

“What (dynamic provisioning) allows you to do is to pretend (you’ve) got the 100 TB even though you’ve only got 50. To the system view we have 100 TB, even though we don’t physically, so we can start to allocate storage based on that view and add on dynamically as we start to hit up against our actual physical (limits), so customers are not having to invest up front for storage they’re not using.”
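In rough terms, the arrangement Wilson describes can be modelled in a few lines of code. The sketch below is illustrative only, with made-up capacities and thresholds rather than anything drawn from Sun’s Dynamic Provisioning software: volumes are carved out of a 100 TB logical view while only 50 TB of disk is installed, and physical capacity is added only once real usage approaches that limit.

```python
class ThinPool:
    """Toy model of a thin-provisioned pool: the logical capacity presented to
    hosts can exceed the physical disk installed; writes consume physical
    space, and an alert fires as usage nears the installed limit."""

    def __init__(self, physical_tb, logical_tb):
        self.physical_tb = physical_tb    # disk actually installed
        self.logical_tb = logical_tb      # capacity advertised to hosts
        self.provisioned_tb = 0           # sum of volume sizes handed out
        self.written_tb = 0               # data actually written

    def provision_volume(self, size_tb):
        # Creating a volume consumes no physical disk, only logical headroom.
        if self.provisioned_tb + size_tb > self.logical_tb:
            raise ValueError("exceeds advertised logical capacity")
        self.provisioned_tb += size_tb

    def write(self, tb):
        # Only real writes use physical capacity.
        if self.written_tb + tb > self.physical_tb:
            raise RuntimeError("out of physical disk -- add capacity first")
        self.written_tb += tb
        if self.written_tb > 0.8 * self.physical_tb:
            print(f"{self.written_tb:.0f} TB written of {self.physical_tb} TB "
                  "installed -- time to add physical disk")

    def add_physical(self, tb):
        # Physical capacity is added as demand materializes, not up front.
        self.physical_tb += tb


pool = ThinPool(physical_tb=50, logical_tb=100)  # presents 100 TB, owns 50
pool.provision_volume(60)   # fine: hosts see 100 TB of logical capacity
pool.write(45)              # crosses 80% of installed disk, prints a warning
pool.add_physical(25)       # buy disk only once usage demands it
```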

Virtualizing storage also lets the user move data around easily and place it on the most appropriate storage device, a task that can otherwise be extremely time-consuming. Camp, for example, says KnowledgeBase cut the time needed to reallocate data from one tier of storage to another to two hours from eight, and estimates a 40 per cent reduction in storage TCO, a 50 per cent reduction in storage administration time and an 86 per cent reduction in transaction costs.

According to John Sloan, senior research analyst with Info-Tech Research Group, the promise of storage virtualization is the ability to pool storage and treat it as a utility resource that can be carved up into volumes and provisioned to different applications. And while server virtualization is getting a lot of press these days, Sloan cautions that storage virtualization is a whole other ball game, mostly because there is far less standardization in storage than in processors.

“It’s just a much more complicated mess to try to do that abstraction, so on the storage virtualization side you usually need an intermediary that forces virtualization on all those disparate parts,” he says. Is it worth the effort? It depends, he says. Larger enterprises that want to optimize the storage capacity they’ve accumulated rather than buying more arrays should look at it, he advises, but smaller enterprises or those that have just started to consolidate their storage might not see many benefits.
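A minimal sketch of the kind of intermediary Sloan describes might look like the following: a virtualization layer pools two dissimilar back-end arrays and places virtual volumes on whichever array has room, so administrators work against one namespace rather than each array’s own tools. The array names, tiers and placement rule are invented for illustration; real products apply far more sophisticated policies.

```python
class BackendArray:
    """One physical array sitting behind the virtualization layer."""
    def __init__(self, name, free_tb):
        self.name = name
        self.free_tb = free_tb


class VirtualizationLayer:
    """Pools disparate arrays and maps virtual volumes onto them."""
    def __init__(self, arrays):
        self.arrays = arrays
        self.volume_map = {}   # virtual volume name -> physical array name

    def create_volume(self, name, size_tb):
        # Simple placement rule: first array with enough free capacity.
        for array in self.arrays:
            if array.free_tb >= size_tb:
                array.free_tb -= size_tb
                self.volume_map[name] = array.name
                return array.name
        raise RuntimeError("no back-end array has enough free capacity")


pool = VirtualizationLayer([
    BackendArray("tier1-fc", 10),      # fast, expensive disk
    BackendArray("tier2-sata", 40),    # cheaper bulk disk
])
print(pool.create_volume("crm-db", 8))     # lands on tier1-fc
print(pool.create_volume("archive", 25))   # spills over to tier2-sata
print(pool.volume_map)                     # the single view admins manage
```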

The return on investment comes in maximizing the use of whatever disk media you have, he says.

“If you’re getting maximum utilization and efficient allocation then you don’t have to buy as often as your storage expands,” he says. “You’re still going to have to buy disks but not as often, and also you have more efficient allocation out of your storage, which can lead to efficiencies in how you back up and replicate, because part of the total value of storage is a function of not just how much the disk costs but also how much it costs to back it up.”
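To make that concrete, a back-of-the-envelope calculation using the article’s figures (150 TB today, growing 35 per cent a year) and an assumed 400 TB of installed disk shows how utilization stretches the purchase cycle: if only 50 per cent of installed capacity is usable, the next purchase comes in about a year, while at 80 per cent usable it can wait roughly two and a half years.

```python
import math

data_tb = 150.0          # data under management today (from the article)
growth = 0.35            # annual growth rate (from the article)
installed_tb = 400.0     # assumed raw disk already on the floor

def years_until_purchase(usable_fraction):
    """Years until the data outgrows the usable share of installed capacity."""
    usable_tb = installed_tb * usable_fraction
    return math.log(usable_tb / data_tb) / math.log(1 + growth)

print(f"50% usable: buy again in {years_until_purchase(0.50):.1f} years")
print(f"80% usable: buy again in {years_until_purchase(0.80):.1f} years")
```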

The bottom line, he says, is that virtualization does have value. However, he adds, “Wouldn’t it just be a lot easier if storage itself was standardized? What’s sort of happened is the vendors have resisted the commoditization of storage and they’re now peddling virtualization as a way of getting around barriers they’ve erected.”
