Tuning in to docu-drama

If your data storage needs are growing far faster than your company, you’re not alone. It’s happening to almost everyone.

“Normal business operations tend to be growing at somewhere between 70 and 100 per cent capacity demand per year,” says Randy Kearns, vice-president of strategy and planning for the data management group at Santa Clara, Calif.-based Sun Microsystems Inc.

The reasons for growing storage needs vary from business to business. For London Drugs Ltd., it’s prescription information that legally must be kept, and volumes of accounting and order data that keep piling up. The systems that help run the Vancouver-based retail chain are very good at collecting data, observes Scott Riddell, London Drugs’ manager of solutions information technology, but “it’s hard to find one that will allow you to get rid of data.”

For Borden Ladner Gervais LLP, it’s documents and e-mail. “There was a point in time when you’d get a few (electronic) letters a day,” says Joel Alleyne, the national law firm’s chief information officer and chief knowledge officer. “In today’s world everything is e-mail.” As a law firm, Borden Ladner also has plenty of documents – “lawyers live in Word and e-mail,” Alleyne says – managed using a document management system. The volume keeps growing, and Alleyne is looking for ways to manage it better.

For the Royal Bank of Canada, it’s both growing demand for readily available information and legal requirements. “It’s the Internet age we live in,” says Harold Durnford, senior technical specialist for storage at the bank. “Everyone wants instant access to data.” But for a multi-national bank, complying with legal requirements is also a huge issue. “We operate in 30 to 40 countries around the world,” he says, “and there’s extreme sensitivity to meeting legal regulations.” An example: Backups that were once kept for only two years are now kept for seven.

The past few years have brought much stricter definitions of what companies are required to keep, along with tough penalties for those that can't produce the data when required. The Sarbanes-Oxley Act in the U.S. and Bill C-198 in Canada are examples, and assorted other laws address specific areas.

“All the customers we’re talking to who are public companies are really concerned about Sarbanes-Oxley issues,” says Curtis Gittens, senior research analyst at Info-Tech Research Group Inc., a London, Ont.-based research and consulting firm.

“They’re faced with volumes of information that are just growing exponentially,” says Ken Steinhardt, director of technology analysis at storage systems vendor EMC Corp. of Hopkinton, Mass., “and yet IT budgets have been relatively flat.”

Ever-expanding information
The declining cost and increasing capacity of storage does help. But even that may be a mixed blessing, because it only increases the temptation to tame the storage shrew by throwing more storage capacity at it. In the long run, that only makes it more unmanageable.

IT departments must use storage more efficiently, placing the most frequently used data on high-performance storage that gives users the information they want right away, and less-frequently-used data on lower-cost storage devices that nonetheless can produce it when needed. And they need to ensure data is disposed of when no longer needed – but not before.

Archiving is the law
Borden Ladner is implementing an e-mail archival system. Alleyne says this will move older e-mail from costly primary storage to less expensive and slightly slower optical disks. To end users, it will still look as if older e-mail messages are right there in their mailboxes, Alleyne explains, and they will be able to retrieve them at will. It may just take slightly longer because of where the messages are stored.

The same happens at the Royal Bank. “The convenience is still there, of accessing the stuff,” Durnford says, “but it’s not instantaneous.”

Alleyne says his firm is still in the early stages of addressing information lifecycle management (ILM). Borden Ladner is not alone in this. In fact, by even thinking about the issue it is ahead of many companies. “What’s happening is that companies are grappling with a storage problem and not addressing it from an ILM perspective,” says Gittens at Info-Tech. “It’s something that has not really penetrated the market yet in the true sense of the word. I mean, people are still asking us: ‘What is ILM?’”

An Info-Tech report, Ensure Storage Success: Marry ILM With Tiered Storage Technology, says ILM “involves the use of subjective and objective measures to determine where to store data at a given point in time.”

Though many are still unfamiliar with it, ILM is not new. Durnford says the Royal Bank has been doing ILM for more than 20 years on its mainframes. As data ages and is used less, it is moved to slower, less costly storage, and when it is no longer needed it is deleted. He admits the bank has done less ILM on its Unix and Windows systems, to date. One reason, he says, is that ILM tools for those platforms aren’t as mature as those for mainframes.
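The mainframe policy Durnford describes — migrate data to cheaper storage as it ages, delete it once retention expires — can be sketched as a simple placement rule. The tier names, age thresholds and seven-year retention window below are illustrative assumptions, not the bank's actual policy:

```python
from datetime import date, timedelta

# Illustrative age thresholds and tiers (assumptions, not any bank's real policy)
TIERS = [
    (timedelta(days=90), "high-performance disk"),      # hot, frequently used data
    (timedelta(days=365), "low-cost disk"),             # cooling data
    (timedelta(days=7 * 365), "tape archive"),          # rarely used, retained for compliance
]

def place(last_accessed, today=None):
    """Return the storage tier for a piece of data, or 'delete' once
    it ages past the (assumed) seven-year retention window."""
    today = today or date.today()
    age = today - last_accessed
    for threshold, tier in TIERS:
        if age <= threshold:
            return tier
    return "delete"
```

A real ILM tool layers business rules (legal holds, data class, owner) on top of simple age, but the age-driven cascade is the core mechanism.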

Gittens says most organizations don’t think seriously about ILM until they get at least three tiers of storage. When they do, according to Info-Tech, they should think seriously about implementing at least a fourth and maybe even a fifth storage tier.

Tiers are not enough
According to Info-Tech, the three-tiered model many organizations adopt is not flexible enough. It classifies data as mission-critical, business-critical or deep archival. Mission-critical data demands 99.999 per cent availability and full recovery within an hour of a failure, while for business-critical data, 99.9 per cent availability and recovery in three hours or so is acceptable. Deep archival is for rarely used data and most often relies on tape.

Info-Tech says most businesses should add an “active archived” tier for data that will be kept a long time and used sporadically but too often to be stored on tape. Some large enterprises may need a fifth, disk-based backup tier between the active archived and deep archived tiers.
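The expanded model amounts to a mapping from data class to service targets. The mission- and business-critical figures come from the report as described above; the targets shown for the archive and backup tiers are illustrative assumptions, since the article gives none:

```python
# Five-tier model per Info-Tech; archive/backup-tier targets are assumed for illustration
TIER_MODEL = {
    "mission-critical":  {"availability_pct": 99.999, "recovery_hours": 1,  "media": "high-end disk"},
    "business-critical": {"availability_pct": 99.9,   "recovery_hours": 3,  "media": "midrange disk"},
    "active archived":   {"availability_pct": 99.0,   "recovery_hours": 24, "media": "low-cost disk"},  # assumed
    "disk-based backup": {"availability_pct": 99.0,   "recovery_hours": 48, "media": "backup disk"},    # assumed
    "deep archival":     {"availability_pct": None,   "recovery_hours": None, "media": "tape"},
}

def max_yearly_downtime_minutes(availability_pct):
    """Convert an availability percentage into allowed downtime per year."""
    return (1 - availability_pct / 100) * 365 * 24 * 60
```

The conversion function shows what the "nines" actually buy: 99.999 per cent availability allows only about five minutes of downtime a year, while 99.9 per cent allows nearly nine hours.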

Virtually flexible
London Drugs uses IBM storage virtualization technology to pool the storage allotted to servers in its data centre.

“What I like about that,” Riddell says, “is it allows us to move storage allotments around.” Before London Drugs implemented this system, he recalls, it was much more difficult to give an application more storage if the original estimate of how much it would need turned out to be low.

The retailer also tries to find applications with facilities to identify stale data and help archive or purge it. That’s not always possible, he says, “but often the software vendor will point you to a third party that can provide that” or can provide technical information to help with creating a home-grown solution.

One of Durnford’s key weapons in the storage battle is what he calls “just-in-time provisioning.” Since the bank started implementing storage-area networking (SAN) technology in 1999, he says, it has become easier to add storage capacity as needed without bringing down servers in the process. So rather than install the storage he expects to need in three years, Durnford tries to install what he needs now, then add incrementally as needed. The declining cost of storage makes that a money-saving move, and the bank even negotiates price reductions ahead of time through volume contracts with suppliers. “That makes planning a lot better,” says Durnford.
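The economics behind just-in-time provisioning can be shown with a back-of-the-envelope comparison: buying three years of projected capacity up front at today's price, versus buying each year's capacity as needed while the per-terabyte price declines. All figures below are hypothetical, chosen only to illustrate the shape of the saving:

```python
def upfront_cost(tb_per_year, price_per_tb, years=3):
    """Buy all projected capacity now, at today's price."""
    return tb_per_year * years * price_per_tb

def incremental_cost(tb_per_year, price_per_tb, decline=0.25, years=3):
    """Buy each year's capacity as needed, with the per-TB price
    falling by `decline` each year (25% is an assumed rate)."""
    return sum(tb_per_year * price_per_tb * (1 - decline) ** y for y in range(years))

# Hypothetical: 100 TB/year at $1,000/TB today, price falling 25% a year
# Up front:    100 * 3 * 1000            = $300,000
# Incremental: 100,000 + 75,000 + 56,250 = $231,250
```

Negotiating the future price reductions into a volume contract, as the bank does, locks in the incremental curve instead of leaving it to chance.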

None of these strategies are stopping the continuing growth in storage needs. All they do is help organizations keep up with their requirements as economically as possible. New storage media may some day eliminate the issue, but for now the storage shrew can only be tamed.

Is the storage up to our standards?
As new storage technologies proliferate, the job of managing them becomes more and more daunting. Nobody wants to manage, as a single system, a collection of hardware and software components that won't talk to each other.

That’s why the Storage Networking Industry Association (SNIA) set out to create a storage management standard. That standard is called Storage Management Initiative Specification, or SMI-S. Currently in Version 1.1, it enjoys quite widespread support among storage vendors.

SMI-S provides a standard language for storage devices to talk to each other, so that they can be managed without relying on an assortment of proprietary and incompatible management agents. Before it was introduced, says Robert Callaghan, chair of SNIA’s Storage Management Forum board, “every product basically lived in its own silo.”

SMI-S 1.1 was released about a year ago, though it is still working its way through the approval process to become an American National Standards Institute (ANSI) standard. Building on the original 1.0 version, it added support for tape libraries, network-attached storage and iSCSI, Callaghan explains. It also increased support for information lifecycle management and other higher-level services.

The next release, SMI-S 1.2, is due around the end of this year. It will add performance management for Fibre Channel devices, increased security features, virtualization management and some other features, Callaghan says.

The SNIA runs a conformance testing program for SMI-S, and the list of vendors whose products have been tested includes most of the major names in storage hardware and software. Callaghan describes adoption of the specification as “sort of steady.” A fair amount of new equipment now supports it, he says, and customers are starting to ask for SMI-S in requests for proposal.

And so most of them should, according to Callaghan. He says “any customer that has products from multiple companies in the same SAN environment” should be looking at SMI-S, and even those with single-vendor environments may want to consider it so they won’t get locked in. “It’s freedom of choice, basically,” he says.
