IBM wants to drive intelligence into the network in a bid to bolster its storage virtualization strategy and address ongoing SAN limitations.
By moving the intelligence into the network, customers get access to a lot more information about the server
environment (where the data is created) and the physical assets (where the data is stored), said Brian Truskowski, general manager of storage software, IBM Systems Group.
“Move key intelligence into the SAN itself to drive the benefits of SAN in the right direction,” Truskowski said.
As part of the on-demand computing strategy, IBM released two storage virtualization solutions in June — TotalStorage SAN Volume Controller and SAN Integration Server — which are designed to address the complexity of storage networking today, said Truskowski.
While SANs offer a host of benefits to customers (including better connectivity, performance, scalability, distance flexibility, vendor and product choice and the ability to consolidate assets), Truskowski said there are still challenges hindering the full potential of SANs.
For starters, he said, customers can’t easily build and manage heterogeneous SANs because maximizing utilization of the physical assets is difficult. Multiple file systems cannot share data and must be managed separately, installation and configuration are problems and there’s no common way to view and manage the environment.
“In the good old days, servers were attached to storage via a cable — an umbilical cord — and all of that storage was dedicated to that particular server. What we’ve done now is introduced the notion of a SAN and that’s opened up some unique benefits for customers, but there are still some things we need to do to allow customers to maximize the full benefit of a SAN.
“SANs are good things, but we have some work to do here.”
What has to happen? Truskowski said the path between a server and its storage is still hard-wired — a legacy of the direct-attached storage days — and that coupling needs to be broken.
Servers and storage are still connected in a tightly coupled way — addresses in the server are connected directly to storage, he said. “We haven’t really broken that umbilical cord yet, so there’s still a very tight relationship in most deployments of storage to server.”
Moreover, most of today’s technology is designed and deployed in “this direct-attached world,” he said. “The functionality deployed up in the server has the ability to see a lot of different kinds of storage attached to the server, but it has very little knowledge of the other servers and applications in the environment. So although it has a good view of storage, it doesn’t understand any relationship with the other servers in the environment.”
The same is true for technology that’s been deployed in the storage box itself, he said. “It may be accessible by many servers in the environment, but it has no knowledge of other storage devices that may exist on that SAN. So we’ve created this inverted hourglass where storage sees all servers but has no knowledge — and that limits a customer’s ability to maximize the value of a SAN.”
Enter the benefits of storage virtualization. He said the technology alleviates the lack of understanding and isolation between devices, addresses the challenge of interoperability, and helps customers manage their data in a more consistent way.
“Most customers have a lot of different storage types in their environment and they all work differently. It’s not a piece of storage, it’s a lot of different storage from many different vendors, and it makes their environments much more complex.”
Digging deeper, virtualization technology goes into the environment and sifts through the confusion, he said. It gives users a virtual view of storage as opposed to a physical view. “We can go in and address very specific problems our customers have, like deploying new storage in their environment without interrupting their applications.”
Instead of having a file system in every server, the technology lets users migrate the file system into the SAN, thereby offering one file system for all of the storage in the environment, he said.
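The idea Truskowski describes — a server seeing one logical volume while the underlying extents actually live on heterogeneous physical arrays — can be sketched as a toy extent map. This is purely illustrative; the class and variable names below are invented for the sketch and do not reflect IBM’s actual SAN Volume Controller implementation.

```python
# Toy sketch of block-level storage virtualization: a virtual volume
# presents one contiguous address space, while its extents are mapped
# onto physical disks from different vendors.

class PhysicalDisk:
    def __init__(self, vendor, capacity_extents):
        self.vendor = vendor
        self.capacity_extents = capacity_extents

class VirtualVolume:
    """Maps contiguous virtual extents onto (disk, physical extent) pairs."""
    def __init__(self):
        self.extent_map = []  # list of (disk, extent_count)

    def allocate(self, disk, extents):
        # Append a run of extents carved from the given physical disk.
        self.extent_map.append((disk, extents))

    def total_extents(self):
        return sum(count for _, count in self.extent_map)

    def locate(self, virtual_extent):
        """Translate a virtual extent number to (disk, physical_extent)."""
        offset = virtual_extent
        for disk, count in self.extent_map:
            if offset < count:
                return disk, offset
            offset -= count
        raise IndexError("virtual extent out of range")

# A server sees one volume of 150 extents; behind the virtualization
# layer, the first 100 sit on one vendor's array and the rest on another's.
array_a = PhysicalDisk("VendorA", 500)
array_b = PhysicalDisk("VendorB", 300)
volume = VirtualVolume()
volume.allocate(array_a, 100)
volume.allocate(array_b, 50)

disk, phys = volume.locate(120)  # falls in the second array's range
```

Because the server addresses only the virtual volume, new physical storage can be added to the map (or extents migrated between arrays) without the server noticing — which is the non-disruptive deployment benefit the article describes.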
Touting the benefits of an integrated, open environment, Kyle Foster, IBM Canada’s general manager of storage sales, said storage software is the key to lowering storage costs and providing a responsive and flexible infrastructure.
Other benefits of an integrated environment include a reduction or elimination of downtime, enhanced productivity, and a simplified infrastructure that gives users the ability to automate, find and resolve issues quickly, Truskowski said.
“Customers can mix and match our products with third-party products and manage it with the storage management products of their choice,” he said.
Integrated, interoperable storage solutions for the on-demand operating environment are key, Foster said. Despite a soft IT spending environment and tight budgets, storage capacity is growing at a tremendous rate and is, therefore, an area of opportunity for vendors and resellers.
Alan Freedman, research manager in the hardware division of IDC Canada Ltd.’s products research group, agrees, saying storage capacity will grow more than 40 per cent per year through 2007.
“Despite all of the doom and gloom, there’s one segment of the market that is doing quite well: growth in storage capacity is still taking off,” he said.
Maxium Solutions, a Richmond Hill, Ont.-based reseller, is pitching the benefits of deploying IBM’s SAN Volume Controller to its customers, who are typically in the retail, financial and insurance markets.
President Rob Seager said the response has already been positive, given that the technology offers customers seamless migration, better utilization of existing storage infrastructure and reduced administration costs.