Do you know where your data is?

Users don’t care much where the data comes from, but many enterprises are still struggling to find the ideal storage architecture for their needs.

It wasn’t long ago that data storage was a niche industry, fueled by predictions that data growth in the Internet age would be phenomenal. Those predictions weren’t wrong, and now data storage is rapidly becoming more commoditized, at least from a hardware perspective, and even the debate over which is better, network-attached storage (NAS) or a storage area network (SAN), is starting to wane.

“It crosses the customer’s mind, but it’s less relevant now than it was in the past,” says Alan Freedman, analyst with IDC Canada in Toronto. “You really had two distinct, separate systems before, where now there’s a lot of consolidation going on in the industry.”

The right tool for the right job

A NAS device is a specialized file server that connects to the network; it runs a slimmed-down operating system and file system and processes only file I/O requests. A SAN, by contrast, is a network of storage disks. In large enterprises, a SAN connects multiple servers to a centralized pool of disk storage, and these pools can themselves be centralized or distributed.
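The practical difference between the two models can be sketched in a few lines of Python. This is an illustrative stand-in, not vendor code: a temporary directory plays the role of an NFS/SMB share (NAS, file-level access) and a zero-filled local file plays the role of a LUN (SAN, block-level access).

```python
import os
import tempfile

# --- NAS-style access: file-level I/O against a shared file system ---
share = tempfile.mkdtemp()                  # stands in for an NFS/SMB mount
path = os.path.join(share, "report.txt")
with open(path, "w") as f:                  # the client asks the filer for a file
    f.write("quarterly numbers")
with open(path) as f:
    contents = f.read()                     # the filer handles layout on disk

# --- SAN-style access: block-level I/O against a raw volume ---
BLOCK = 512                                 # a typical logical block size
volume = os.path.join(share, "lun0")        # stands in for a LUN on the SAN
with open(volume, "wb") as f:
    f.write(b"\x00" * BLOCK * 8)            # an 8-block volume
fd = os.open(volume, os.O_RDWR)
os.pwrite(fd, b"DBPAGE".ljust(BLOCK, b"\x00"), 3 * BLOCK)  # write block 3
page = os.pread(fd, BLOCK, 3 * BLOCK)                      # read block 3 back
os.close(fd)
```

In the NAS case the storage device understands files; in the SAN case it only understands numbered blocks, and the client’s own file system or database decides what those blocks mean, which is why block-hungry workloads like transaction processing historically favoured SANs.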

“SANs tend to be good at very high volumes of very predictable things,” says Ken Steinhardt, director of technology analysis with EMC Corp. “NAS tends to be good with very large numbers of very small predictable things, and customers tend to have applications that fit both criteria.”

For example, SANs would better handle traditional online transactions, whereas NAS is better suited for supporting collaborative development.

“The amounts of data and e-commerce transactions are driving up the capacity requirements,” says Freedman. “There’s a lot more interest in backup and disaster recovery. It all depends on how you’re going to use your data.”

Historically these technologies have been pitted against each other, says Kyle Foster, general manager of storage systems at IBM Canada in Markham, Ont., but arguably, enterprises could require both or possibly neither.

“When you talk to enterprise customers, from an IT infrastructure perspective, they’re struggling with the same two things. I’ll use the words business efficiency and business continuance; they’ll use the words cost savings and availability. And these things pull customers in opposite directions.”

Scale it up

For Trillium Health Centre, with offices in Etobicoke and Mississauga, Ont., it was a specific project that drove its adoption of a SAN with the help of EMC, says Lori Driscoll, Trillium’s director of IT.

“We were planning an implementation of the electronic imaging management project,” says Driscoll. “The sizes of the images associated with that project are fairly extensive, so we decided at that point in time we wanted to review our storage options.”

Trillium’s IT department had already gone back to the well several times to acquire more storage, she says. “We wanted a solution that would afford us an opportunity to proactively manage our space allocation and be able to go to the well once to get our needs met.”

Scalability was very important, adds Mike Mendonca, Trillium’s manager of applications and service.

“We had to look up to five years downstream.”

It’s in the software

With hardware becoming more commoditized, the software that manages storage has become more important, especially for Network Appliance, says Val Bercovici, senior manager of technology and strategy with NetApp Canada in Ottawa. Software is also contributing to the fading of the SAN-versus-NAS debate.

“We’ve had heavy software-focused orientation since our inception,” he says. “The last bastion of differentiation is how you make all of these hardware bits work fast, work reliably and easy to operate.”

The Vancouver Public School Board could be considered a relatively early adopter of SAN technology, and for it, software is the differentiator. It’s now in its second year of running a SAN to meet all of the data storage requirements for its more than 120 locations, says Peter Powell, the board’s supervisor of network and Web development.

“We realized one of the things we wanted to do was not have all this dependency on attached arrays of servers,” he says. “Instead of purchasing a server machine dedicated for a purpose, we wanted to buy server machines and dispatch them to whatever purpose we wanted and make things more interchangeable and more modular.”

One of the deciding factors to adopt a SAN was that Microsoft SQL Server did not support a NAS environment, says Powell.

“There’s been a few teething problems here and there, but it actually has worked out pretty much the way we intended.”

The Vancouver School Board’s SAN hosts everything from its PeopleSoft financials to content management, and users don’t really need to know or care where the application or data is stored.

However, what really provided the value, says Powell, was software — specifically, DataCore’s SANsymphony.

“What really turned the corner for us is not just having the SAN for the storage but to be able to virtualize the storage. That was the part that made the whole thing work,” he says. “We don’t have absolutely vast storage requirements. For us it was the ability to deploy the storage to several different servers on a needed basis.”

Store it anywhere

The concept of virtualization is not new. It’s more a technique than it is an actual technology.

“The term virtualization can really have a lot of meanings and interpretations between different people and different vendors,” says EMC’s Steinhardt.

The goal, says IDC’s Freedman, is to ease management and to facilitate more efficient use of data.

Traditionally, enterprises made a hardware decision to solve a storage problem, so there was a lot of potential to be locked into one technology, says Augie Gonzalez, director of product marketing for DataCore’s SANsymphony.

“Virtualization to DataCore is a technique used by the software to abstract physical devices from their logical image,” he says. “Basically the applications think they’re talking to real live disks, (but) they’re actually talking to a fabricated image that software had created, and that image has potentially very powerful behaviours that the real disk does not.”

Essentially, Gonzalez says, the software provides a layer of insulation and allows hardware to be swapped in and out. “You get a better choice of who you buy the hardware from without having to modify your methodology and processes.”
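Gonzalez’s description of a “fabricated image” can be sketched as a thin mapping layer. This is an illustrative assumption about the pattern, not DataCore’s actual implementation: the class names, the block map, and the `migrate` step are all hypothetical.

```python
class PhysicalDisk:
    """A real device from some vendor; applications never see it directly."""
    def __init__(self, vendor):
        self.vendor = vendor
        self.store = {}              # block number -> data

class VirtualVolume:
    """The fabricated image applications talk to instead of real disks."""
    def __init__(self, disk):
        self._disk = disk            # current backing device, hidden from apps

    def write(self, block, data):
        self._disk.store[block] = data

    def read(self, block):
        return self._disk.store.get(block, b"\x00")

    def migrate(self, new_disk):
        """Swap in new hardware: copy blocks, repoint the map."""
        new_disk.store.update(self._disk.store)
        self._disk = new_disk

vol = VirtualVolume(PhysicalDisk("vendor-a"))
vol.write(0, b"payroll")
vol.migrate(PhysicalDisk("vendor-b"))    # hardware swapped underneath
restored = vol.read(0)                   # the application's view is unchanged
```

The insulation Gonzalez describes is the `VirtualVolume` boundary: because applications only hold a reference to the logical image, the backing hardware can change vendors without any change to their methodology or processes.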

Optimize, don’t acquire

While storage requirements are still growing, Freedman says getting better utilization is becoming a high priority for enterprises, rather than just acquiring more storage capacity.

“Storage assessments are definitely a growth area right now,” he says. “(Companies are) finding out what they have. They went through that phase a couple of years ago just buying when they needed it, and they look around now and they have all this unused capacity.”


Gary Hilson
Gary Hilson is a Toronto-based freelance writer who has written thousands of words for print and pixel in publications across North America. His areas of interest and expertise include software, enterprise and networking technology, memory systems, green energy, sustainable transportation, and research and education. His articles have been published by EE Times, SolarEnergy.Net, Network Computing, InformationWeek, Computing Canada, Computer Dealer News, Toronto Business Times and the Ottawa Citizen, among others.
