LinuxWorld 2002 in San Francisco last summer marked a watershed for the open source operating system. Big Computing was out in force — HP, IBM, Sun Microsystems — wooing the geeks with product and service launches and general abandon. Sun jumped back into the 32-bit server market it had given up on in 1990 with its Linux-based LX-50. IBM countered with a “three-pronged attack” that included “SWAT” teams to migrate Sun customers from the proprietary Solaris Unix platform to Linux on IBM’s own x335 servers. Even open source nemesis Microsoft had a presence there, warily observing the proceedings. If Linus Torvalds’ brainchild hadn’t been prime-time material before, the big vendors were sending the message that Linux was indeed ready for its close-up, Mr. DeMille.

Alan Freedman, research manager, infrastructure hardware with IDC Canada in Toronto, says Linux is the only OS showing significant growth in the number of servers shipped — 31 per cent for 2002, and a predicted 60 per cent in 2003, compared to three per cent growth for Windows boxes and virtually flat Unix numbers.

That underestimates what’s actually happening in the market, though — most copies of Linux are downloaded, not shipped installed on servers. According to research firm IDC, there were about 14 million copies of Linux in use in 2002. That number will almost triple to 36 million by 2006.

“It’s definitely becoming more mainstream,” says Freedman. It’s less of a clandestine “skunkworks” project — 24 per cent of data centre operators now “admit” they’re running Linux, Freedman says.

Jack O’Brien, Sun Microsystems’ manager of the Linux business office, says it’s hard to overlook the economics of Linux. Did we mention that it’s free?

“It’s still price that’s driving the marketplace,” says Freedman. It’s the No. 1 consideration when buying a server, followed by integration of applications.

It’s partly rooted in the notion of “good enough computing,” says Illuminata’s principal analyst Jonathan Eunice.

The venerable Digital VAX and the original Unix era fostered the concept that you needn’t pay for more computing power than you can use; Microsoft made commodity computing mainstream. The bells and whistles that companies flogging “gold standard” computing call compelling differentiators can be viewed by the “good enough” crowd as excess functionality.

“In many cases, it doesn’t make sense to pay for the marginal improvement,” Eunice says.

The continuing trend toward distributed computing — using smaller computing units connected over a network — also plays in Linux’s favour. Unix and networking evolved together and have “a strong affinity” for one another, says Eunice. Though firmly rooted in Unix, Linux has a cost advantage over proprietary packages.

That focus on “scaled out” computing — as opposed to “scaled up” — takes some of the wind out of one anti-Linux argument: that the number of processors the OS can scale to is limited compared to commercial Unix packages and even Windows.

Current Linux versions can run on machines with up to four processors, depending on the input-output load. That doesn’t stack up well apples-to-apples against operating systems that can run on 64-processor machines, but in a networked environment, most machines will be running one or two processors.

There’s a place for Linux in the server consolidation revolution, too.

For example, financial institutions run analysis apps for only a few minutes a day, says Wall Street veteran Eunice, but often devote a single server to the application because configuration is so painstaking.

A mainframe computer can run hundreds of instances of Linux as a “virtual rack,” replacing dozens of servers that wouldn’t be running concurrently.

On a mainframe, processor use can be partitioned down to a fraction of a processor; RISC-based Unix boxes can only be partitioned into groups of two or three processors, and can’t handle more than a handful of partitions, argues Eunice.

“People do like the partitioning aspect of it,” says Freedman. It’s ideal for ISPs — which might have customers demanding their own servers, although the daily run-time would be minimal.

“It’s a niche market,” Freedman says, noting “a handful of users” in Canada.

Linux has its advantages on the wetware side as well. Skilled workers are the most expensive IT cost and are in chronically short supply. The pool of available Linux talent is augmented by the OS’s similarity to the various flavours of proprietary Unix.

Virtually all Unix skills and training port to Linux, according to Kara Pritchard, director of exam development with the Linux Professionals Institute, a Linux certification organization.

The Linux community itself is huge, notes Sun’s O’Brien, which not only contributes to the talent pool but also allows the OS to evolve quickly.

But perhaps the single most important checkmark on the Linux side of the ledger is the promise of true standardization and interoperability — common APIs and volume standards. The Linux Standard Base workgroup certifies vendor releases to prevent the splintering of Linux into incompatible versions, as happened with Unix.

Whatever the reasons, Linux has arrived in the enterprise. And even though it’s free, it’s giving incumbent operating systems a run for their money.
