The Network vanishes

Every year, someone writes their version of what I call the IT Exodus.

Always a first-person account, these stories usually begin with a rundown of the various devices the author is tethered to. Insert generic passage: “As I download the spreadsheet to my Palm Pilot, I find myself juggling my ringing cell phone while standing by the printer waiting for my memo.” This is followed by an attempt to “get away from it all,” often to some remote vacation spot.

But it’s too early in the story arc for the hero to be set free. The author is dismayed to discover an island littered with Internet cafes, or beach bums typing up their journals on laptops. Unlike the original Exodus, there’s no getting out of this flood of information technology. We could build an Ark, but it would probably have to run on an AS/400 server.

These kinds of first-person pieces crop up often enough — I’m sure I’ve written one, as have several of the editors of our print publications — that you’d think the industry would take the hint. It hasn’t. Stubbornly covering its eyes and ears to the suggestion that less is more, it has come up with a solution for the many people who want a break from computing: put the computers everywhere!

Variously known as pervasive, ubiquitous or embedded computing, the concept measures progress by the extent to which devices become taken for granted or, ideally, invisible to the user. There are lots of companies and research facilities working on this. In about two months, thousands of them will be gathering in Zurich for Pervasive 2002, a three-day conference jointly funded by IBM Research and ETH Zurich. As with a lot of trade shows, the organizers issued a call for papers on the topic. Many of the papers posted on the Web site articulate the hurdles to making pervasive computing a reality: “The Untrusted Computer Problem” is the title of one, while another is called “On the Gap Between Vision and Feasibility.”

It’s actually not so much the gap between vision and feasibility that we should worry about, but the gap between vision and consumer demand. While there is certainly plenty of interest in making computers and applications easier to use, there is little evidence that the world at large wants them out of sight altogether. Sun’s ambitious Jini project, which would have allowed anything from a thermostat to a toaster to be logged onto the network, has never taken off. Speech recognition, which promised to do away with those annoying keyboards, has made limited inroads within niche markets. For all the excitement over ever-smaller handhelds, Michael Dell will be the first to point out that the most-used slot in the device is the place where it syncs with a PC.

Embedded or pervasive computing has its uses, particularly as it relates to creating virtual networks of supercomputing servers through grids like the Globus Project. At the user level, however, the quest for ubiquity looks more like an elaborate avoidance mechanism. The industry wants users to trust the technology, and thinks this can be achieved if we don’t see the technology at work. That will only be effective, however, if the technology is failure-free. Users (and IT managers) get angry when PCs don’t work. Imagine what it will be like if you don’t even know what part of the network has gone down. There is something to be said for having something to throw out the window in a moment of anger.

Human beings have rarely imagined technology as completely invisible. Even in science fiction, Star Trek captains walked around with tricorders and starship crews navigated the galaxies using cumbersome consoles. Gurus have complained that the PC was badly designed from the start, but for many applications you need something to look at and something with which to input data. A monitor is as close to a TV screen as it gets. A keyboard is based on the typewriters of the industrial age. The design, in fact, was shrewdly modelled on familiar appliances. Replacing that with a ghostly world of always-on, anywhere-accessible computing may seem more disruptive than comforting. We might all get used to it, but you’d think the great minds of the IT industry could come up with a better way to improve the graphical user interface than to make it disappear.

