There will be no grand unveiling for IBM's vision of technology that can manage itself.
Big Blue could set itself five- or 10-year deadlines for autonomic computing projects, but users can't afford to wait that long for technology
that could make their lives easier, said Alan Ganek, vice-president of autonomic computing at IBM.
"We're working quarter by quarter on bringing out (autonomic) functions and capabilities," he said.
IT shops have moved from a high-water mark of hundreds of servers on a network in the 1990s to thousands today. "The complexity has really moved exponentially and that complexity is never really going to stop," he said.
Ganek was in Toronto Thursday to speak to a group of students from across Canada.
Technology needs to respond to change on its own to deal with unexpected surges in workload, the demands of 24×7 uptime, the complexity of heterogeneous environments and everyday wear and tear on corporate networks, said Ganek. Smaller businesses may find it even more difficult to meet these demands, since they can't devote the time or manpower to managing technology.
IBM's autonomic computing refers to technology that can maintain itself without human intervention. It can spot problems and report on them before they occur, and fix itself automatically if something does break.
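The monitor-report-heal idea can be sketched in a few lines. This is a toy illustration of the concept only, not IBM's implementation: the server structure, threshold and remediation here are all invented for the example.

```python
# Toy sketch of an autonomic cycle: monitor a metric, flag the problem
# before it becomes an outage, and apply a fix without operator action.
# All names and the 90% threshold are hypothetical, purely illustrative.

DISK_LIMIT = 0.90  # assumed threshold: act before the disk actually fills

def check_disk_usage(server):
    """Monitor step: return the fraction of disk used on a simulated server."""
    return server["disk_used"] / server["disk_total"]

def autonomic_cycle(server, log):
    usage = check_disk_usage(server)                  # monitor
    if usage > DISK_LIMIT:                            # analyze
        log.append(f"{server['name']}: disk at {usage:.0%}, purging temp files")
        server["disk_used"] -= server["temp_files"]   # execute: self-heal
        server["temp_files"] = 0
    return server

server = {"name": "db01", "disk_total": 100, "disk_used": 95, "temp_files": 20}
events = []
autonomic_cycle(server, events)
print(events[0])  # prints "db01: disk at 95%, purging temp files"
```

In a real autonomic system this loop runs continuously against live telemetry; the point of the sketch is only that the detection, the report and the remediation all happen without a human in the loop.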
"Most people think it's a little bit creepy; that the system is going to run itself somehow," said Amy Wohl, principal with Narberth, Pa.-based analyst firm Wohl Associates.
That view is a little misguided, she said, since it erroneously assumes that technology is capable of the level of autonomy that HAL displayed in the film 2001: A Space Odyssey.
"They think it's going to do their job and put them out of work, not understanding that the whole of this is: let the computer do something that it can do really well, so that we can get you doing something that computers can't do," she said.
IBM has been working on autonomics since 2001. It delivered its architecture for autonomic management in 2003 and updated it in the fall of 2004. Ganek expects it will be updated annually based on input from standards bodies, customers, corporate partners and IBM’s own developers.
"This is as big a challenge as there is in this field," he said.
IBM submitted a Common Base Event (CBE) format to the Organization for the Advancement of Structured Information Standards (OASIS) in 2003. Last year, it released an Autonomic Computing Toolkit that lets developers build autonomic features into their own software. In 2004, IBM also partnered with Fujitsu to back the Web Services Distributed Management Event Format, a standard that provides a way of reporting common events that occur on a network.
It may be some time before such standards are adopted, said Wohl, since they require participation from a broader range of industry players. Several companies such as HP, Sun and Computer Associates are pursuing their own autonomic agendas, but "this stuff is relatively new," she said, and "standards take a while to gel."
Ganek said he’s hopeful, however, and believes that involvement of the academic community will actually help promote the development of autonomics in the private sector.
Institutions like MIT and Carnegie Mellon in the U.S. and the University of Waterloo in Canada are already quite adept with the technology, he said. He noted that there are already 26 international conferences devoted to autonomics.
IBM’s own lab in Toronto has been instrumental in advancing the technology, he said.
The lab created the Generic Log Adapter, which converts existing log files into CBE format, as well as a Log and Trace Analyzer which organizes data into CBE format to help define problems.
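The core job of such an adapter is to turn free-form log lines into structured records in a shared event format, so that tools from different vendors can analyze them together. The sketch below illustrates that idea only; the regular expression and the element names merely approximate the Common Base Event schema and are not the actual Generic Log Adapter API.

```python
import re
import xml.etree.ElementTree as ET

# Illustrative sketch: parse a free-form log line into a structured,
# CBE-style XML event. Element and attribute names here are assumptions
# that only approximate the real Common Base Event schema.

LOG_PATTERN = re.compile(r"(?P<time>\S+) (?P<severity>\w+) (?P<msg>.*)")

def to_common_event(line):
    """Convert one plain log line into a common-format XML event string."""
    m = LOG_PATTERN.match(line)
    event = ET.Element("CommonBaseEvent",
                       creationTime=m.group("time"),
                       severity=m.group("severity"))
    ET.SubElement(event, "msg").text = m.group("msg")
    return ET.tostring(event, encoding="unicode")

print(to_common_event("2005-03-10T09:15:00Z WARN connection pool exhausted"))
```

Once every product's logs are normalized this way, one analyzer (like the lab's Log and Trace Analyzer) can correlate events across a heterogeneous network instead of needing a parser per vendor.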
The centre's development team helped work on IBM's current version of its DB2 database, which is smarter than its predecessors due to some autonomic capabilities. Using spare cycles, DB2 will scour for "stale" statistics and retune the database, said the Toronto lab's senior software engineer Dominique Evans. Another function, called LEO (learning optimizer), speeds up the rate at which database queries are returned and improves their results.
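The stale-statistics idea can be shown with a small toy: a database's query planner relies on statistics about each table, and when enough rows have changed since they were collected, plans degrade; an autonomic database refreshes them in the background before that happens. Everything below is a hypothetical illustration, not DB2's actual policy; the 20% threshold and all names are invented.

```python
# Toy model of background statistics retuning. A table records how many
# rows changed since stats were last gathered; a spare-cycle task refreshes
# statistics only where the churn crosses an assumed 20% threshold.

STALE_FRACTION = 0.20  # hypothetical staleness threshold, for illustration

def stats_are_stale(table):
    """A table's statistics are stale if too much data changed since collection."""
    return table["rows_changed"] / table["row_count"] > STALE_FRACTION

def background_retune(tables):
    """Run during spare cycles: refresh statistics only where needed."""
    refreshed = []
    for t in tables:
        if stats_are_stale(t):
            t["rows_changed"] = 0          # simulate a stats-refresh pass
            refreshed.append(t["name"])
    return refreshed

tables = [
    {"name": "orders",    "row_count": 1000, "rows_changed": 300},
    {"name": "customers", "row_count": 1000, "rows_changed": 50},
]
print(background_retune(tables))  # prints ['orders']: only the churned table
```

The payoff is the one Ganek describes: the tuning decision that an administrator used to schedule by hand happens automatically, and only for the tables that need it.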
Database administrators will be able to spend more time making business decisions and less time worrying about uptime problems, said Ganek.