When ‘glitch’ doesn’t quite do it

What’s your nightmare computer failure scenario? Picture the consequences if a critical application seized up at a hospital, an airport control tower or a nuclear power plant.

At some point in the last 30 years, computer technology crossed the line between a matter of efficiency and a matter of life and death. Yet when software or hardware fails, it’s still a “glitch.”

It’s an attitude Peter Thompson would like to change. “The word ‘glitch’ — it’s kind of ho-hum,” says Thompson, president and CEO of RIS, an applications support and maintenance company. To Thompson, that speaks of a “problem ownership” issue.

If a bridge collapses or a medical patient dies, engineers or doctors as a body take it seriously.

“The whole profession owns the problem,” Thompson says. Compare that to the spate of massively disruptive bank system failures last summer, or Thompson’s favourite example, the Montreal hospital glitch that prevented 14,000 medical test results from reaching the doctors who ordered them.

“It can creep into more and more critical systems, until one day the whole thing does blow up,” he says.

Thompson called for new levels of accountability, responsibility and professionalism in the IT sector in an op-ed piece published in the Regina Leader Post. Specifically, he said in a telephone interview from the U.K. – a stopover customer visit between Romania and Germany – application maintenance and support should be recognized as a separate discipline from development.

We tend to focus on development, says Thompson, lionizing the new applications.

“We’ve forgotten the importance of keeping what we’ve already built running smoothly,” he says.

Thompson proposes maintenance- and support-specific accreditation — education, training and certification specifically aimed at the discipline. At some point, he believes, there will be a system failure with such dire consequences that the government will step in and force regulation. Self-regulation, he argues, is the better option.

For the record, Thompson doesn’t view the IT profession as, well, unprofessional. Critical systems are often doubly or triply backed up. People who work with systems are generally careful. “But there has to be constant vigilance,” says Thompson, especially when making changes to applications. We don’t know that errors are lurking in the code until a user calls to say the system doesn’t work. That’s bad for business. And it could be worse than that.


Dave Webb
Dave Webb is a technology journalist with more than 15 years' experience. He has edited numerous technology publications including Network World Canada, ComputerWorld Canada, Computing Canada and eBusiness Journal. He now runs content development shop Dweeb Media.
