Y2K: The Disaster that Wasn’t

To say Mark Brown was worried would be understating the situation.
It was June 1998, and the biomedical engineering staff member at the Sudbury, Ont., regional hospital had just come out of a seminar on how to attack the Year 2000 problem. “I knew we were behind,” he confessed to Computing Canada. “I didn’t know how far.”
Brown wasn’t alone. A lot of Canadian organizations faced the music late.
Yet what began as a tremor and grew to a roar ended with silence.
The Y2K bug, which was sometimes used to terrorize people (and certainly frightened many in the IT community), was one of those rare triumphs in the Canadian industry: a serious global technology problem was recognized, faced and conquered.
There apparently were no deaths, no large-scale bankruptcies, and no serious disruptions in this country.
Systems were inventoried, code scrutinized and contingency plans prepared.
Programmers and project managers shone. Experts spread their knowledge around the world. In some ways it was their finest hour.
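For readers who never saw the code, a brief technical aside (an illustration of my own, not drawn from the article): the bug stemmed from programs storing years as two digits to save memory, so the rollover from “99” to “00” could be read as 1900 rather than 2000. One widely used remediation was “windowing,” which maps two-digit years onto a century around a pivot value. A minimal sketch in Python:

```python
# A minimal illustration of the Y2K two-digit-year bug and one common fix.
# The function names and the pivot value of 50 are illustrative choices,
# not taken from the article.

def years_elapsed_naive(start_yy: int, end_yy: int) -> int:
    """Subtract two-digit years directly -- the classic Y2K bug."""
    return end_yy - start_yy

def years_elapsed_windowed(start_yy: int, end_yy: int, pivot: int = 50) -> int:
    """'Windowing' remediation: two-digit years below the pivot are
    treated as 20xx, the rest as 19xx, before subtracting."""
    def expand(yy: int) -> int:
        return 2000 + yy if yy < pivot else 1900 + yy
    return expand(end_yy) - expand(start_yy)

# An account opened in '95 and checked in '00:
print(years_elapsed_naive(95, 0))     # -95: the bug in action
print(years_elapsed_windowed(95, 0))  # 5: correct span across the rollover
```

Windowing was a stopgap rather than a cure, which is partly why remediation work continued after the deadline itself.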

Admittedly, lots of work went on up to Dec. 31, 1999 (and some afterwards).
Many systems couldn’t be fixed until software suppliers shipped upgrades, which pushed projects past their deadlines.
Yet in May 1999, Jennifer McNeill, chair of the Western Canada Y2K User Group, told Computing Canada that some companies were still in denial.
“I have CEOs of oil and gas companies who tell me they’re not going to fix anything,” she said seven months before the millennium. “They’re going to see what happens.”
While the IT community knew the threat was real, executives didn’t always get the message. Sometimes they had to be squeezed.
Frank Farrell, an Ottawa IT consultant to private and public organizations, recalls that his most persuasive weapon in getting C-level suits to take Y2K seriously was pointing out the potential legal liabilities they faced if people or property were harmed as a result of computer failure.

“It was kind of a desperate move,” he said in a recent interview, “but it had to be done.”
Among the organizations that did some squeezing was Task Force 2000, a 14-person private sector committee of blue-chip executives chaired by Jean Monty of BCE Inc.
It pushed, prodded and bullied companies to look at their systems and prepare contingency plans.
Among the members was Gaylen Duncan, then president of the Information Technology Association of Canada.
No one, including Duncan, realized how long the fix would take or what it would cost.
“Everyone was saying it was a quick-fix thing,” he said.

But ITAC was among those who soon saw that it was more serious than first thought.
The spring of 1998 was a turning point. Task Force 2000 issued a vigorous report urging business to act.
Then in April Grant Westcott, Justice Department CIO, and Hy Braiter, senior assistant deputy minister at Human Resources, issued a scathing report on Ottawa’s lack of readiness that made headlines.

“That was perhaps the smartest thing that happened in Ottawa,” said Duncan. “That was brilliant work. It gave a level of credibility to the issue, to the people who were actually getting things done, and it surfaced a whole bunch of departments that were in trouble.”
Eleven thousand IT workers — many of whom were outside programmers and managers from giants such as IBM Global Services and EDS Canada — ended up toiling on federal systems alone.
In part, the task force’s job was to be “the voice of reason to the media,” said Duncan.
That’s because there was no shortage of people willing to be quoted — even so-called knowledgeable people, such as Karl Feilder of a British company that made a Y2K desktop PC fix.
“There will be some businesses that go broke, people will be made redundant and inevitably some people will die,” he told Computing Canada, not from the bug itself, but from a lack of contingency plans for the inevitable failures.

It was what might be called the “common sense” theory of Y2K: no software is bug-free, no IT project is ever on time, therefore some applications are bound to crash.

Which brings us to Brampton, Ont., IT consultant Peter de Jager, who began sounding the Y2K alert in the early 1990s.
“He performed a valuable service,” said Duncan, crediting de Jager with debunking many myths.
Curiously, de Jager didn’t want to take a bow. He politely but firmly refused to be interviewed for this article.
Perhaps he’s gun-shy: Over the years he and others like Feilder were criticized for being fear-mongers.
IT pros such as Farrell and Duncan are also sensitive to the fact that the industry was mocked by the public when nothing chaotic happened on Jan. 1, 2000.
But they and others know the silence of that day came because the industry faced the problem.
Y2K has left a mixed legacy: asset management software, a body of software-testing expertise, improved project management skills and corporate governance regulations.
Unfortunately, better-written software isn’t one of them. But that’s another story.

Howard Solomon
Currently a freelance writer. Former editor of ITWorldCanada.com and Computing Canada. An IT journalist since 1997, Howard has written for several of ITWC's sister publications, including ITBusiness.ca. Before arriving at ITWC he served as a staff reporter at the Calgary Herald and the Brampton (Ont.) Daily Times.