Cleaning Out IT’s Closet

An old axiom states that technology is neither good nor bad; it's how you apply it that makes the difference. That theory has been put to the test numerous times in the 30 years Computing Canada has been publishing. Technology itself may or may not be neutral, but the industry that produces it is like all others: It's prone to temptations, arrogance and mistakes despite, and sometimes because of, its best intentions.
We’ve spotlighted some of the more recent scandals that have plagued the industry and asked two experts from IDC Canada Ltd. — Michael O’Neil, who at press time was managing director, and Sebastien Ruest, vice-president of services research — to outline what led to them, how they might have been prevented and what, if anything, history has taught us.
If this list is any indication, history tends to repeat itself. We begin with the beating heart of the computer industry: the microprocessor.

INTEL RELEASES FLAWED CHIP
In 1993, Intel Corp. released a chip called the Pentium, selling roughly two million units. The following year it was discovered that the chips had a floating point division flaw that produced a minuscule error in results.
The first person outside of the Intel organization to discover the flaw was Thomas R. Nicely, a mathematics professor at Lynchburg College in Lynchburg, Va. In October 1994, he contacted Intel with this knowledge, but the company said that his was the first complaint it had registered.
“It appears that there is a bug in the floating point unit (numeric co-processor) of many, and perhaps all, Pentium processors,” wrote Nicely in an e-mail to Intel. “In short, the Pentium FPU is returning erroneous values for certain division operations.”
News of the problem spread across the Internet and more division flaws were discovered. Intel initially denied the problem, even though the company reportedly had knowledge of it as early as June and already had corrected chips in production.
By December 1994, Intel admitted the mistake and offered to replace the faulty chips on request.
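For readers wondering how so small a flaw could be detected at all, the most widely circulated demonstration was a single division that a correct FPU gets exactly right. Below is a minimal sketch in Python, not from the original article; the operands are the well-known test pair circulated at the time, and on any unaffected machine the residual should be exactly zero.

```python
# Classic Pentium FDIV check: on a correct FPU the residual below is
# exactly 0.0, because 4195835 is exactly representable and the rounded
# quotient times 3145727 rounds back to it. On a flawed Pentium the
# quotient was wrong around the fifth significant digit, leaving 256.
a = 4195835.0
b = 3145727.0

residual = a - (a / b) * b
print("residual:", residual)
print("FDIV bug suspected" if residual != 0.0 else "FPU division looks correct")
```

Variations of this check, often typed straight into a spreadsheet, were reportedly how many users confirmed Nicely's finding for themselves.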
In 1999, a flaw was found in the Pentium III 820 chipset which resulted in intermittent crashes and reboots. Intel immediately offered to replace the problem chips.

Michael O’Neil: I think communications is always important. I don’t want to point the finger at Intel specifically, but customers appreciate it when suppliers are forthright and communicate effectively. You probably get more latitude in the IT industry than you do in many other industries for product issues — “undocumented features” as they’re sometimes called.
It may have led to a little bit of a culture of cynicism.

Sebastien Ruest: I think the issue, too, is that it was before the early days of beta testers. In the early days, especially when you go back to Intel, they were the pioneers of the area. I think they weren’t able to anticipate all the unknown features. I don’t think the timeline for (length of) beta testing has changed much, they just have a larger population to do the testing.

TORONTO LEASING DEBACLE
In 1999, the City of Toronto looked to replace a raft of computers in preparation for Y2K. It released a request for quotations in May to solicit bids for computer leasing.
City council initially approved a budget to lease $43 million in computers and equipment, and a bid from local contractor MFP Financial Services was approved.
The council authorized the city to enter into a three-year contract with the company. With two exceptions, the initial equipment schedules were for five years. By July, the city had spent as much as $85 million in computer leases. In December 1999, the city procured 10,000 Oracle database licences which were also arranged through MFP.
In 2002, the Ontario Provincial Police determined that there was no criminal wrongdoing at the heart of a contract whose cost had nearly doubled beyond what council initially approved.
A Superior Court justice was appointed commissioner to oversee what became known as the Toronto Computer Leasing Inquiry.
Toronto mayor David Miller, then a city councillor, pushed for an investigation and told the inquiry: “Because of the year 2000, we had given some unusual authority to staff to undertake computer and software purchases. So, they were able to go outside the normal purchasing roles.
“I wouldn’t say it was a lack of due diligence. It was a political oversight issue. So I guess from that perspective, the first thing I want to see out of the inquiry is the truth.”
Numerous participants were called to the hearing, including MFP president Peter Wolfraim, who laid the blame at the city’s feet; MFP sales rep Dash Domi, the brother of NHL player Tie Domi; and then-Toronto mayor Mel Lastman.
The inquiry finally wrapped up in February and the MFP agreement was terminated the following month.

MO: I don’t really consider that an IT problem. That’s a government contracting problem. It was a great story, though. It involved Tie Domi’s brother and the end of the career of a prominent politician.
To a greater or lesser extent, you find problems in public procurement. This particular one happened to be IT-related.
Over time, some of the buyers and sellers have a better experience base to be able to draw from and start to be able to create better contracts. The early experiences were certainly not universally positive.
The same thing with desktop leasing, particularly with an evergreening component.
If you didn’t manage to predict with a fair amount of precision what was going to happen with a technology, you could end up with some unreasonable bills.

SR: There’s been a market created for third-party advisors that become intermediaries between the government and the IT (vendors). Over the years, these companies have served to educate the government about proper IT practices.
That’s why now you see the advent of public-private partnerships that are occurring where the government feels it has enough experience to be able to fly on its own a little bit.
The other aspect you can look at is how the government operates.
They set ceilings on how much you should spend for particular products or services. I think in some cases individuals probably take advantage of that fact. I think with the City of Toronto story, it sort of careened out of control.

HARD DRIVES GO MISSING
In January 2003, Guelph, Ont.-based Co-operators Life Insurance Co. sent out a letter to 180,000 of its customers, informing them that some of their personal data had gone missing and was potentially at risk. The information was stored on a drive that was housed in a data centre operated by Information Systems Management Canada, a subsidiary of IBM. The disk contained social insurance numbers, banking authorization information, credit card numbers and dates of birth.
At the time, Co-operators said it didn’t know what had happened to the disk, only that it was missing and its loss was being treated as a theft. The company provided a 1-800 number so customers could call in and find out if their data was at risk.
In an interview with Computing Canada after the announcement, IBM Canada’s application management services executive Brent Cameron said, “In that instance, there was a former employee that had kind of misappropriated the hardware. The police investigated and determined that nothing actually happened to the data.”
That same year, a server was stolen from a regional CCRA office in Quebec, and a server that once belonged to the Bank of Montreal was up for auction on eBay.

MO: That’s a huge issue. Of all the disasters (on this list), that kind of disaster is the most visceral threat. Your identity is increasingly coupled with data elements about you. That data can be sold and end up interfering with your life in a way that only serious illness used to be able to do. The thing with the overall IT industry is that its continued evolution requires a high degree of trust.
It’s not going to go away. I think it’s more or less a permanent condition. The potential is there for it to get worse. Hopefully, it isn’t going to be a two-horse race (between vendors and hackers). The users themselves are going to take reasonably commonsensical steps to protect their own information.
The best defence from a vendor needs to be complemented by appropriate action on the part of the user.

SR: I was with IBM when the events went down. It was a terrible situation. You can almost blame micro technology for that one. Back in the days of Amdahl when the hard drives were the size of a football field, I don’t think anybody would have run away with a drive. Like Michael said, there’s also the issue of trust. Technology makes everything accessible.

NORTEL INVESTIGATED
Nortel began its fall from grace years earlier, but its nadir came in 2004 when it was formally investigated by the U.S. Securities and Exchange Commission, the Ontario Securities Commission and the RCMP for its accounting practices.
The Brampton, Ont.-based company, once the jewel in the crown of Canada’s high-tech sector, was taken to task for misstating earnings. The company revised its books several times. Its 2003 profits, for example, shrank from an initially stated US$732 million to US$434 million.
Nortel fired CEO Frank Dunn and two other senior executives in 2004 and sent others on a paid leave of absence once an internal investigation revealed the accounting irregularities. William Owens took over the top spot and Greg Mumford joined as chief technology officer. In June 2004, Owens told Computing Canada that Nortel had assigned 650 people and an external auditor to clean up the books. “This too will pass,” he said. “There will be closure.”
Prior to its accounting woes, Nortel had endured a protracted period of financial difficulty after the tech bubble burst, laying off tens of thousands of employees. Its stock hit a low point of less than a dollar in late 2002. At its peak in 2000, it had traded at more than $100.

MO: It was a horrible issue for a lot of people in Canada. A lot of people are working today because their RRSPs were affected. Led really by Microsoft and the huge success it had making multi-millionaires, and in a few cases billionaires, through the ’80s and ’90s, more and more companies started shifting the burden of compensation from their companies to the public markets. It was, in effect: You want to make a dollar, I’ll pay you 50 cents of that dollar and the public markets will make up the difference. That artificially inflated an entire industry. It allowed companies to hire twice as many people as they would otherwise have been able to hire. Or twice as many good people as they could otherwise have afforded.
It created the stock bubble that you saw. It led to a huge detrimental issue for the IT industry. It led to decisions being made not in terms of the technology or what was sustainable for the company, but what was right for the quarterly earnings report and the near-term stock price.

SR: I think maybe it went from adolescence to adulthood a little too fast without understanding what the guidelines are. I think the unfortunate aspect is that it was so Canadian. It sort of shattered the clean-cut view that the world had of Canadians.
