We’ve all heard horror stories about private hospital records that end up in very public places – such as printed on the reverse side of pamphlets after improper disposal of paper documents.
Today, as more organizations connected to health care make the move to digital records, Canadians have even more potential privacy invasions to worry about.
That shift, along with the recently enacted privacy legislation governing small and medium-sized businesses as well as health-care organizations, is the driving force behind a research project at the University of Ottawa.
Stan Matwin, the professor in the university’s School of Information Technology and Engineering who heads the project, said his group is developing algorithms that use data mining and machine learning to automatically enforce health-care privacy policies in e-mail.
“It is known the electronic flow of documents in other sectors brings about errors and therefore it is quite likely there are situations in which the privacy of patients is clearly jeopardized or even violated because documents are circulated in an uncontrolled way,” explained Matwin. “We will develop something I would call a monitor or filter you could put on a mail server of a hospital or insurance company or doctor’s office that will automatically look at the contents of the e-mail to see if certain privacy constraints are not being respected.”
The most flagrant example might be someone at an insurance company selling a marketing firm a list of the names of all the people it insured for use of a certain drug, such as Prozac. But it could also be something as simple as a patient’s blood test results being sent unencrypted to someone who doesn’t have the right to that information.
The three-year project, which began in October 2003, has received $35,000 in funding from CITO. Matwin is also working with an industrial partner, Ottawa-based Amikanow, which sells e-mail compliance products.
“At the end we should have a prototype that could be rolled over to a provider who could then use it on a pilot basis,” he said. “Then it would have to be enhanced and redesigned to become a deployable system.”
The university will receive any royalties from the sale of the product, he added.
Matwin is working with the University of Ottawa’s law school to build a database of documents in which Canadian privacy rules have been violated. But the team is also interested in creating a product that could apply to the U.S., which is regulated by the Health Insurance Portability and Accountability Act (HIPAA). By creating a database of documents that violate privacy laws, the software will be able to learn from those examples and embed that learning as it deals with new examples, said Matwin.
“In my opinion, if a violation is discovered this document should be flagged and held up for human inspection,” he said. “If it’s flagged it’s for a reason, and therefore some organizational action is due or the system has made a mistake. At that point if you want to let it through you can, but it will also come back to the system and the system will learn from that. It’s a similar idea to a spam filter but the underlying concept is more subtle.”
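Matwin’s spam-filter comparison suggests a familiar text-classification loop: train on labelled example documents, flag suspicious messages for a human reviewer rather than blocking them, and fold the reviewer’s decision back into the model as new training data. The sketch below is a toy illustration of that loop, not the project’s actual system; the class name, scoring method (a simple naive-Bayes-style log-likelihood ratio), and all training phrases are invented for illustration.

```python
import math
from collections import Counter

class PrivacyFilter:
    """Toy illustration of a flag-for-review e-mail filter that
    learns from labelled examples and from reviewer feedback."""

    def __init__(self):
        self.counts = {"violation": Counter(), "ok": Counter()}
        self.totals = {"violation": 0, "ok": 0}

    def train(self, text, label):
        # Add a labelled example document to the model.
        words = text.lower().split()
        self.counts[label].update(words)
        self.totals[label] += len(words)

    def score(self, text):
        # Log-likelihood ratio with add-one smoothing; positive
        # means the text looks more like the violation examples.
        vocab = set(self.counts["violation"]) | set(self.counts["ok"])
        llr = 0.0
        for w in text.lower().split():
            p_v = (self.counts["violation"][w] + 1) / (self.totals["violation"] + len(vocab) + 1)
            p_o = (self.counts["ok"][w] + 1) / (self.totals["ok"] + len(vocab) + 1)
            llr += math.log(p_v / p_o)
        return llr

    def check(self, text):
        # Flag for human inspection rather than blocking outright.
        return "flag" if self.score(text) > 0 else "pass"

    def feedback(self, text, human_label):
        # A reviewer's decision becomes a new training example.
        self.train(text, human_label)

# Demo with invented messages:
f = PrivacyFilter()
f.train("patient blood test results attached unencrypted", "violation")
f.train("meeting agenda for thursday", "ok")
print(f.check("blood test results for patient"))  # flag
print(f.check("meeting agenda for thursday"))     # pass
```

A real deployment would, as the article notes, need far richer language analysis than word counts — this is only the feedback-loop skeleton.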
Amikanow has worked with the University of Ottawa since the company’s inception about five years ago, said president Suhayya Abu-Hakima. And while it has fairly well-developed technology in the areas of text analysis and automatic classification of e-mail, “there’s still a ton of work to be done,” she said. “It’s a really huge problem.”
For example, she said, current technology can’t tell if a reference to chicken breasts in an e-mail is medical or pornographic, or if it’s related to humans or animals.
Amikanow’s approach differs from others that require digital rights management technology or secure networks on both ends.
“Secure networks will say if you have a secure e-mail server and you need to talk to someone over e-mail, they’re going to have to have the same secure e-mail server to decrypt what you send them, or at least get a key to decrypt it,” she said. “The beauty of Amikanow’s approach is that anybody can talk to anyone else.”
According to Matwin, because the IT industry created the technology that makes it so easy to violate personal privacy, it is responsible for coming up with a solution to protect it.
“It’s something we as a field should do.”