ITBusiness.ca

Ontario has new method to keep problem gamblers out of gaming sites

It’s a strange task that Tom Marinelli has been handed – trying to keep more than 10,000 loyal customers out of 30 sites spread across Ontario.

But that’s what the CIO of the Ontario Lottery and Gaming Corp. (OLG) has been asked to do.

The OLG has long offered a self-exclusion list as an aid to problem gamblers who are trying to quit their expensive habit. But it never had a fool-proof way to make sure those who placed themselves on the list were denied entry to gaming sites.

Casinos and slot sites used to rely on binders filled with photographs and the memory of security guards to weed out those who were self-excluded. But guards found it impossible to memorize the 10,000 photos crammed into 22 binders.

The answer to this conundrum was to tap into a facial recognition system the OLG already used to catch cheaters. But widening the scope of the technology isn’t easy, Marinelli says.

“Our challenge was to strengthen the existing responsible gaming initiative, using facial recognition to [prevent] self-excluded patrons from entering casinos,” the CIO says. “Meanwhile, we had to ensure the privacy and security of all patrons.”

Marinelli described the OLG’s plan to adapt facial recognition technology at a Jan. 28 conference hosted by Ontario Privacy Commissioner Ann Cavoukian. The conference theme focused on technologies that were built with privacy as a priority.

OLG was motivated to find a better way to keep self-excluded gamblers away from its locations. Despite having members on the list sign a statement releasing the OLG from any liability, the corporation was sued by nine different people over a decade, according to media reports.

OLG settled all the cases for a total of $1.5 million.

To build in a system that would protect the privacy of customers subject to the facial recognition system, Marinelli called upon the University of Toronto. A research and development phase began in November 2007 and wrapped up in August 2008.

“We were asked to examine if it is possible to use a privacy enhancing solution such as the celebrated Biometric Encryption Method in conjunction with commercial face recognition tools,” says Kostas Plataniotis, a professor of electrical and computer engineering at the University of Toronto. “In our case, the personal information is considered to be the facial image itself.”

The solution is a clever system that makes use of cross-referenced databases that are essentially unlocked by the image of a self-excluded patron’s face. Self-excluded patrons will register themselves with the system by having their image taken.

From the image of their face, a biometric template and a unique identification number are generated and kept in a facial recognition database, Marinelli explains. This is just a numerical reference that doesn’t identify the person in any way.

Next, another numerical key is generated and used as the reference to the self-exclusion database, where all of the subject’s personal information is stored. This includes their name and address. This numerical key is bound together with the unique identification number based on the facial template and stored in a new helper database. This ensures there is no direct relationship between the subject’s biometric template and their personal information.

So when that self-excluded patron gives in to temptation and walks into an OLG site, the camera captures their face. The scan generates the same unique identification number, which acts as an unlocking mechanism that allows the system to retrieve the personal information as well.
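The enrollment and lookup flow described above can be sketched in a few lines. Everything here is an illustrative assumption – the database names, the helper functions, and especially the exact-match hashing, which stands in for the fuzzy template matching and cryptographic key binding a real biometric encryption system performs:

```python
import hashlib
import secrets

# Hypothetical three-database layout, per the article's description.
facial_db = {}      # template_id -> biometric template (no personal info)
helper_db = {}      # template_id -> record key (binds the two databases)
exclusion_db = {}   # record key -> the patron's personal information

def enroll(template: bytes, name: str, address: str) -> None:
    """Register a self-excluded patron."""
    # A numerical reference derived from the facial template;
    # by itself it identifies no one.
    template_id = hashlib.sha256(template).hexdigest()
    facial_db[template_id] = template

    # A separate random key references the personal record.
    record_key = secrets.token_hex(16)
    exclusion_db[record_key] = {"name": name, "address": address}

    # The helper database binds the two, so there is no direct
    # relationship between the template and the personal information.
    helper_db[template_id] = record_key

def lookup(template: bytes):
    """Scan a face at the door; return a record only on a correct match."""
    template_id = hashlib.sha256(template).hexdigest()
    record_key = helper_db.get(template_id)
    if record_key is None:
        return None  # not on the list: the system "forgets" the scan
    return exclusion_db[record_key]
```

The point of the indirection is that a breach of the facial recognition database alone yields only anonymous templates, while the exclusion database alone yields records with no biometric link; only a correct match releases the binding key.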

“The subject’s personal information is only released when a correct biometric match is made,” Marinelli says. “Remember, all of this is happening in real time.”

If a patron’s face is scanned and they are not on the self-excluded list, the system will “forget” the information immediately.

Researchers developing the technology took false positives into consideration – when a patron is accidentally identified as someone else on the self-excluded list, Plataniotis says.

“Measuring across a set of subjects enrolled in the system, we determined the fraction which were incorrectly identified and thus rejected,” he says. “Yes, there are always problems, the issue is to keep errors to a minimum.”

According to Marinelli, while it’s not impossible that an incorrect identification would be made, you wouldn’t want to bet on the odds that it will happen to you.

The OLG has issued a request for proposals and is waiting for a vendor to integrate the new technology into a facial recognition system. It has already begun to deploy the detection equipment at some locations and has opened a lab for testing purposes.

Soon enough, Marinelli hopes the facial recognition technology will help him with his strange task – refusing entry to some of his best customers.
