Canadian mall caught collecting facial recognition data on the sly

Originally published at:


Surely any such software must have such poor accuracy as to be practically worthless? Not that the vendors will tell you that.

The nagging fear is that “fake it till you make it” may actually be solid advice for anything you can dystopia up right good, so long as the VCs don’t lose faith too early.


And at the same time we’re supposed to look forward with breathless anticipation to the day when our biometrics will serve as our password.




Hold a tablet up to the camera with tRump’s image on it.


For those interested, here’s the Reddit post mentioned in the article:

The comments discuss ways of thwarting facial recognition.


If it’s going on there, it’s going on in a lot more malls. Unless, of course, we imagine they developed their own facial recognition software and deployed it themselves, the only other rational explanation is that someone is marketing and selling these systems to malls and other businesses.


I go to that mall pretty regularly, but I haven’t used the directory kiosk since they say they installed that software. I was there last week to see Sorry to Bother You.

A bit of old news, really. I did my master’s thesis on this type of software over a year ago. Affective recognition software has been in trials for years and is improving rapidly. Check out the work of Dr. Rosalind Picard on affective computing at MIT’s Media Lab.

I agree that the legal aspects still need a lot of work, but that’s true for just about all new technology. Let’s face it: when you’re out in public there is no reasonable expectation of privacy, and there are plenty of nefarious uses of biometric information, such as government CCTV monitoring. From a merchant standpoint, this type of software can be really useful for tailoring services and gathering demographic data for marketing purposes.

Disclosure should be a given, however, along with the ability to opt out. That’s the greasy part, IMHO.

It’s another fine example of technology outpacing the laws designed to keep our private lives private.

I know the idea that laws can’t keep up with tech is basically gospel in some circles, but I really don’t think that’s the case here. Alberta’s privacy legislation, like a lot of Canadian privacy legislation, regulates the use of “personal information”, defined as “information about an identifiable individual”. That means information that CAN’T identify a specific individual isn’t caught by the legislation. Approximate age and gender on its own doesn’t identify a specific individual - “man in his mid-30s” describes me and about a hundred other people within my office alone. Yes, you can argue that this could identify someone if you combined it with other information, but as long as CF can plausibly argue that they can’t and don’t do that, they can collect the information with impunity.

Saying technology is outpacing the law lets the politicians who vote to enact legislation like this off the hook. It suggests they just didn’t see this coming. That’s not the case. They made a deliberate choice to permit exactly this kind of activity.

I recognize the software from the image. Microsoft has demoed this system for a few years, looking for a problem it might solve. This isn’t “facial recognition”, it’s “emotion recognition”. The software continually returns a score based on whether the user is probably happy, baffled, or angry. Did she get frustrated after tapping on the screen for 30 seconds because she couldn’t spell Szjoberg’s Bridal? Did he get happy when the kiosk showed him the way to the yoghurt shop?

Facial recognition is all about identifying a person, and as such would fall under data privacy laws. Emotion recognition, as long as they only keep the output (scores) and not input (images), does not.
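The distinction matters in practice: the privacy argument hinges on the pipeline retaining only the derived scores and discarding each frame immediately. A minimal sketch of that score-only design, where `score_emotions` is a hypothetical stand-in for whatever vendor model runs on the kiosk (not a real API):

```python
from dataclasses import dataclass

@dataclass
class EmotionScores:
    # The three states mentioned above: happy, baffled, angry.
    happy: float
    baffled: float
    angry: float

def score_emotions(frame: bytes) -> EmotionScores:
    # Hypothetical placeholder for the vendor's on-device classifier.
    # A real system would run a model over the frame; here we just
    # return fixed scores to illustrate the data flow.
    return EmotionScores(happy=0.7, baffled=0.2, angry=0.1)

def process_frame(frame: bytes, log: list) -> None:
    scores = score_emotions(frame)
    log.append(scores)  # only the derived scores are retained...
    del frame           # ...and our reference to the image is dropped at once

log = []
process_frame(b"<camera frame>", log)
print(log[0])  # the log holds scores only; no image data is ever stored
```

Whether this keeps the system outside data-privacy law of course depends on the vendor actually never persisting the input frames, which is exactly the part an outside observer can’t verify.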

They are probably testing the effectiveness of the kiosk software with it.

The ACLU needs to update their video.

This topic was automatically closed after 5 days. New replies are no longer allowed.