The use of facial recognition is not something that comes only from totalitarian regimes. It’s being used for domestic spying in malls and casinos. Combined with AI, biometrics can turn PI (personal information) and PII (personally identifiable information) into a weapon. I bring this up every semester to make sure my students are aware of what they are opening themselves up to when they share information, even on benign sites or “trusted” browsers.
Biometrics involve “biological measurements” such as fingerprints, facial features, and retina scans. The Department of Homeland Security explains that “biometrics are used to detect and prevent illegal entry into the U.S., grant and administer proper immigration benefits, vetting and credentialing, facilitating legitimate travel and trade, enforcing federal laws, and enabling verification for visa applications to the U.S.” You would think biometrics is something average citizens need to worry about only if they own a passport (the new ones have an embedded chip with biometric markers) or a smartphone with facial recognition.
But biometric detection is coming closer to us than we realize. Kaspersky, the cybersecurity software company, explains how “Researchers at the University of North Carolina at Chapel Hill downloaded photos of 20 volunteers from social media and used them to construct 3-D models of their faces. The researchers successfully breached four of the five security systems they tested.” Rental cars may soon come with biometric analyzers. Cities may use facial recognition without our knowledge as a pre-emptive way to assist law enforcement.
More alarming is the use of ‘public domain’ images to fuel the facial recognition business. The New York Times reports (https://www.nytimes.com/interactive/2019/10/11/technology/flickr-facial-recognition.html) that family photos scraped from Flickr have been used to power surveillance technology. Hiding our faces, or making sure our children’s faces don’t show up on unscrupulous social media sites such as Instagram and Facebook, may become a necessity. Or is it too late for those who have uploaded hundreds of photos to these leaky sites? As I have warned many times here and elsewhere, these sites are “free” for a reason – they trade the data and metadata of these posts and pictures without your knowledge. It is a form of digital human trafficking, in which many of us have become unwilling accomplices.
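To make the metadata point concrete for my students, here is a minimal Python sketch, assuming the Pillow imaging library is installed and using a hypothetical file name (family_photo.jpg), that lists the EXIF data a typical photo carries before it is ever uploaded.

# Minimal sketch: inspect the EXIF metadata embedded in a photo before uploading it.
# Assumes the Pillow library is installed (pip install Pillow); the file name is hypothetical.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

img = Image.open("family_photo.jpg")
exif = img.getexif()

# Basic EXIF tags: camera make and model, timestamp, editing software, and so on.
for tag_id, value in exif.items():
    print(f"{TAGS.get(tag_id, tag_id)}: {value}")

# GPS coordinates, if the camera recorded them, sit in a nested GPS directory (tag 0x8825).
gps = exif.get_ifd(0x8825)
for tag_id, value in gps.items():
    print(f"{GPSTAGS.get(tag_id, tag_id)}: {value}")

Run against a typical phone photo, a script like this can reveal the device used, the exact time of the shot, and often its latitude and longitude – the kind of metadata these “free” platforms can harvest alongside the image itself.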
An interesting controversy: Tiffany & Co. had to withdraw an ad that showed a model covering her right eye. Why? It was accused of imitating the symbol of the pro-democracy movement in Hong Kong, where protestors routinely cover their faces or eyes with a mask or helmet to avoid facial recognition cameras. In fact, the mock eye patch has itself become a symbol of the protest.