It seems like every sci-fi or political drama set in the future features seamless, uncannily accurate facial recognition. It leaves us wondering: what's taking so long? We all have iPhones with decent cameras; isn't accuracy just a matter of image clarity?
A new breakthrough may finally bring us closer to that future, thanks to a technique for handling the variations in picture quality typical of selfies.
While fairly accurate facial recognition software already exists, it has always required perfectly centered, evenly lit subjects; any slight deviation could render it useless.
Recent work in computer science aims to solve these issues with a new algorithm that detects faces in real time, picking out their shape and unique features.
A precursor to this technology already exists in standard cameras today, but it only works when the person faces the lens directly, not when the head is tilted or turned away.
Scientists at Yahoo Labs have built on this approach, training the algorithm to handle faces at different angles and orientations using a database of 20 million images without faces and hundreds of thousands that contain them. The result is a new algorithm called the Deep Dense Face Detector.
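To make the mechanics concrete, a "dense" detector can be thought of as a multi-scale sliding-window scan: the image is examined at several sizes, a fixed window slides across each, and a classifier scores every window. The sketch below is a toy illustration of that scan, not Yahoo's actual system; `looks_like_face` is a hypothetical stand-in for the learned classifier, which in the real detector is a deep convolutional network trained on those millions of example images.

```python
# Toy sketch of multi-scale sliding-window detection.
# `looks_like_face` is a hypothetical stand-in for a trained
# classifier; a real detector learns this from labeled data.

def looks_like_face(patch):
    # Toy heuristic: accept a patch whose mean brightness falls in a
    # mid-range band. A real classifier is far more discriminating.
    flat = [px for row in patch for px in row]
    mean = sum(flat) / len(flat)
    return 0.4 < mean < 0.8

def downscale(image, factor):
    # Naive nearest-neighbor downscaling by an integer factor.
    return [row[::factor] for row in image[::factor]]

def detect_faces(image, window=4, scales=(1, 2)):
    """Slide a fixed-size window over the image at several scales and
    return (x, y, scale) for every window the classifier accepts."""
    hits = []
    for scale in scales:
        scaled = downscale(image, scale)
        h, w = len(scaled), len(scaled[0])
        for y in range(0, h - window + 1):
            for x in range(0, w - window + 1):
                patch = [row[x:x + window] for row in scaled[y:y + window]]
                if looks_like_face(patch):
                    # Map window coordinates back to the original image.
                    hits.append((x * scale, y * scale, scale))
    return hits

# Tiny synthetic "image": dark background with one mid-brightness square.
img = [[0.1] * 12 for _ in range(12)]
for y in range(4, 8):
    for x in range(4, 8):
        img[y][x] = 0.6

print(detect_faces(img))
```

Scanning every window at every scale is what makes such detectors expensive, and why running them in real time on phone-quality images is a genuine engineering feat.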
This new face recognition software will affect many corners of the tech industry. Proposed uses include a working facial recognition "key" to unlock phones and computers, and the ability to search for pictures using other pictures.
Improved search accuracy is a double-edged sword, though: alongside its benefits and diverse uses, facial recognition software carries a host of privacy issues, and it often meets resistance from people trying to avoid unwanted contact with strangers or advertisers on social media.
As with any other digital tool, it can and will be used for both good and ill. It is our job, as citizens, to be educated about its uses and about what we can do to limit the potential for abuse.