Page 82 - Security Today, March 2019

Facial recognition databases can compromise democracy, feed big-data profiling, or, far worse, simply be wrong. In a study by the ACLU, Amazon’s Rekognition software incorrectly matched 28 members of Congress, identifying them as people previously arrested for a crime. Of those 28 wrongly identified members, 40 percent were people of color.
Solution: Use Physiological Biometrics for Everyday Use Cases to Protect Facial Identity
Facial recognition has the potential to be dangerous. In practice, it can be hacked or spoofed, its databases can be breached or sold, and sometimes it is simply ineffective. We should therefore restrict facial recognition to viable use cases such as airport and border security, where the technology must query databases to ensure that someone boarding a plane is not on a no-fly list. That upholds the level of security we expect when traveling. But security and safety are not synonymous: as a society, we need to define which biometric solution is best suited to each given use case.
When we talk about biometric authentication for the IoT, we need to act safely, as all connected devices are susceptible to online threats. We cannot rely on facial recognition that is easily compromised by a mere picture of the device’s owner or tricked by an adversarial neural net. And what happens if someone steals your facial identity? You can’t simply “cancel” your face the way you would a stolen credit card.
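A face cannot be revoked, but a derived template can. One widely discussed mitigation is “cancelable biometrics”: the raw biometric is never stored, only a keyed transform of it, so a breach is handled by rotating the key. The sketch below illustrates the revocation idea only; the function names are illustrative, not any vendor’s API, and a real scheme would also need to tolerate noisy readings (e.g., via fuzzy extractors) rather than use an exact-match digest.

```python
import hmac
import hashlib
import secrets

def enroll(feature_bytes: bytes) -> tuple[bytes, bytes]:
    """Derive a revocable template from quantized biometric features.

    The raw features are never stored; only the keyed digest is.
    """
    key = secrets.token_bytes(32)  # per-user, revocable secret
    template = hmac.new(key, feature_bytes, hashlib.sha256).digest()
    return key, template

def verify(key: bytes, feature_bytes: bytes, template: bytes) -> bool:
    """Recompute the keyed digest and compare in constant time."""
    candidate = hmac.new(key, feature_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(candidate, template)

# If the template database leaks, issue a new key and re-enroll;
# the old template becomes useless without re-capturing the face.
features = b"quantized-biometric-feature-vector"  # placeholder features
key, tmpl = enroll(features)
assert verify(key, features, tmpl)
new_key, new_tmpl = enroll(features)
assert new_tmpl != tmpl  # same face, freshly revocable template
```

The design point: revocability lives in the key, not the biometric, which is exactly what a bare face image lacks.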
We turn to the brain for answers. According to an article in Fast Company, researchers from Binghamton University combined how the human brain reacts to stimuli with the brain’s unique structure to create a “brain password,” a biometric solution relying on the brain’s “inexhaustible source of secure passwords.” Still in its infancy, the technology currently depends on 32 electrical sensors placed on the head; in the future, those sensors could be built into a headset to compute accurate readings. There are, however, other, less invasive ways to obtain this neural information. We can capture neuro-muscular data with the high-sensitivity kinetic sensors, built on Micro-Electro-Mechanical Systems (MEMS), that are present in standard mobile devices. Extracting this information can yield a stable and unique neural signature with the potential to act as our key to the IoT.
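To make the idea concrete, here is a toy sketch of one way a kinetic signature could be derived from accelerometer data: isolate spectral energy in the physiological micro-tremor band (roughly 8–12 Hz) and compare it against an enrolled template with cosine similarity. Every name, sampling rate, band edge, and threshold below is an assumption for illustration only, not Aerendir’s actual pipeline.

```python
import numpy as np

FS = 100.0  # assumed accelerometer sampling rate, Hz

def tremor_features(accel: np.ndarray) -> np.ndarray:
    """Unit-normalized spectral energy in the 8-12 Hz micro-tremor band."""
    spectrum = np.abs(np.fft.rfft(accel - accel.mean()))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / FS)
    band = spectrum[(freqs >= 8.0) & (freqs <= 12.0)]
    return band / (np.linalg.norm(band) + 1e-12)

def matches(template: np.ndarray, sample: np.ndarray,
            thresh: float = 0.9) -> bool:
    """Cosine similarity between enrolled and live feature vectors."""
    return float(template @ sample) >= thresh

# Synthetic demo: two noisy recordings of the same 10 Hz tremor match,
# while a different tremor frequency does not.
t = np.arange(0, 2, 1 / FS)
rng = np.random.default_rng(0)
rec1 = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
rec2 = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
other = np.sin(2 * np.pi * 9 * t) + 0.1 * rng.standard_normal(t.size)

template = tremor_features(rec1)
assert matches(template, tremor_features(rec2))
```

A production system would need far richer features and liveness checks, but the sketch shows why a noisy physiological signal can still yield a stable, comparable signature.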
At Aerendir, we believe the future of biometrics should be as frictionless as facial recognition but as strong as “brain passwords”; with that in mind, NeuroPrint was born. While our NeuroPrint technology can extract a unique neural signal from any muscle in the body, we started with the hands because of their connection to mobile devices. We are currently focused on adding sensors and microcontrollers into the seat of a car; the possibilities are endless.
The body can truly become our own personal password, our digital identity. The brain provides a solution that authenticates while shielding us from prying actors: if our neural signal is equivalent to a one-million-character password, we can safely encrypt all of our activities and communications, should we choose to do so.
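The “one-million-character password” comparison can be put in bits, a quick back-of-the-envelope check rather than a claim about any product: a uniformly random printable-ASCII character carries about log2(95) ≈ 6.57 bits, so a million of them would carry several million bits of entropy, dwarfing the 256 bits of a modern symmetric key.

```python
import math

# Entropy of a uniformly random printable-ASCII character (95 symbols).
bits_per_char = math.log2(95)            # ~6.57 bits per character
password_bits = 1_000_000 * bits_per_char

print(f"{bits_per_char:.2f} bits/char, {password_bits:,.0f} bits total")
# A 256-bit symmetric key is already beyond brute force, so a signal
# carrying even a tiny fraction of this entropy suffices as a key.
```

The point of the arithmetic is that the bottleneck is not entropy but how reliably and repeatably the signal can be measured.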
Of course, other physiological biometrics could be used, including heartbeat and voice, but using the body’s physiological signals seems to be the most promising and, crucially, the safest way to avoid the associated dangers.
Safety is not security. The IoT needs safety, and safety is, by definition, something we as users should control. The IoT has to be user-centric to be the powerful tool it is bound to be; it should never become the door of a prison, which it could if we allow facial recognition to enter every facet of our lives.
Martin Zizi is the founder and CEO of Aerendir Mobile.
















































































