
How facial authentication promises the automated future techies seek without compromising user privacy


Facial recognition has long been a staple of science fiction, from George Orwell’s novel 1984, where “telescreens” follow a person’s every move, to, well, pretty much every Black Mirror episode. The prospect of living under constant surveillance has become frightening to those of us who cherish our privacy and freedom.

Whether it’s a company scraping billions of publicly available photos from social media for law enforcement to use in identifying potential suspects, or an authoritarian government using AI-based recognition tools to monitor its citizens, it’s not hard to understand why legacy facial recognition technology has been controversial.

Fortunately, new technology promises to help us move beyond the challenges of legacy facial recognition, with facial authentication emerging as the preferred pathway for those who prioritize privacy.

Facial recognition software applies algorithmic pattern matching to identify an individual’s face in a digital image or video. Organizations can use it for a variety of purposes, such as verifying a person’s identity for security reasons or tracking individuals as they move through a public space.

By contrast, facial authentication uses the unique characteristics of an individual’s face to verify their identity at access points or for certain transactions. An algorithm compares a live image of the user’s face to enrolled reference data to confirm that they are who they claim to be and that they are authorized to access a designated space, device or piece of information, such as their iPhone. Unlike facial recognition, which often takes place without an individual’s knowledge or permission, facial authentication requires the user’s consent.
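To make that comparison step concrete, here is a minimal Python sketch. It assumes the system has already reduced each face capture to a numeric embedding vector; the function name, the cosine-similarity measure and the 0.6 threshold are illustrative assumptions rather than any particular vendor’s implementation.

```python
import numpy as np

def is_match(live_embedding: np.ndarray,
             reference_embedding: np.ndarray,
             threshold: float = 0.6) -> bool:
    """Return True if the live capture is close enough to the enrolled
    reference template to treat the person as the consenting, enrolled user."""
    cos_sim = float(np.dot(live_embedding, reference_embedding) /
                    (np.linalg.norm(live_embedding) * np.linalg.norm(reference_embedding)))
    return cos_sim >= threshold

# Toy usage with made-up vectors standing in for a face-embedding model's output.
reference = np.array([0.12, 0.98, 0.05, 0.40])
live = np.array([0.10, 0.95, 0.07, 0.42])
print(is_match(live, reference))  # True: same person, minor capture-to-capture variation
```

Real systems add liveness checks and tune the threshold to balance false accepts against false rejects, but the core decision is this one-to-one comparison against a single enrolled reference, not a search across a database of faces.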

Apple’s Face ID has become the most widely known use of facial authentication, familiar to anyone with a recent iPhone. It’s fast, frictionless and accurate, and it protects user privacy by encrypting the user’s facial data directly on the phone’s chip, so in the event that a device becomes lost or stolen, no personally identifiable information can be revealed.

While both facial recognition and facial authentication use the human face as their basis, they serve entirely different purposes and are applied in very different ways.

Retire the keycard

While biometric factors like facial authentication have become commonplace for modern smartphone users, most physical spaces rely on some combination of older technologies, such as encoded badge cards or knowledge-based PIN pads, to identify and authorize users. Such credentials can easily get lost, stolen, copied, or even spoofed – in fact, anyone can buy the technology required to spoof a low-frequency proximity card on Amazon for less than $20.

In the catalog of social engineering techniques, the simplest yet most effective method of getting past card readers or PIN pads is "tailgating" or "piggybacking," which, as the name implies, entails following an authorized person through a secured door or entryway without using proper access credentials.

For instance, one security researcher successfully breached an FTSE-listed financial institution by pretending to have a conversation on the phone and then simply following an authorized employee into a swipe-card-controlled elevator. In 2014, nearly 34,000 patient records were stolen from a healthcare provider when an intruder gained undetected, unauthorized access to the hospital and stole a USB drive from an employee’s locker.

It’s hard to say how often these types of incidents happen, as they are rarely disclosed publicly. But we do understand why they work — if someone looks like they belong or appear to be with another person, others are less likely to challenge them, especially once they have made their way past an initial checkpoint.

In one recent Darknet Diaries episode, a researcher conducting a pentest audit demonstrates this while attempting to gain access to a secured floor: “I decided I was going to impromptu follow this person to see if I can do tailgating and to see if they would challenge me at all. Sure enough, he walks up, scans his badge, and opens up the door and holds it for me. I’m like thanks, appreciate it, and just kinda walked on in.”

Skilled social engineers are conditioned to exploit the psychological idiosyncrasies of human behavior. They understand that helping behavior is hardwired into all of us, and they know how to turn it to their advantage. This underscores the importance of employing multiple layers of defense measures to prevent unauthorized access.

How facial authentication augments security

The use cases and industries that struggle to keep unauthorized individuals from gaining access to secured areas are as ubiquitous as they are diverse. At the most hardened end of the spectrum sits critical infrastructure, whose assets, systems, and networks are considered so vital that their disruption could cause fundamental harm to the nation. A nuclear facility, for example, will require multiple factors of user authentication, which might include a combination of biometric data and PIN codes to verify an individual’s identity and validate their access privileges.

Or consider the authentication challenges of a hospital maternity ward, where access controls are both dynamic and high-stakes. There are typically dozens of doctors and nurses wearing scrubs and masks at any given time, so it’s challenging for a security guard to confidently identify authorized personnel as they hustle between stations. New parents also require some type of authorized access badge for the duration of their stay. Needless to say, administering these types of security controls in a frenetic environment creates more complexity and can significantly increase the organization’s risk exposure.

Unlike facial recognition – which scans an image and compares it to an actual picture of the user, exposing personally identifiable information (PII) that’s open to compromise – facial authentication, when done correctly, can actually enhance a user’s privacy. When a user enrolls in a facial authentication system, the scanned image gets converted into an encrypted binary large object – aka a ‘BLOB’ – essentially a scrambled mathematical representation of the face. Even in the unlikely event that the encryption were broken, the BLOB can’t be reconstituted to reveal the original image or any aspect of a user’s private information.
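As a hedged illustration of that idea, the sketch below stores only an encrypted numeric template, never the photo itself. It assumes the open-source cryptography package for the encryption and a deliberately simplified key-handling scheme; the function names are hypothetical, not any product’s API.

```python
import numpy as np
from cryptography.fernet import Fernet

def enroll(embedding: np.ndarray, key: bytes) -> bytes:
    """Serialize the numeric face template and encrypt it into an opaque blob.
    Only this blob is stored; the captured image is discarded after enrollment."""
    return Fernet(key).encrypt(embedding.astype(np.float32).tobytes())

def load_template(blob: bytes, key: bytes) -> np.ndarray:
    """Decrypt the blob back into a numeric template for matching.
    Even decrypted, it is a vector of numbers, not a reconstructable photograph."""
    return np.frombuffer(Fernet(key).decrypt(blob), dtype=np.float32)

key = Fernet.generate_key()   # in practice, held in a hardware-backed key store
blob = enroll(np.array([0.12, 0.98, 0.05, 0.40]), key)
print(load_template(blob, key))
```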

The enrollment process for facial authentication also offers another important advantage over facial recognition systems: companies can deploy it in an incremental and contextual manner, making it easy to enroll people in and remove them from the system without running afoul of data privacy regulations. In the maternity ward example, that means hospital administrators could seamlessly enroll expecting parents for the duration of their stay and remove them when they leave the hospital, permanently purging all of their biometric information.
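A rough sketch of that enroll-then-purge lifecycle follows; the class name, method names and expiry logic are illustrative assumptions, not a description of any vendor’s actual API.

```python
from datetime import datetime, timedelta, timezone

class TemporaryEnrollment:
    """Consent-based, time-limited storage of encrypted biometric templates."""

    def __init__(self) -> None:
        self._templates: dict[str, tuple[bytes, datetime]] = {}

    def enroll(self, user_id: str, encrypted_blob: bytes, days_valid: int) -> None:
        """Keep the encrypted template only for the expected length of stay."""
        expiry = datetime.now(timezone.utc) + timedelta(days=days_valid)
        self._templates[user_id] = (encrypted_blob, expiry)

    def purge(self, user_id: str) -> None:
        """Permanently delete a user's template, e.g. when they check out."""
        self._templates.pop(user_id, None)

    def purge_expired(self) -> None:
        """Housekeeping pass: drop any template whose stay has ended."""
        now = datetime.now(timezone.utc)
        self._templates = {u: (b, e) for u, (b, e) in self._templates.items() if e > now}
```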

New users don’t need to schedule time to fill out forms to secure a badge card; they simply opt in, and within a few seconds their facial data gets scanned and encrypted. The more often a user interacts with the authentication system, the better the system’s ability to improve its matching via machine learning and capture minor changes over time, enhancing the individual user’s experience while delivering a critical additional layer of security for the organization.
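One common way such a system could adapt over time is an exponential moving average that blends a small fraction of each successfully matched capture into the stored template; the sketch below illustrates the idea, with the update rule and the alpha value as assumptions rather than a specific product’s method.

```python
import numpy as np

def update_template(reference: np.ndarray,
                    new_embedding: np.ndarray,
                    alpha: float = 0.1) -> np.ndarray:
    """Blend a small fraction of the latest verified capture into the stored
    template so gradual changes (glasses, facial hair, aging) are tracked."""
    updated = (1 - alpha) * reference + alpha * new_embedding
    return updated / np.linalg.norm(updated)  # keep the template unit-length
```

Because only embeddings from successful, consented authentications feed the update, the template improves with use without the system ever needing to retain new photographs.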

Security has always been a delicate balancing act. Users demand a frictionless experience, one that’s mindful of their privacy. Organizations, meanwhile, must safeguard their most valuable assets and do so in the most cost-effective and scalable manner possible. While facial authentication techniques are still relatively new, all of the individual facets that make each face unique also hold the key for how we will unlock the future.

Tina D’Agostin, chief executive officer, Alcatraz AI

 
