Liveness.com - Biometric Liveness Detection Explained

In biometrics, Liveness Detection is a computer's ability to determine that it is interfacing with a physically present human being and not a spam bot, an inanimate spoof artifact, or injected video/data. Remember: it's "Liveness," not "Liveliness." Don't make that rookie mistake!

In 1950, Alan Turing developed the famous "Turing Test," which measures a computer's ability to exhibit human-like behavior. Liveness Detection is the converse: AI that determines whether a computer is interacting with a live human. Dorothy E. Denning, a member of the National Cyber Security Hall of Fame, coined the term "Liveness" in her 2001 Information Security Magazine article "It's 'liveness,' not secrecy, that counts." Decades ahead of her time, Denning's vision for Liveness Detection in biometric face verification could not have been more correct.

A public 2D photo of Ms. Denning is biometric data, and anyone can save a copy. Is she somehow more vulnerable because copies of her photo exist? Not if her accounts are secured with strong 3D Liveness Detection, because a photo won't fool 3D Liveness AI. Nor will a video, a copy of her driver's license, her passport, a fingerprint, or an iris image.

3D Liveness Detection prevents spam bots and bad actors from using stolen photos, injected deepfake videos, life-like masks, or other spoofs to create or access online accounts. Liveness ensures that only real humans can create and access accounts, and in doing so it solves some very serious problems. For example, Facebook had to delete 5.4 billion fake accounts in 2019 alone! Requiring proof of Liveness would have prevented those fakes from ever being created.

When a non-living object that exhibits human traits (an "artifact") is presented to a camera or biometric sensor, the attack is called a "spoof." Photos, videos played on screens, masks, and dolls are all common spoof artifacts. When biometric data is tampered with after capture, or the camera is bypassed altogether, the attack is called a "bypass." A deepfake puppet injected into the camera feed is an example of a bypass. There are no NIST/NVLAP lab tests available for PAD Level 3 attacks or for Level 4 and 5 bypasses, because those attack vectors are missing from the ISO 30107-3 standard and thus from all associated lab testing. An example of a Level 5 bypass: decrypt and edit the contents of a 3D FaceMap™ so that it contains synthetic data not collected during the session, then have the server process it and respond with "Liveness Success." (A minimal sketch of a common countermeasure to such bypasses appears after the next paragraph.)

A Russian hacker known as "White Usanka" has created videos showing how he can exploit weaknesses in Liveness and remote ID proofing software using free or very low-cost methods. His videos explain how to use free tools like Spark AR, an Instagram filter-creation app, to fake head "movement" and/or simulate random color flashing. The affected vendors continue to try to harden their software, so these exact attacks may not work forever, or even at the time you are reading this, but when weak 2D Liveness is used there always seem to be ways to beat the system. The videos show the incredibly difficult challenges that Liveness Detection and ID proofing vendors are up against in the real world, and they also show why those vendors don't run Spoof Bounty Programs like FaceTec's. Note that White Usanka spent significant amounts of time attacking FaceTec's Spoof Bounty Program but was unable to spoof or bypass the system, even with these techniques. In total, 42 vendors participated, and all of their 2D Liveness algorithms were spoofed multiple times. The success of these techniques proves that very few vendors are truly up to the task, despite many having been handed iBeta "conformances."
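Bypass attacks like the Level 5 example above succeed when a server blindly trusts whatever data arrives in a session. A common class of countermeasure is to cryptographically bind each capture to a single-use, server-issued session, so that replayed or post-capture-edited payloads are rejected before liveness is even evaluated. The sketch below is a minimal illustration of that idea under stated assumptions (a shared HMAC key and hypothetical function names such as issue_session and verify_payload); it is not FaceTec's actual protocol.

```python
import hashlib
import hmac
import secrets
import time

# Hypothetical shared secret provisioned to the client app. Illustration only;
# a real deployment would use per-device keys, attestation, and pinned TLS.
APP_KEY = b"demo-key-not-for-production"

_sessions: dict[str, float] = {}  # nonce -> issue time (use a TTL store in production)

def issue_session() -> str:
    """Server: issue a single-use nonce the client must bind into its capture payload."""
    nonce = secrets.token_hex(16)
    _sessions[nonce] = time.time()
    return nonce

def sign_payload(nonce: str, capture: bytes) -> bytes:
    """Client: bind the capture bytes to the session nonce with an HMAC tag."""
    return hmac.new(APP_KEY, nonce.encode() + capture, hashlib.sha256).digest()

def verify_payload(nonce: str, capture: bytes, tag: bytes, max_age: float = 30.0) -> bool:
    """Server: reject replayed, expired, or edited payloads before running liveness."""
    issued = _sessions.pop(nonce, None)        # single use: a replayed nonce fails
    if issued is None or time.time() - issued > max_age:
        return False
    expected = hmac.new(APP_KEY, nonce.encode() + capture, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)  # constant-time comparison

# Usage:
nonce = issue_session()
capture = b"...3d-face-capture-bytes..."
tag = sign_payload(nonce, capture)
print(verify_payload(nonce, capture, tag))                     # True: fresh, untampered
print(verify_payload(nonce, capture, tag))                     # False: nonce already spent
nonce2 = issue_session()
tag2 = sign_payload(nonce2, capture)
print(verify_payload(nonce2, b"synthetic-edited-data", tag2))  # False: payload was edited
```

In a real system the client-side key itself is a weak point, which is why serious bypass resistance also layers on payload encryption, device attestation, and server-side anomaly detection.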
Industry quotes and references:

"It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."

"You can reset your password if stolen, but you can't reset your face." While this is true, it is a failure of imagination to stop there. We must ask, "What would make centralized biometric authentication safe?" Presentation attack detection (PAD, a.k.a. "liveness testing") is a key selection criterion. (Gartner, Market Guide for User Authentication, analysts Ant Allan and David Mahdi, published 26 November 2018; FaceTec's ZoOm was cited in the report.)

Forrester, "The State Of Facial Recognition For Authentication: Expedites Critical Identity Processes For Consumers And Employees," by Andras Cser, Alexander Spiliotes, and Merritt Maxim, with Stephanie Balaouras, Madeline Cyr, and Peggy Dostie.
Schuckers, S., 2016. "Presentations and attacks, and spoofs, oh my."
Schuckers, S.A., 2002. "Spoofing and anti-spoofing measures."

Glossary:

1:1 (1-to-1) - Comparing the biometric data from a Subject User to the biometric data stored for the Expected User. If the biometric data does not match above the chosen FAR level, the result is a failed match. (A minimal matching sketch follows the Centralized Biometric entry below.)
1:N (1-to-N) - Comparing the biometric data from one individual to the biometric data of a list of known individuals; the identities on the list that look similar are returned. This is used for facial-recognition surveillance, but it can also be used to flag duplicate enrollments.
Artifact (Artefact) - An inanimate object that seeks to reproduce human biometric traits.
Authentication - The concurrent Liveness Detection, 3D depth detection, and biometric data verification (i.e., face matching) of the User.
Bad Actor - A criminal; a person who intends to commit fraud by deceiving others.
Biometric - The measurement and comparison of data representing the unique physical traits of an individual, for the purpose of identifying that individual based on those traits.
Certification - The testing of a system to verify its ability to meet or exceed a specified performance standard. iBeta used to issue certifications, but now it can only issue conformances.
Complicit User Fraud - When a User pretends to have had fraud perpetrated against them but has actually been involved in a scheme to defraud, stealing an asset and trying to get an institution to replace it.
Cooperative User/Tester - When the human Subjects used in tests provide any and all biometric data that is requested. This helps assess complicit-User-fraud and phishing risk, but only applies if the test includes matching (not recommended).
Centralized Biometric - Biometric data is collected on any supported device, encrypted, and sent to a server for Liveness checks and possibly enrollment and future authentication from that device or any other supported device. When the User's original biometric data is stored on a secure third-party server, that data can continue to be used as the source of trust, and the User's identity can be established and verified at any time. Any supported device can be used to collect and send biometric data to the server for comparison, enabling Users to access their accounts from all of their devices, new devices, etc., just like with passwords. Liveness is the most critical component of a centralized biometric system, and because certified Liveness did not exist until recently, centralized biometrics have not yet been widely deployed.
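As referenced in the 1:1 entry above, here is a minimal sketch of 1:1 verification and 1:N search, assuming face images have already been reduced to fixed-length embedding vectors by some feature extractor. The function names, threshold value, and embeddings are illustrative only, not any vendor's API.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Threshold chosen to hit a target FAR on held-out imposter pairs (illustrative value).
MATCH_THRESHOLD = 0.80

def verify_1_to_1(probe: np.ndarray, enrolled: np.ndarray) -> bool:
    """1:1 - is the probe the Expected User? Below threshold = failed match."""
    return cosine_similarity(probe, enrolled) >= MATCH_THRESHOLD

def search_1_to_n(probe: np.ndarray, gallery: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    """1:N - return gallery identities that look similar, best first
    (e.g., to flag duplicate enrollments)."""
    hits = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    return sorted([h for h in hits if h[1] >= MATCH_THRESHOLD], key=lambda h: -h[1])

# Usage with random stand-in embeddings:
rng = np.random.default_rng(0)
alice = rng.normal(size=128)
gallery = {"alice": alice, "bob": rng.normal(size=128)}
probe = alice + rng.normal(scale=0.05, size=128)  # noisy re-capture of alice
print(verify_1_to_1(probe, gallery["alice"]))      # True
print(search_1_to_n(probe, gallery))               # [("alice", <high score>)]
```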
Credential Sharing - When two or more individuals do not keep their credentials secret and can access each other's accounts. This can be done to subvert licensing fees or to trick an employer into paying for time not worked (also called "buddy punching").
Credential Stuffing - A cyberattack in which stolen account credentials, typically lists of usernames and/or email addresses and the corresponding passwords, are used to gain unauthorized access to user accounts.
Decentralized Biometric - Biometric data is captured and stored on a single device, and the data never leaves that device. Fingerprint readers in smartphones and Apple's Face ID are examples of decentralized biometrics. They only unlock one specific device, they require re-enrollment on any new device, and they do not prove the identity of the User whatsoever. Decentralized biometric systems can be defeated easily if a bad actor knows the device's override PIN, which allows them to overwrite the User's biometric data with their own.
End User - An individual human who is using an application.
Enrollment - When biometric data is collected for the first time, encrypted, and sent to the server. Note: Liveness must be verified, and a 1:N check should be performed against all other enrollments to check for duplicates.
Face Authentication - Authentication has three parts: Liveness Detection, 3D depth detection, and identity verification. All must be performed concurrently on the same face frames.
Face Matching - Newly captured images/biometric data of a person are compared to the enrolled (previously saved) biometric data of the expected User to determine whether they are the same person.
Face Recognition - Images/biometric data of a person are compared against a large list of known individuals to determine whether they are the same person.
Face Verification - Matching the biometric data of the Subject User to the biometric data of the Expected User.
FAR (False Acceptance Rate) - The probability that the system will accept an imposter's biometric data as the correct User's data and incorrectly grant access to the imposter.
FRR (False Rejection Rate), a.k.a. FNMR (False Non-Match Rate) - The probability that a system will reject the correct User when that User's biometric data is presented to the sensor. If the FRR is high, Users will be frustrated because they are locked out of their own accounts. (A threshold-sweep sketch showing the FAR/FRR trade-off follows the KBA entry below.)
Hill-Climbing Attack - When an attacker uses information returned by the biometric authenticator (a match level or liveness score) to iteratively refine their attacks and increase the probability of spoofing the system.
Identity & Access Management (IAM) - A framework of policies and technologies for ensuring that only authorized users have appropriate access to restricted technology resources, services, physical locations, and accounts. Also called identity management (IdM).
Imposter - A living person with traits so similar to the Subject User's that the system determines their biometric data is from the same person.
Knowledge-Based Authentication (KBA) - An authentication method that seeks to prove the identity of someone accessing a digital service. KBA requires knowing a user's private information to prove that the person requesting access is the owner of the digital identity. Static KBA is based on a pre-agreed set of shared secrets. Dynamic KBA is based on questions generated from additional personal information.
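As referenced in the FRR entry above, the following minimal sketch shows how FAR and FRR trade off as the match threshold moves. The score lists are invented for illustration; real systems estimate these rates from large sets of genuine and imposter comparisons.

```python
# FAR/FRR trade-off: sweep the match threshold over synthetic score lists.
genuine_scores  = [0.91, 0.88, 0.95, 0.79, 0.93, 0.85]  # same person vs. themselves
imposter_scores = [0.32, 0.41, 0.55, 0.62, 0.28, 0.47]  # different people compared

def far(threshold: float) -> float:
    """Fraction of imposter attempts incorrectly accepted at this threshold."""
    return sum(s >= threshold for s in imposter_scores) / len(imposter_scores)

def frr(threshold: float) -> float:
    """Fraction of genuine attempts incorrectly rejected at this threshold."""
    return sum(s < threshold for s in genuine_scores) / len(genuine_scores)

for t in (0.5, 0.6, 0.7, 0.8, 0.9):
    print(f"threshold={t:.1f}  FAR={far(t):.2f}  FRR={frr(t):.2f}")
# Raising the threshold lowers FAR (better security) but raises FRR
# (more lockouts of legitimate Users) - the core tuning decision.
```

Note that if a deployed system returned these raw scores to the caller, an attacker could use them to mount the Hill-Climbing Attack defined above; returning only a binary accept/reject decision (and rate-limiting attempts) removes that feedback channel.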
Liveness Detection (or Liveness Verification) - The ability of a biometric system to determine whether data has been collected from a live human or from an inanimate, non-living Artifact.
Phishing - When a User is tricked into giving a Bad Actor their passwords, PII, credentials, or biometric data. Example: a User gets a phone call from a fake customer-service agent who requests the User's password for a specific website.
PII - Personally Identifiable Information: information that can be used on its own or with other information to identify, contact, or locate a single person, or to identify an individual in context.
Presentation Attack Detection (PAD) - A framework for detecting presentation-attack events. Related to Liveness Detection and anti-spoofing.
Root Identity Provider - An organization that stores biometric data appended to the corresponding personal information of individuals and allows other organizations to verify the identities of Subject Users by providing biometric data to the Root Identity Provider for comparison.
Selfie Matching - When a user provides their own biometric data to be compared to trusted data that they provided previously or that is stored by an identity issuer. 2D facial-recognition algorithms are not well suited to Selfie Matching because the appearance of a 3D human face varies substantially with capture distance.
Spoof - When a non-living object that exhibits some biometric traits is presented to a camera or biometric sensor. Photos, masks, and dolls are examples of Artifacts used in spoofs.
Subject User - The individual who is presenting their biometric data to the biometric sensor at that moment.
Synthetic Identity - When a bad actor combines biometric data, a name, a social security number, an address, etc., to fabricate an identity that does not correspond to a real person.

All trademarks, logos, and brand names are the property of their respective owners. All company, product, and service names used on this website are for identification purposes only. Use of these names, trademarks, and brands does not imply endorsement or denunciation.