The PKI Fallacy

Episode 259

State of Identity Podcast

2/3/2022


Are strong biometric liveness detection and face matching the key to protecting authentication processes against account takeover? On this week’s State of Identity podcast, host Cameron D’Ambrosi is joined by Jay Meier, SVP of North American Operations at FaceTec, to explore that question, unwrap the PKI Fallacy, and explain why device authentication is not user authentication.

Host:

Cameron D'Ambrosi, Managing Director at Liminal

Guest:

Jay Meier,  SVP of North American Operations at FaceTec, Inc.


Cameron [00:00:03] Welcome everyone to State of Identity, I’m your host, Cameron D’Ambrosi. Joining me this week is Jay Meier, Senior Vice President of North American Operations with FaceTec. Jay, welcome to State of Identity.

 

Jay [00:00:15] Hey, thanks a lot. I appreciate the opportunity to share some insights with you.

 

Cameron [00:00:21] Yeah. You know, it’s facial detection, liveness detection, face matching. Super, super interesting area right now, a lot of investment flowing into the space, a lot of adoption across industries and across applications, maybe outside of some of the areas where I think folks traditionally might assume it has applicability. Really excited to dive into that with you, as well as, I think, some of the unique, you know, differentiators of the FaceTec platform. But before we do all that, I do always like to ask folks a little bit about their background, their on-ramp into the broader digital identity space. I think you have a particularly interesting career path in terms of kind of how you got into the mix. So I’ll toss it over to you. You know, how did you find yourself joining FaceTec and getting immersed into this world of facial biometrics and liveness?

 

Jay [00:01:16] OK, yeah, I’ll try and make it brief. So I am serving as the SVP of North American Operations for FaceTec. However, I am technically a consultant, and I have a consulting practice called Sage Capital Advisors. I consult primarily to companies in the identity management space and, more specifically, in biometrics, cryptography, and smart-card-type solutions. I started my career, if you want to call it that, in identity management while I was an economist and a securities analyst in capital markets. I worked for various investment banks as a research analyst, and my specialty was, in fact, biometrics, public key infrastructure, cryptography, and smart cards, in particular, contactless smart cards. In 2006, I wrote a big book about this stuff and launched my consulting career, and I first consulted for FaceTec in 2016. Apparently, my recommendations were on the mark, and the company decided that they wanted to have me back as of late 2020, and I’m very glad to be here.

 

Cameron [00:02:39] Fantastic. Yeah, it’s so great to have you. So hopefully, I think many of our listeners here are going to be familiar with FaceTec and the platform. But let’s presume, you know, we have some folks tuning in for the first time, maybe specifically to hear about FaceTec. From a 15,000-foot perspective, I’d love to hear, you know, what you do at FaceTec and what you consider your mandate to be here in the identity market.

 

Jay [00:03:06] Sure. So FaceTec is the world-leading provider of 3D face liveness detection and face-matching software. We have a stated goal at FaceTec to eliminate fraud in the logical domain, in the digital world. We were founded in 2013, we earned our first revenue in 2018, and we’ve been growing like a weed ever since. We recently recorded 300-plus percent annualized growth in, I believe it was our second quarter, and it’s pretty consistent. We OEM a very sophisticated face liveness and matching software package to other identity management vendors. We have 70 or so, or maybe it’s close to 80, OEM integration partners around the world. They take our client, our device SDK, and embed it into their app, and then they load our server SDK in their stack behind their firewall, and they can use the most advanced face liveness and biometric matching on the planet for whatever purposes they choose.

 

Cameron [00:04:31] Makes a lot of sense. And, just to help folks who I think maybe are a bit more dialed in to some of the unique nuances of the space, let me ask some clarifying questions. Is it safe to say that you guys do both one-to-one, as well as one-to-N, facial matching?

 

Jay [00:04:51] Yes, that is safe. We do do both. However, I want to make it very clear that we are not a biometric surveillance vendor. We do not license our technology, or allow end users to use our technology, to surveil involuntary subjects, people that are not volunteering to be enrolled in the system. We are used primarily for identity verification purposes, identity authentication purposes, and access control negotiations.

 

Cameron [00:05:25] I think that’s a really important distinction. So I guess to put a finer point on that, when you’re using the one-to-N capabilities, you would see folks looking to bring that to bear in, for example, say, an onboarding process. Let’s presume it’s an online dating platform that has people who have been on the platform before and maybe been abusive or done certain behaviors that caused them to be asked to leave. You can essentially have a blacklist of folks whose faces you have seen before, and you say, I don’t want them coming back on the platform. And when they try and enroll, you can take the face that they have submitted during that onboarding process and match that against the list of people who you’ve seen before and have been rejected, so that they aren’t able to, like, spin up a fake name and try and come back on the platform.

 

Jay [00:06:15] So that’s one use case. You know, we are typically used in an anti-fraud type of situation, a scenario where, because we can move biometric data, the liveness and matching data, from the device to the server, the vendor can match our 3D face maps against existing 3D face maps that are already associated with a customer or an enrollee. Right. And once we do that, in a new user enrollment or some other situation, we can match the fresh 3D face map from the new customer against the existing database to see if there’s any duplication. Do they have an account there already? It also can be used in a fraud mitigation strategy, where you can determine if the same face is associated with different accounts under different names, for example. So we recently signed Tinder. It’s not my account, I’m not in sales, but my understanding is that Tinder likes the opportunity to compare new enrollees’ face maps with existing face maps, because they have a large contingent of accounts that are either duplicates or fraudulent. Right? And I like to joke about the show Catfish on MTV. We stop catfishing, if you want to think of it that way.
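The duplicate-account check Jay describes is essentially a one-to-N comparison of a fresh face map against every enrolled one. Here is a minimal sketch of that idea; note that the stand-in feature vectors, the cosine-similarity scoring, and the 0.9 threshold are all invented for illustration and are not FaceTec's actual algorithm.

```python
# Illustrative sketch of a one-to-N duplicate-account check, NOT FaceTec's
# actual API. Face maps are stand-in feature vectors; the cosine-similarity
# scoring and the 0.9 threshold are assumptions made for this example.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def find_duplicates(fresh_map, enrolled, threshold=0.9):
    """Return account IDs whose stored face map matches the fresh one."""
    return [acct for acct, stored in enrolled.items()
            if cosine_similarity(fresh_map, stored) >= threshold]

enrolled = {
    "acct-101": [0.9, 0.1, 0.4],
    "acct-102": [0.1, 0.8, 0.2],
}
# A new enrollee whose face map is nearly identical to acct-101's:
hits = find_duplicates([0.88, 0.12, 0.41], enrolled)
```

In a real deployment the comparison would run server-side against the full enrollment database, which is exactly why Jay stresses moving the biometric data off the device.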

 

Cameron [00:07:55] Yeah, that’s a doozy. I haven’t fired up any of those episodes in a while, but I remember when that show came out and thinking, you know, this is definitely something that we have the technology to solve if folks are willing to use it. Drilling down again on the platform capabilities and what we know is out there in market, I think the other distinction that I would make is around your liveness detection capabilities and this notion of active versus passive liveness detection. I know with your ZoOm 3D product, you have an active liveness detection approach. Would you mind diving a little bit deeper into that distinction between active versus passive liveness, and why you feel that the active approach is better suited towards making this determination of, you know, is this a real person behind the camera and not a spoof?

 

Jay [00:08:46] Well, in fact, I need to correct you. We use passive liveness, we don’t use active liveness. And it’s a relatively nascent technology. Liveness, as a research subject, has been around for a very long time, but it hasn’t really come to bear until really the last few years. FaceTec has been an outspoken leader and advocate in relying on liveness first, before matching. We believe that we are a juggernaut in liveness, but in fact, there are different kinds of liveness capabilities out there, and you can categorize them as passive or active. We use passive, and I’ll describe active first so you can get a sense of what it is. Active liveness implies a command-and-response dynamic. It is a conscious decision to do what the technology is asking you to do. So, for example, you may be enrolling in a system or authenticating yourself, and they want to test liveness. And they do that by asking you to blink your eyes or turn your head left or right. Or in other cases, on your phone, for example, they will have an orb, a ball that bounces around your screen, and they ask you to track the ball with your finger. Those are commanded responses to determine liveness. And that’s called active. Passive, on the other hand, uses involuntary human cues, things that are secret, things that people don’t even know are being measured. So, for example, in a face biometric match situation, we can determine liveness by studying the behavior of the subject’s pupils, pupil dilation, right? We can determine liveness by certain textures of the skin or the hair or things like that. And so the key here is that we believe passive is superior because it’s a secret. If it’s a secret measurement, then the attacker has to guess what to try and break. The attacker’s attack vector is randomized. They don’t know what we’re measuring, so they’re guessing about what they have to try and attack.
And for that reason, we believe passive not only reduces friction, but it also strengthens the reliability of the liveness check.

 

Cameron [00:11:38] You know, shame on me for, I suppose, mixing my terms. Maybe there’s further terminology that’s required. You know, we’ve seen some solutions that are attempting to do what I’ve seen described as, you know, single-frame liveness, as opposed to those that are, you know, taking the motion of the device, for example, into account. So maybe there’s a bit of nuance there in terms of that approach and its impact on, you know, false acceptance rate and false rejection rate. I think this is another kind of prime battleground that we really see when the product managers are, you know, evaluating solutions. How do you think about tackling some of those measurements of the effectiveness of the platform and getting that right balance of user experience, as well as rejection of bad actors? Because I think we’ve seen, you know, some folks in the space, who I won’t name by name, having processes that are maybe a little bit easier to spoof. And because the folks who are actually caught as bad actors are rejected, maybe they’re fudging those numbers to say, oh, you know, your pass rates aren’t going to be as high if you go with a more robust version. How do you guys think about balancing those needs? Platforms want to bring as many folks through the funnel as they can, obviously, but at the same time maintain that platform integrity.

 

Jay [00:13:08] Well, you know, it’s a great question, and there are so many things about that question that we could address, not least of which is just the overall quality of the liveness and matching, the confidence that you get once the liveness and matching comparisons are done. Our goal at FaceTec is to eliminate fraud. And so our goal is to provide technology that, first of all, keeps the bad guys out and only allows the good guys to get in, right, and does that as accurately and confidently as possible. And secondly, we understand that user experience is important. And so we have utilized the infrastructure on the phone and developed a system that is really very intuitive and easy to use and quick. You know, with our system, if you were to use an app with our technology embedded in it, you’re simply asked to look at your phone and then move the phone closer to your face, and you’re basically taking selfies, and the whole process takes two minutes. Or, excuse me, not two minutes, two seconds. Over the course of about two seconds, by the time you finish the process, we will have captured as many as 150 video frames and started to compile liveness data and matching data. And it’s the same data, in our case. We encrypt it, send it to the server behind the firewall. The data is decrypted, the liveness is checked. If it is confirmed alive, then we move on to matching. So in our case, you know, we start with security and we start with fraud, because, let’s face it, the statistics are alarming about how pervasive fraud is on the internet. Companies that argue that user experience comes first, in my experience, and in almost 30 years in this industry, it’s either because they’re just simply trying to sell something that doesn’t work very well, or they’re just trying to sell something. There is an offset. If you use an inferior technology that allows a lot of people in easily, it is going to allow more bad people in than you want.
And so really, you know, those are going to be less expensive systems. They are not going to fulfill the security requirements that many entities prefer. And you know, it’s really based on the risk of the asset that you’re trying to secure, or the privilege. If it’s a very, very important privilege, a very, very important asset that you are trying to protect, you are going to focus on security first. And thankfully, we can accomplish both quite easily, and that’s why we are growing so much faster than anybody else in the space.
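The flow Jay walks through, capture on the device, encrypt in transit, decrypt behind the firewall, check liveness first, and only then attempt a match, can be sketched as a server-side gate. Everything below (the toy XOR "cipher", the two flag bytes) is an illustrative stand-in for real cryptography and real biometric payloads, not FaceTec's implementation.

```python
# Sketch of a liveness-first verification gate. The XOR "cipher" and the
# flag-byte payload are stand-ins invented for this illustration; a real
# system would use proper authenticated encryption and real face data.
KEY = 0x5A  # toy symmetric key

def encrypt(payload: bytes) -> bytes:
    return bytes(b ^ KEY for b in payload)

decrypt = encrypt  # XOR with the same key is its own inverse

def server_verify(ciphertext: bytes) -> str:
    data = decrypt(ciphertext)
    if data[0] != 1:   # liveness is checked BEFORE any match attempt
        return "rejected: not live"
    if data[1] != 1:   # matching only runs on liveness-proven data
        return "rejected: no match"
    return "verified"

# Toy encoding: byte 0 = liveness flag, byte 1 = match flag.
live_match = server_verify(encrypt(bytes([1, 1])))  # verified
spoof      = server_verify(encrypt(bytes([0, 1])))  # fails at the liveness gate
```

The ordering is the point: a spoofed submission never reaches the matcher at all, which is what "liveness first, before matching" means in practice.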

 

Cameron [00:16:24] Frankly, in terms of that growth, I know we touched on this a little bit, but I think in many ways you could probably trace a similar line around the growth and expansion of your customer base that you could more broadly within the digital identity market. You know, when we started Liminal, previously known as One World Identity, almost six years ago, who were the first attendees to our conferences and events? It was, for the most part, folks who had this definite regulatory obligation: I need to know who’s on the other side of a transaction. So USA PATRIOT Act, Bank Secrecy Act, anti-money laundering, know your customer. Fast forward a year or two later, online-to-offline platforms are growing massively, and this notion of trust and safety emerges, right? All of these Big Tech platforms spinning up trust and safety teams, understanding, OK, we don’t necessarily need to know who you are from a payments perspective to meet a legal requirement. But we have this existential threat to our platforms that comes from not knowing who is a member, and making sure that we keep our users, customers, employees and providers safe from threats, you know, from criminals or other bad actors. And now, I think, as we are continuing along this digital identity adoption curve, as we sometimes refer to it, you’re continuing to see expansion of the types of businesses that are understanding they need a digital identity strategy, and that there is an advantage to understanding who actually is behind these accounts on our platform. Is this a real person? Are you tracking similar trends? I know you talked about this stratospheric 300 percent growth you’ve been seeing. What kind of verticals have been driving that? And have you seen this similar expansion in terms of the types of companies that are now engaging with you to understand who is behind a transaction or an account?

 

Jay [00:18:26] You know, we have, I think it’s close to 80 integration partners globally right now. We integrate into platforms like Onfido, for example; Jumio is a partner, or was a partner; and we’ve got a number of other companies that provide identity management services to a broad array of their own customers, right? And I like to say we’re kind of like Intel Inside. Once we get built into the system, we don’t touch any data. We don’t have any idea exactly what it’s being used for by the end of the integration partner. But obviously, with 80 partners around the world, we find that our revenue is coming in globally. South America, Asia, Africa, Europe tend to be very, very big markets for us. The United States has been a little bit slower, probably because of the increased regulation there and the existence of very, very large infrastructure costs, right? But we are making a lot of progress in the United States, and so we expect that to be one of our biggest growth areas going forward. As for our customer base, we have again these integration partners that do all kinds of different things, you know, from one of the largest e-commerce providers in South America to Tinder, which I already mentioned. But we also work directly with individual customers if they’re significant enough. So, for example, the U.S. Department of Homeland Security is using our technology now. We’ve moved into a full-blown production rollout in an app that is used to authenticate somebody in international travel, when they’re coming into the United States and they present a passport, right? And, you know, we work directly with a few other names like Tinder, for example. It’s very diverse, and I think it’s going to continue to grow this way for a very long time. We’re just seeing the tip of the spear, if you want to think of it that way. And if you think about it, think about it like this.
In the last few years, we’ve seen a huge growth rate in the number of companies that are doing, you know, enrollment services and then authentication services, so things like what Onfido does. But it’s the enrollment that is really new. I mean, biometrics and user authentication are not new capabilities, right? But what the industry has lacked is this understanding and acceptance that if you don’t know who you’re enrolling, you don’t know who you’re authenticating, right? So suddenly, especially with the pandemic and, you know, $100 billion in pandemic relief fraud in 2020, we’re starting to realize that we don’t know who we’re enrolling in the system. So they steal identities and they enroll in a welfare program of some type, or a social services program of some type, or anything, and they start accessing those services, taking funds, et cetera, et cetera. And once they’re enrolled in the system, they can authenticate as themselves every time, because the system doesn’t know that they’re not who they say they are. Right. SolarWinds is a perfect example of this, right? We talk about the breach at SolarWinds. Well, you know, people say, well, it’s a Russian hacker that got in and uploaded a Trojan malware package into an upgrade deliverable from a software vendor, and when the upgrade deliverable was distributed to thousands of customers all over the world, the Trojan went with it. Right. But what people don’t talk about is how the hacker got in in the first place. The hacker got in through a password phishing scheme, right? A Russian hacker got the password of an employee of this company who was working remotely. Right? And once he got that password, he was able to enroll on the employer’s network as if he was the employee. And then he registered his own computer on the network, and they passed a SAML token to that computer.
And since he had the SAML token and he had the password, he could log on to that network as often as he wanted to and dig around the network. I think he was there for nine months. So it was a combination of existing systems, and they didn’t really know who was using the system. They thought they were authenticating the right person, but in fact, they weren’t. So if they had gone through a verification process, if they would have used FaceTec, shameless plug, we would have prevented that problem. It’s a long way to answer. I’m sorry, but I hope that makes sense.

 

Cameron [00:24:02] No, no apology necessary. I mean, look, I think we are fundamentally aligned in the sense that, when you start stripping away the layers of abstraction around these challenges, whether it’s onboarding, whether it’s identity and access management, the key question you’re trying to ask is: who is the person that I am dealing with at a transaction level, whether it’s creating an account or whether it’s coming back into an account that’s already created? Obviously, usernames, passwords, multifactor authentication, token possession, all of this stuff is really just a proxy, right? It’s a technical proxy for: is Jay the person using Jay’s login and password? You’re using shared secrets, or again, possession of something, as this substitute for being able to, like, knock on Jay’s door and say, hey Jay, is that you logging in, trying to access the trade secrets? And so I think with these technologies, we’re really getting to the point where identity is taking its primary role in adjudicating these types of questions. And, you know, to the extent we had these other technologies, they were stopgaps. Maybe that’s a bit aggressive, but in the sense that we developed these things because we did not have the capability of everyone who wanted to log into a system, you know, having a computer with a high-resolution camera at their immediate disposal. But now we do, and we have the technology to not just say, hey, do we have this shared secret in common, but let me look at you and see if you’re actually who you’re claiming to be. And more importantly, is this a real face and not some sort of spoof attempt, whether it’s a previously uploaded file or a video that I captured of you over lunch that I then try and use to...

 

Jay [00:26:02] A deepfake puppet or something?

 

Cameron [00:26:04] Right, exactly.

 

Jay [00:26:06] You know, I like to say that any type of authenticator that is at least one degree separated from the privilege holder can be transferred to someone else. Okay. Now let me try and explain that a little bit. We have these things called authenticators. A biometric is an authenticator, right? A password is an authenticator. A token is an authenticator. There are all kinds of things that we use today to try and determine if this is actually Jay trying to get into the domain. Right. And any of those authenticators that are not physically connected to the person can be transferred to someone else. And this has created massive, massive vulnerabilities. Take the FIDO system. You know, I was in a panel discussion not too long ago with a very smart person in the industry who was advocating FIDO. And, you know, FIDO has its place, I suppose. But she openly admitted that FIDO has not been able to figure out an efficient way to bind an authenticator to an actual identity, and then bind the identity to an actual person. Right. And so presumably what that means is that if you had my token, my PKI stick, you could potentially get into any of the places that I am privileged to be able to get into. Not so with biometrics. And the reality is that, until relatively recently, biometrics were not ready for primetime in this capacity. Right? Face verification, face authentication, simply was not good enough. And there are reasons for that, right? This is why we like three-dimensional at FaceTec. The reality is that a two-dimensional sensor, a camera, trying to capture data from a three-dimensional object, your face, creates perspective distortion. The varying depth of field, meaning the distance between your face and the camera, creates perspective distortions. Right.
If you move your face, and you can see this on your phone if you were taking a selfie, you could move the camera toward your face and you’ll see the shape of the face changes. Right? That’s because the camera cannot compensate for perspective distortions between a two-dimensional sensor and a 3D face. Right. So those perspective distortions cause all two-dimensional face biometric technologies to have lower confidence ratios, right? What FaceTec did was figure out how to generate a 3D face map using a two-dimensional sensor. You know, Apple uses 3D also, but they have a series of hardware components in their phone that allow them to do that, and that costs a lot of money, right? But we figured out how to do it with software and with the existing infrastructure. So we have the same three-dimensionality with no additional infrastructure cost burden. What the 3D does for us is it provides a number of different liveness capabilities, but it also provides orders of magnitude more data. OK. This is a probabilistic calculation, a statistical probability. Is this Jay’s face? Is it not Jay’s face, right? Is this the guy? And, you know, we like to think about it like a high school grade, right? 95 percent, maybe 99 percent, is going to be spectacular. Well, it’s not. It’s actually horrible. And the reason is exactly the statistics that you were citing earlier, right? It has to work all the time if we’re talking about very, very expensive assets and privileges. So usually, with the two kinds of errors you can have, most of these systems get tuned one way to make it easy for the people to get through, and that means lowering the accuracy rate, lowering the confidence. And it hasn’t worked in the past. Well, now, with the use of AI
and three-dimensionality, we can raise our confidence, and with liveness in particular, we can raise the confidence of the match outcome, potentially by orders of magnitude. And that is changing the face, pardon the pun, of biometric authentication and verification in the digital world.
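Jay's point that a "high-school grade" accuracy is actually horrible comes down to simple arithmetic: even a small false-accept rate admits a lot of impostors once attempt volumes are large. The attempt counts below are invented purely for illustration.

```python
# Why "99% accurate" can be horrible at scale: expected impostors admitted
# at a given false-accept rate (FAR). The attempt volumes are hypothetical
# numbers chosen for illustration, not real traffic figures.

def expected_false_accepts(far: float, impostor_attempts: int) -> int:
    """Expected number of impostors admitted at a given false-accept rate."""
    return round(far * impostor_attempts)

# A "99% accurate" matcher (FAR = 1%) facing a million impostor attempts:
admitted_at_99 = expected_false_accepts(0.01, 1_000_000)
# Tightening FAR by two orders of magnitude changes the picture:
admitted_tight = expected_false_accepts(0.0001, 1_000_000)
```

At 1% FAR, a million impostor attempts means roughly ten thousand break-ins; at 0.01%, about a hundred. That gap is what "raising confidence by orders of magnitude" buys.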

 

Cameron [00:31:21] Yeah, 100 percent. You know, just to unpack one of the things that you hit on regarding FIDO, and kind of this journey to the passwordless world: I think one of the main stumbling blocks we have seen in terms of moving beyond the password, in terms of widespread industry and therefore consumer adoption, is how do we solve this notion of account recovery, where you have platforms that rely on, you know, a possession token to replace the password? The fundamental question is always, well, what do I do about recovery? And the usual answers are, OK, a recovery seed of some kind, or a backup device. The problem there is people are inherently fallible. You have two phones, you smash one, and then you can’t find the other one. You write down your recovery seed, but you typo’d it when you saved it. Or the hard drive that your archive is stored on gets corrupted. Or, you know, there’s any number of ways where that can go wrong. And I think biometrics really is one of the few ways that can be, to some degree, unimpeachable when it comes to providing a foolproof backup mechanism that doesn’t require, you know, some of these more fallible methods for being able to recover access in this passwordless future.

 

Jay [00:32:40] Well, it’s more than that. Biometrics are the only way that you can attach an authenticator to an actual person. Right? But the key, really the key, is being able to bind the biometric authenticator to a verified identity profile somewhere, and then authenticate to that, OK? You know, people talk about biometrics like they’re one size fits all, and they’re not. The biometrics that are in these phones today, the ones that come OEM with the phone from Apple or Samsung or whoever, those biometric technologies don’t know who they’re authenticating. And that’s because the biometric is not associated with any type of an identity profile anywhere, not even in your phone. When I enroll on an iPhone, it doesn’t associate my face with Jay Meier. It just takes the biometric data and stores it anonymously. I like to say it’s anonymous biometrics. What’s stored in the phone is just a biometric template, right? And then when I try and open the phone, it captures another template and compares the two. But it doesn’t say that this is Jay. And what that means is that the service provider on the other end of the transaction, the app, right, the bank app or the dating app or whatever it is, that third-party service provider also cannot know for sure if it is actually Jay on the phone. Because presumably, you could use Jay’s identity data. You know it’s Jay’s phone, but, you know, Bobby could enroll on Jay’s phone. And it’s even worse with some of these fingerprint sensors on the phones, because most of those capture multiple fingers. Right? They don’t even determine if the fingers are from the same hand, much less the same person. So on an iPhone, you can put up to five fingers in that sensor, and that sensor was developed in the 1990s, for crying out loud.
It’s old. You can put five fingers in that sensor, and it does not know if it’s Jay, or if it’s Jay and Jon, or if it’s Jay, Jon and Matt each putting their own fingers in there. And that means that anybody that is enrolled on that phone can access any of the services that are provided through the apps on that phone. Make sense? So the key here is binding a liveness-proven biometric template, which we call a face map, to a verified identity somewhere. That is why it’s critically important to be able to move biometric data off the phone, or the phones have to be designed to associate user identity with the biometric on the phone. But the phone manufacturers don’t want to do that, because that means they have to basically redesign the phone. OK, so if you move biometric data, and I think this is a pretty critical thing to talk about, if you move the biometric data off the phone and move it to a database behind a firewall where it can be processed, then you can bind the biometric to the identity profile and then authenticate to that every time. Right? So, you know it’s Jay’s identity, you know it’s Jay’s face. And then when this guy tries to log on, it matches the new face, the liveness-proven new biometric data, to the existing biometric data associated with the profile. And we know for sure if it’s actually Jay logging on.
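The contrast Jay draws, anonymous on-device templates versus a face map bound to a verified identity profile, can be sketched as a small server-side service. Every name here (the class, the string face-map stand-in, the `identity_verified` flag) is hypothetical illustration, not FaceTec's or anyone's real API.

```python
# Sketch of "binding a liveness-proven face map to a verified identity
# profile." All names and the string face-map stand-in are hypothetical,
# invented for this illustration.

class IdentityService:
    def __init__(self):
        self._profiles = {}  # identity id -> bound face map

    def enroll(self, identity_id, face_map, identity_verified):
        """Bind a face map to an identity only after the identity itself
        (e.g. a government document) has been verified."""
        if not identity_verified:
            raise ValueError("cannot bind biometric to an unverified identity")
        self._profiles[identity_id] = face_map

    def authenticate(self, identity_id, fresh_face_map):
        """Match a fresh, liveness-proven face map against the bound one."""
        return self._profiles.get(identity_id) == fresh_face_map

svc = IdentityService()
svc.enroll("jay-meier", face_map="3dfm:jay", identity_verified=True)
is_jay = svc.authenticate("jay-meier", "3dfm:jay")         # the bound person
is_impostor = svc.authenticate("jay-meier", "3dfm:bobby")  # Bobby on Jay's phone
```

The anonymous phone template skips the `enroll` precondition entirely, which is exactly the gap Jay is describing: the match says "same template as before," never "this is Jay."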

 

Cameron [00:36:47] Yeah, I mean, I think that’s a fundamental illustration of where we’re headed, and the needs of these platforms are going to continue to evolve as threat vectors evolve. And I think this notion of so-called friendly fraud, or the types of threats to platforms that come from someone who is maybe willingly participating in trying to help circumvent access controls, or, in the case of data breaches and other types of leaks that compromise the what-you-know authentication factors, makes applications of these types of technologies all the more relevant.

 

Jay [00:37:31] what you think about SolarWinds. Again, if SolarWinds, if that if that Russians computer had a fingerprint sensor on it, it would. And he enrolled the password. He enrolled the fingerprint sensor in the authentication process for the entity that he was trying to breach. With the SAML token, the computer would have no way of knowing that it’s not the actual employee. So all the guy would have to do is approach the website or approach approached, log on right and then put his finger on the sensor, and the computer would say that he’s the employee. It’s a it’s a flawed system, right? You know, and I think it’s important to talk about moving biometric data because there’s a lot of really bad information out there. And this is something that we feel very strongly about because you have to associate the biometric template with an identity profile that is stored somewhere. You have to move biometric data to that somewhere, right? Well, you know, the people that advocate against moving biometric data like Fido, Fido is a PKI architecture. It’s a cryptography architecture. Cryptography is about authenticating devices. That’s why it largely requires you to have a device, right? And that’s the key here is that they say, Oh, well, you can’t trust that the biometric template is safe. Well, here’s the problem. Cryptography, the same technology that Fido is based on is what we use to secure data, right? So if you’re saying that data can’t be secured by cryptography, then you’re saying that the FIDO system is built by a problematic, flawed technology. It doesn’t make sense. It’s cryptography, secures data at rest and in motion. That’s all that it does, and biometric data is data. So if we use a properly designed cryptographic system, we can surely secure that data as it moves. It’s not. It’s really not a problem, and it’s safe. And if we go one step further, we capture liveness data and we capture biometric matching data through the same data flow. 
We use cryptography to ensure the integrity of the sensor and the data flow from the sensor, from the camera. Then we encrypt this stuff six ways from Sunday, and it gets sent to the database behind the firewall, where it is decrypted. Then we check the liveness. OK, well, the thing about the liveness data is that it's from the same data flow as the biometric data. So if your system detects data that came from two different sources, you automatically know that it's fake, right, because it can't come from two different sources in our system. So we do the liveness check, and if it's deemed to be live, then we eliminate the liveness data. We delete it. It goes away. Right. And so what's left is the matching data. Well, here's the thing: in our system, the matching data cannot be resubmitted. It must have the appropriate liveness data wound in with the matching data. And so even if someone got into the database and stole all the face maps, they couldn't use them anywhere, because they can't be resubmitted to our system without the appropriate liveness data. And so we don't believe we have a honeypot risk at all, and our customers are all doing it this way and no one's having problems. It's working wonderfully. So if you go one step further and think about what I've been talking about, binding a biometric to identity data: at FaceTec, we like to say that you are who the government says you are. OK. You get a birth certificate, it's a government document. You get a death certificate, a government document. Driver's license, government document. National ID card, government document. If you want to change your name, you have to ask the government. You are who the government says you are. So if they are the original issuer and arbiter of our identities out there in the marketplace, then why don't we use that data to verify somebody when they're trying to enroll in something? Why don't we associate the biometric with that identity profile and then authenticate to that?
I mean, that would solve so many problems. And that is the promise of the mobile driver's licenses that are now just starting to trickle out into the marketplace.
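The replay protection Jay describes, where matching data is unusable unless it arrives with liveness data from the same capture, can be sketched in miniature. The capture-id mechanism below is invented purely for illustration and is not FaceTec's actual design; it just shows why a stolen face map alone cannot be resubmitted:

```python
import secrets

# Server-side record of liveness data that has already been consumed.
_used_captures = set()

def capture():
    """Simulate a device capture: liveness and matching data share one capture id."""
    return {
        "capture_id": secrets.token_hex(8),  # binds the two data streams together
        "liveness": "liveness-signal",
        "facemap": "matching-data",
    }

def verify(submission: dict) -> str:
    """Accept only a fresh capture; liveness data is deleted (consumed) after one use."""
    if submission["capture_id"] in _used_captures:
        return "rejected: liveness data already consumed (possible replay)"
    _used_captures.add(submission["capture_id"])
    return "accepted: live capture, proceed to face match"

s = capture()
print(verify(s))  # first submission is accepted
print(verify(s))  # replaying the same facemap without fresh liveness data fails
```

Under this scheme, a database full of stolen face maps is inert: without liveness data from a brand-new capture, every resubmission is rejected.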

 

Cameron [00:42:20] I love it. So let's look to the future; bring us to the finish line here. Obviously, a lot of waves are being made across the globe in terms of government ID initiatives: eIDAS 2.0 out of Europe, mobile driver's licenses, and other initiatives globally that are really looking to streamline this ecosystem and enable relying parties to connect, in some cases directly, back to a government source of truth with regard to that underlying identity data. What role do you see FaceTec playing in a world where government-issued digital IDs become much more prevalent?

 

Jay [00:43:02] Well, first of all, we are in at least a couple of mobile driver's license programs now: we're in Colorado, and we're in Utah through one of our integration partners, Scott Talis. We are, I don't want to say consulting, but we are submitting recommendations and proposals to various government entities across the world, including the agency in Europe that is the EU equivalent of the Department of Homeland Security's cybersecurity command. We made a series of recommendations to them recently, and we think they accepted them quite readily. We submitted to DHS, we submitted to NIST, and we're really all about biometrics and liveness, because this has to happen as mobile. You know, I mentioned that you are who the government says you are. Well, in the United States, your driver's license is that credential. Right? I mean, you have to flash it to buy a six-pack of beer. You have to flash it to get into buildings, or to get on an airplane, or whatever it is. Right now, a lot of companies are taking an ID card and then matching a selfie against it with a face matching system, and most of those suffer from the same type of perspective distortion that all 2D systems do. We can do that with 3D, and we are orders of magnitude more accurate than the others. But we also believe that you can safely move and store biometric data, for the reasons we already discussed: cryptography actually works, if people use it right. And if you change the data by removing the liveness data, it can't be resubmitted. So we can take a biometric face map and associate it with a government identity profile, potentially at the DMV when they issue an electronic ID or mobile driver's license. Right. And from that point forward, every time you use it, you could take a new face scan and match it against the stored value at the DMV.
And then the DMV uses an API, and they simply say yes or no. Think of it this way: you want to get on an airplane, and you normally would flash your driver's license. Well, the only thing you need to do is take a new face scan, associate it with your driver's license number, and send that to the DMV. The DMV pulls up the face map associated with that driver's license number, doesn't even care about PII at all, and matches the new face map against the stored value. And if it is the same, they send back a yes or a no. No PII moves around. No biometric data leaves the government database. Nothing happens like that. Privacy is preserved, and we have a much, much higher confidence level that this is actually the person they say they are. That's the future. That's what's going to happen. Passports, driver's licenses, national ID cards, Medicare and Medicaid cards, VA benefits cards: it's all the same thing, and it's all going to work the same way.
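The yes/no DMV check Jay outlines can be sketched as a tiny service. Everything here is hypothetical (the record layout, the toy similarity function, the threshold): the point is only the privacy property that the relying party sends a license number plus a fresh face scan and receives a boolean, while the stored face map and any PII never leave the DMV:

```python
# Hypothetical DMV-side store: face map captured at issuance, keyed by license number.
DMV_DB = {"D123-4567": {"facemap": (0.12, 0.87, 0.44)}}

def similarity(a, b) -> float:
    """Toy stand-in for a real 3D face-matching algorithm."""
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def dmv_match(license_number: str, new_facemap, threshold: float = 0.95) -> bool:
    """Compare a fresh scan to the stored value; return only yes/no, never data."""
    record = DMV_DB.get(license_number)
    if record is None:
        return False  # unknown license: reveal nothing
    return similarity(record["facemap"], new_facemap) >= threshold

print(dmv_match("D123-4567", (0.12, 0.87, 0.44)))  # True: same person
print(dmv_match("D123-4567", (0.90, 0.10, 0.05)))  # False: different face
```

Because the response is a single bit, the same pattern extends to any issuer-held credential: passports, national IDs, or benefits cards.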

 

Cameron [00:46:48] I’m greatly looking forward to hopefully helping to usher that paradigm shift into existence. Jay, thank you so much for your time. Really appreciate it. Before we wrap here, a quick opportunity for a shameless plug: for folks listening who are either vendors in the space looking to partner with you, or end platforms looking to get directly in touch, how should they reach out? What’s the best way to get in touch or to learn more about the FaceTec platform?

 

Jay [00:47:18] Well, FaceTec.com is available to anybody that wants to go there. Through that website, you can start the process of testing our SDKs, and there are examples. You can even get to a thing we call the spoof bounty. If you don’t believe our technology works, we have a spoof bounty program on our platform. If you think you can break it, go ahead: go to SpoofBounty.com and try it out. If you can get through, we’ll pay you money. Of course, that’s the best R&D money we can spend, because we get to see the vulnerabilities and the threat vectors that are happening today. We’ve strengthened our technology that way. So go to FaceTec.com, SpoofBounty.com, or Liveness.com and reach out. It’s very easy to implement, very easy to run.

 

Cameron [00:48:24] Fantastic. Thank you again. Looking forward to following up soon to check in on everything we talked about. Enjoy your holidays, and we’ll talk to you again soon.

 

Jay [00:48:37] Thank you very much. I appreciate the opportunity. Have a great day.

 
