The Looming Age Verification Challenge

Episode 324

State of Identity Podcast



Age verification is one of the hottest topics in digital identity today. Jurisdictions globally are cracking down on youth access to adult content and social media while limiting data collection from underage users. Join host Cameron D’Ambrosi and Executive Director of the Age Verification Providers Association, Iain Corby, to discuss how age verification vendors are helping platforms meet this growing challenge.


Cameron D'Ambrosi, Senior Principal at Liminal


Iain Corby, Executive Director


Cameron: Age verification is the hottest topic in digital identity today. Jurisdictions globally are cracking down on youth access to adult content and social media, while also fundamentally restricting the ability of platforms to collect data from young users. Stay tuned for my chat with the head of this growing industry’s leading trade group.

Welcome, everyone, to the State of Identity podcast. I’m your host, Cameron D’Ambrosi. Joining me this week is Iain Corby, Executive Director of the Age Verification Providers Association. Iain, welcome to the State of Identity.

Iain: Thank you very much for having me.

Cameron: It is my pleasure. Without exaggeration, I think age is an area where there’s a lot of new regulation and a lot of demands from platforms for these types of services, but not necessarily a lot of knowledge about what the art of the possible is and where this market needs to be headed, both from a policy as well as a technical perspective. So hopefully, our audience is on the same wavelength as me, and let’s get into it. To start, give us a quick 15,000-foot overview of the AVPA and your role.

How did you find yourself leading one of the top trade groups for age verification?

Iain: The AVPA was started back in about 2018 here in the UK, because we’d had legislation passed the year before to bring in a requirement for age checks for pornography. There was a group, the Digital Policy Alliance, working with parliament to finesse that legislation, and a number of people who had decided they would step up and provide age assurance for that purpose felt it was worth getting together. When I started, it was six people meeting in the pub once a month. I was brought on board to formalize it a little; I persuaded them that maybe we’d meet in an office for an hour before we went to the pub. And we’ve grown from there. We now have 25 members from around the world, as far afield as Australia, across the United States, and the European Union. We focus on three things: building standards, aligning regulation around the world, and promoting the industry, which includes dealing with criticisms of age verification, some of which are quite outdated tropes.

Cameron: What role does the AVPA currently play within the ecosystem, and what role do you hope to continue to play? Where do you see yourself helping bind this nascent industry together?

Iain: The development really flows from legislation, through regulators, and then into implementation. So if we start with the legislation, we’ve got an avalanche of legislation requiring age verification of one sort or another. We’ve got the GDPR, the data protection regulation in Europe, which already says that if you’re under an age somewhere between 13 and 15, you can’t give consent to process your data. That actually implies that if you want to process data on the basis of consent, you’d better check that the person giving you consent is over 13. Otherwise it may simply not be valid consent, and you’ll be in breach of that legislation. We’ve got the Digital Services Act saying you can’t target kids with advertising based on profiling. We’ve got the Online Safety Bill here in the UK. We’ve got the Irish Fundamentals for children’s data protection, which is much the same as the Age Appropriate Design Code here in the UK, where we’re not just looking at whether somebody is a child or an adult, but at which age range they fall into as a child, and we’re expected to serve different content to a five-year-old than to a 15-year-old.

And then of course we look across the United States and see at least 20 different states legislating at the moment, either on the basis of content or on data and privacy, in a way that requires you to treat children differently from adults. Literally any time you see that in legislation relating to the internet, the first question has to be: how do I know which users are adults and which are children? We’re working with all the legislation we see as it comes up, trying to get in early enough to persuade legislators to write it in a way that will stand the test of time. That means not being tied to a particular technology or data source, but ideally writing legislation in an outcome-focused manner. What is it you’re trying to achieve? How strongly do you want to check age? You probably want a firmer check for allowing somebody to see pornography than for opening a Facebook account.
And equally, you probably want an even firmer check if they’re going to buy an offensive weapon online than if they’re just looking at some porn. We’re trying to get people to coalesce around standards defining different levels of age assurance, because that also means we can build tech that works on a global basis. Just take the United States: there are 50 states at the last count, and if each were regulating just three different products, with a separate regulator for each in every state, then a platform, say eBay, would be looking at complying with 150 regimes for the United States alone. That’s obviously hugely inefficient and very expensive. So we work on getting legislation, and the regulation that follows it, to align with the international standards we’re writing with the ISO and the IEEE. We already have a British standard, PAS 1296, so we’re building up from that baseline.

And then in terms of implementation, and this is a bigger story, so I’ll just allude to it for now: we think interoperability is critical here. When Americans come to Europe and go online, the first thing they complain about is the cookie popup. In fact, I was in Dublin just earlier this week at an event on child protection hosted by Google, and their folks who’d flown in from the States were very audibly complaining about cookie popups. Now imagine a cookie popup where you can’t just click OK, but instead have to pull out your driver’s license, scan it in, take a selfie to prove you’re the owner of that license, and then give consent to release your age to every single website you’re trying to visit. That’s clearly not a sustainable solution, and I think it would lead to any such legislation being quickly repealed. So interoperability, and making the user experience as seamless as possible, is the third and final leg of the stool we’re working on.

Cameron: So, from that perspective, I think education remains one of the industry’s fundamental challenges. The notion that if a website says it’s doing on-device biometric age estimation, it isn’t actually capturing your full face and running a one-to-many biometric match against it. Or that if you’re providing information for an age verification check against, let’s say, data records, that information isn’t being stored to track you across sites. How do we as an industry continue this conversation? Where can the industry move forward in educating consumers about these regulatory requirements, and in building trust around things like the use of data and biometrics, and the assurance that the data itself isn’t being harvested and exploited for other purposes?

Iain: Millions of people have already done age checks and voluntarily shared their biometrics. They’ve shown their face, or uploaded their passports, or given their name, address, and date of birth with consent to check against their credit report. They’ve done that because they wanted to place an order for a case of wine or access a particular website. So actually, when people have some need and this is just the means to that end, they will go through with it. Now, if you were to introduce that on one adult website tomorrow, not many people would do it, because they’d have plenty of choices to go elsewhere. They might choose not to, and that site would lose all its traffic. So you do need a level playing field. But once everybody requires it, people will put up with it.

There will, of course, be informed critics, the media, and so on that we have to persuade. That’s why we’ve gone over and above the GDPR. In Europe we do have good data protection laws, which obviously aren’t mirrored in the States, at least at the federal level, but we recognize that even with those data protection laws, we’ve got to go further. We have a code of conduct, and we also now have audit and certification schemes that members can be checked against on a preemptive basis. So it’s not a question of providers simply being fined if they lose your data: we check that they are built on privacy by design and data minimization, in such a way that they can’t lose your data because they don’t keep it in the first place. And then the data protection authorities are all over us, because we’re a very high-profile and high-risk industry from their perspective. But also, if you create trust frameworks and interoperability, those trust frameworks will audit members before they join the interoperability network, and that’s yet another layer of protection confirming that every age verification provider in the network is compliant.

Cameron: Thank you for sharing that. I think it’s really interesting to hear how the AVPA is working to build trust and ensure that users’ data is protected. Going back to the topic of education, do you think there’s a role for governments or other organizations to play in educating users about age verification and digital identity?

Iain: Absolutely. Government has a very important role to play in educating users about age verification and digital identity, but it’s not just about government; it’s about the industry as well. We need to work together to ensure that users understand why age verification is important and how it works. We also need to be transparent about how we use data and what we do to protect users’ privacy. It’s important that users feel confident their data is being handled responsibly. So there’s a lot of work to be done on education, by both government and industry.

Cameron: That’s a really good point. It’s not just about the technical aspects of age verification but also about building trust and transparency with users. Thank you for sharing your insights on this topic. Do you have any final thoughts on where the age verification industry is headed in the future?

Iain: Yes, I think the age verification industry is going to continue to grow and become more important as more legislation is passed around the world. We’re going to see more interoperability and more standardization, which will make it easier for users to verify their age across different websites and platforms. I also think we’ll see more innovation in the technology used for age verification, such as biometric verification and other forms of identity proofing. Overall, it’s a really exciting time for the age verification industry, and I’m looking forward to seeing where it goes.

Cameron: Thank you, Iain, for joining me today and sharing your insights on this important topic. It’s been a pleasure speaking with you.

Iain: Thank you very much for having me. It’s been great to be here.

