Age verification is one of the hottest topics in digital identity today. Jurisdictions globally are cracking down on youth access to adult content and social media while limiting data collection from underage users. Join host Cameron D’Ambrosi and Executive Director of the Age Verification Providers Association, Iain Corby, to discuss how age verification vendors are helping platforms meet this growing challenge.
Cameron D'Ambrosi, Senior Principal at Liminal
Iain Corby, Executive Director, Age Verification Providers Association
Cameron: Age verification is the hottest topic in digital identity today. Jurisdictions globally are cracking down on youth access to adult content and social media, while also fundamentally restricting the ability of platforms to collect data from young users. Stay tuned for my chat with the head of this growing industry’s leading trade group.
Welcome, everyone, to the State of Identity podcast. I’m your host, Cameron D’Ambrosi. Joining me this week is Iain Corby, Executive Director of the Age Verification Providers Association. Iain, welcome to the State of Identity.
Iain: Thank you very much for having me.
Cameron: It is my pleasure. Without exaggeration, I think age is an area where there’s a lot of new regulation and a lot of demands from platforms for these types of services, but not necessarily a lot of knowledge about what the art of the possible is and where this market needs to be headed, both from a policy as well as a technical perspective. So hopefully, our audience is on the same wavelength as me, and let’s get into it. To start, give us a quick 15,000-foot overview of the AVPA and your role.
How did you find yourself leading one of the top trade groups for age verification?
Iain: The AVPA was started back in about 2018 here in the UK because we had some legislation passed the year before to bring in a requirement for age checks for pornography. There was a group, the Digital Policy Alliance, working with parliament and finessing that legislation. A number of people who decided they would step up and provide age assurance for that purpose felt it was worth getting together. When I started, six people were meeting in the pub once a month. I was brought on board to formalize it a little bit. I persuaded them that maybe we’d meet in an office for an hour before we went to the pub. And then we’ve grown from there. So we now have 25 members from around the world, as far afield as Australia, across the United States, and the European Union. We focus on three things: building standards, aligning regulation around the world, and promoting the industry and dealing with some of the criticisms, some of which are quite outdated tropes that are leveled at age verification.
Cameron: What role does the AVPA currently play within the ecosystem, and what role do you hope to continue to play? Where do you see yourself helping bind this nascent industry together?
Iain: The development really flows from legislation through regulators and then into implementation. So if we start with the legislation, we’ve got an avalanche of legislation which requires age verification of one sort or another. We’ve got the GDPR, Europe’s data protection regulation, which already says that below a certain age, set between 13 and 16 depending on the member state, you can’t give consent to process your data. That actually implies that if you want to process data on the basis of consent, you had better check that the person giving you consent is over that age. Otherwise, it may just not be valid consent, and you’ll breach that legislation. We’ve got the Digital Services Act saying you can’t target kids with profiling-based advertising. We’ve got the Online Safety Bill here in the UK. We’ve got the Irish Fundamentals for a child-oriented approach to data processing, which mirrors the Age-Appropriate Design Code here in the UK, where we are not just looking at whether somebody’s a child or an adult, but what age range they are as a child, and we’re expected to provide different content to a five-year-old from a 15-year-old. And then of course, we look across the United States and we see at least 20 different states legislating at the moment, either on the basis of content or on data and privacy, in a way that wants you to treat children differently from adults. And literally, anytime you see that in legislation in relation to the internet, the first question has to be: how do I know which users are adults and which are children? We are working with all the legislation we see as it comes up to try to get in early enough to persuade legislators to write it in a way that will stand the test of time. And that means not being tied to a particular technology or source but ideally writing legislation in an outcome-focused manner. What is it you’re trying to achieve? How strongly do you want to check age? You probably want a firmer check for allowing somebody to see pornography than to open a Facebook account.
And equally, you probably want an even firmer check if they’re going to buy an offensive weapon online, rather than just looking at some porn. We’re trying to get people to coalesce around some standards of different levels of age assurance because that also means that we can build tech that will work on a global basis. Just take the United States. There are 50 states at the last count; if they were each regulating just three different products, with a separate regulator for each, then a platform, say eBay, would be looking at complying with 150 regimes just for the United States, and that’s going to be obviously hugely inefficient and very expensive. So we work on trying to get legislation, and the regulation that follows it, to align with the international standards that we are writing with ISO and the IEEE. We already have a British standard, PAS 1296, so we’re building up from that baseline. And then in terms of implementation, and this is a bigger story, so I’ll just allude to it for now: we think interoperability is critical here. When Americans come to Europe and go online, the first thing they complain about is the cookie popup. In fact, I was in Dublin just earlier this week at an event on child protection hosted by Google, and their folks who’d flown in from the States were very audibly complaining about cookie popups. Imagine a cookie popup where you can’t just click okay, but you have to pull out your driver’s license, scan that in, take a selfie to prove that you are the owner of the driver’s license, and then give consent to release your age to every single website you are trying to surf. That’s clearly not a sustainable solution, and I think it would lead to any such legislation being immediately retracted. Interoperability, and making the user experience as seamless as possible, is then the third and final leg of the stool that we are working on.
Cameron: So, from that perspective, I think education remains one of the industry’s fundamental challenges. The notion that, hey, if a website says it’s doing on-device biometric age estimation, it isn’t actually capturing your full face image and running a one-to-many biometric match against it. Or that if you are providing information for an age verification check against, let’s say, data records, that information isn’t being stored to track you across sites. How do we as an industry continue this conversation? Where do you think the industry can move forward in terms of educating consumers about these requirements from a regulatory side, as well as building trust around things like the use of data, the use of biometrics, and the notion that the data itself is not being harvested and exploited for other purposes?
Iain: Millions of people have done age checks and voluntarily shared their biometrics. They’ve shown their face, or uploaded their passports, or given their name, address, and date of birth and consented to a check against their credit report. They’ve done that because they want to place an order for a case of wine or access a particular website. So actually, when people have some need and this is just a means to that end, they will go through with it. Now, if you were to introduce that on one adult website tomorrow, not many people would do it, because they’d have plenty of choices to go elsewhere. So they may choose not to do it, and that site would lose all its traffic. So you do need a level playing field. But once everybody requires it, then people will just put up with it. But there will, of course, be informed critics, media, and so on that we have to persuade. And that’s why we’ve gone over and above the GDPR. Again, in Europe we do have good data protection laws, which obviously are not yet mirrored in the States, at least at the federal level. But we recognize that even with those data protection laws, we’ve got to go further. We have a code of conduct, but we also now have audit and certification schemes that members can be checked against on a preemptive basis. So it’s not a question of them just being fined if they lose your data. We check that they are built on privacy by design and data minimization, in such a way that they won’t lose your data because they don’t keep it in the first place. And then data protection authorities are all over us, because we’re a very high-profile and high-risk industry from their perspective. But also, if you create trust frameworks and interoperability, then those trust frameworks will also be auditing members before they join to participate in that interoperability network. And that’s yet another layer of protection, confirming that every age verification provider in the network is compliant.
Cameron: Thank you for sharing that. I think it’s really interesting to hear how the AVPA is working to build trust and ensure that users’ data is protected. Going back to the topic of education, do you think there’s a role for governments or other organizations to play in educating users about age verification and digital identity?
Iain: Absolutely. I think the government is very important in educating users about age verification and digital identity. It’s not just about the government, but it’s also about the industry as well. We need to work together to ensure that users understand why age verification is important and how it works. We also need to be transparent about how we use data and what we do to protect users’ privacy. It’s important that users feel confident that their data is being handled in a responsible way. So I think there’s a lot of work to be done in terms of education, both by the government and the industry.
Cameron: That’s a really good point. It’s not just about the technical aspects of age verification but also about building trust and transparency with users. Thank you for sharing your insights on this topic. Do you have any final thoughts on where the age verification industry is headed in the future?
Iain: Yes, I think the age verification industry is going to continue to grow and become more important as more and more legislation is passed around the world. I think we’re going to see more interoperability and more standardization, which is going to make it easier for users to verify their age across different websites and platforms. I also think we’re going to see more innovation in terms of the technology that’s used for age verification, such as biometric verification and other forms of identity proofing. Overall, I think it’s a really exciting time for the age verification industry, and I’m looking forward to seeing where it goes.
Cameron: Thank you, Iain, for joining me today and sharing your insights on this important topic. It’s been a pleasure speaking with you.
Iain: Thank you very much for having me. It’s been great to be here.