Discord, a social platform with 200 million monthly users, is testing facial scanning to verify user ages in the UK and Australia, marking a major change in online safety practices. Discord started as a gaming platform but has evolved into a broader social space hosting a wide range of communities, including adult content groups. Its adoption of facial recognition aligns with a wider industry shift, as social media expert Matt Navarra told the BBC: “This isn’t a one-off—it’s the start of a bigger shift.”
The Online Safety Act establishes maximum penalties of 10% of global turnover for non-compliance, which pushes platforms to implement strict verification procedures. According to Navarra, facial recognition technology will become the norm because the traditional “click to confirm you’re 13” approach no longer works. Discord users can verify their age through a face scan or an ID upload in order to access sensitive content or change related settings. The company maintains that face scan data stays on the device and that ID data is deleted after verification, reducing privacy risks. The platform also uses blocking technology to hide sensitive content from teens.
Big Brother Watch, along with other privacy advocates, has expressed concern about excessive reliance on technological age verification. Senior advocacy officer Madeleine Stone warned of data breaches, privacy violations, and digital exclusion, risks she argues age checks do not solve. According to Iain Corby of the Age Verification Providers Association, privacy-preserving methods exist, and modern technology can estimate ages to within one to two years through selfie or hand-movement analysis. Platforms also retain discretion in how they comply: they can block dangerous material, restrict entire websites, or bar entry to risky sections.
Facial recognition age verification first appeared on Instagram in 2022, which asked users to submit selfie videos that AI analysed to estimate their age. The global movement toward stricter controls is evident: Australia plans to ban social media for under-16s, while studies indicate that 80% of children aged 8-12 already use platforms designed for users 13 and over. According to Navarra, age assurance will become the seatbelts of the internet in the UK. Platforms such as Discord must now balance safety measures, privacy protection, and user accessibility as new regulations bring both financial and legal risks.