UK government officials are preparing to ask Apple and Google to build nudity-detection software directly into their device operating systems, so that sexually explicit content cannot be captured, shared, or viewed until the user has verified their age. The proposal, expected to be formally announced by the Home Office soon, marks a significant expansion of child protection measures from platform-level enforcement to device-level safeguards. By embedding these systems into iOS and Android, officials aim to stop minors and vulnerable users from accessing explicit content at the point of capture, upload, or display.
Under the proposed plan, users would have to confirm their age before viewing nudity or other sexually explicit material, potentially through biometric verification or official identification checks. This approach extends the Online Safety Act 2023, which already mandates age verification for online pornography and other harmful material but whose checks can be circumvented with VPNs or proxy services. Device-level enforcement would cover the camera, photo gallery, messaging apps, and web browsing, providing a more consistent and immediate layer of protection for children.
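No technical details of such a gate have been published. The short Swift sketch below is purely illustrative, using hypothetical AgeCheckMethod and AgeVerificationService types (nothing of the sort exists in iOS or Android today) to show the flow described above: content flagged as explicit is withheld from any surface until the user passes an age check.

```swift
import Foundation

// Hypothetical types for illustration only: no such OS-level API has been
// announced by Apple, Google, or the Home Office.
enum AgeCheckMethod { case biometric, governmentID }

protocol AgeVerificationService {
    /// True once the user has already proven they are 18 or over.
    var isAgeVerified: Bool { get }
    /// Prompts the user to verify their age via the given method.
    func requestVerification(using method: AgeCheckMethod) async -> Bool
}

/// Decides whether content flagged as explicit may be shown, applying the
/// same rule to every surface: camera roll, messages, or the browser.
func canDisplay(flaggedAsExplicit: Bool,
                using verifier: AgeVerificationService) async -> Bool {
    guard flaggedAsExplicit else { return true }   // unflagged content passes through
    if verifier.isAgeVerified { return true }      // an earlier check still stands
    return await verifier.requestVerification(using: .biometric)
}
```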
Apple and Google have traditionally emphasized user privacy and control, but the UK government’s request signals growing expectations that technology companies take responsibility for detecting and limiting explicit content at the operating-system level. Apple already ships Communication Safety in the Messages app: sexually explicit images received by children in a Family Sharing group appear blurred behind a warning, and if the child chooses to view the photo anyway, they are shown safety resources and the option to message a trusted adult, with the detection running entirely on the device. The feature offers a model for how broader device-level safeguards might operate. Meanwhile, other countries, including the United States, Canada, and various European nations, are also discussing regulations aimed at balancing online safety with privacy considerations.
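For developers, Apple already exposes part of this capability through its SensitiveContentAnalysis framework (iOS 17 / macOS 14 and later), which performs nudity detection entirely on the device. The sketch below only illustrates how an app might consult that framework today; it requires Apple's sensitive-content-analysis entitlement, returns results only when the user (or a parent via Screen Time) has enabled Sensitive Content Warnings, and is not the mechanism the UK proposal would mandate.

```swift
import Foundation
import SensitiveContentAnalysis

/// Returns true if Apple's on-device analyzer flags the image at `url` as
/// containing nudity. No image data leaves the device.
func imageIsSensitive(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis only runs when the user (or a parent, via Screen Time) has
    // turned on Sensitive Content Warnings; otherwise the policy is .disabled.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // Unreadable files or unsupported formats throw; treat them as
        // unflagged here rather than blocking the content.
        return false
    }
}
```

In practice, an app receiving a true result would blur the image and require the user to tap through a warning, or, under the UK proposal, complete an age check, before revealing it.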
The move comes amid broader conversations in the tech industry about age verification and child protection. Meta and other major companies have voiced support for measures that hold device manufacturers accountable for restricting access to explicit content. While it is not yet clear how Apple and Google might implement the UK’s proposed requirements, enforcement at the operating-system level could extend across multiple devices and apps, reducing opportunities to bypass protections. Pakistan’s regulator, the Pakistan Telecommunication Authority (PTA), and other bodies have similarly debated the scope of content control and social media regulation, reflecting a growing global trend toward digital safety measures for minors.