As Australia enforces a ban on social media access for children under 16, Indonesia is moving forward with its own regulatory framework aimed at protecting children online. The Communications and Digital Ministry is preparing a ministerial regulation to serve as technical guidance for implementing the Government Regulation on Electronic Systems Providers Governance in Child Protection, known as PP Tunas, which was signed by President Prabowo Subianto in March. This regulation is designed to establish minimum age limits for platform users, prohibit profiling children’s data for commercial purposes, and define sanctions for platforms that fail to comply, creating a structured system for safeguarding underage users across social media, e-commerce, and online gaming platforms.
Communications and Digital Minister Meutya Hafid explained that platforms failing to meet the requirements will face graduated sanctions, ranging from warnings to fines and, ultimately, termination of access. While detailed provisions of the ministerial regulation remain under development, the government aims to finalize the technical guidelines within the one-year transition period mandated for PP Tunas implementation. Minister Hafid emphasized that the rules are intended to create a safe online environment for children, balancing protective measures with continued access to digital media and learning opportunities. The regulation will also include a risk assessment framework, requiring platforms to conduct self-evaluations to determine whether their products and services pose low or high risks to children, based on factors such as exposure to strangers, violent or pornographic content, and addictive features.
Under PP Tunas, children under 13 may only access platforms specifically designed for their age group. Users aged 13 to 16 will be permitted on low-risk platforms, while those aged 16 to 18 can access all platforms, provided parental consent is secured at each tier. Minister Hafid noted that the technical determination system, including the criteria for risk categories and sanctions, will be elaborated in the forthcoming ministerial regulation. Platforms such as Meta, Google, and TikTok have expressed support for the regulation and confirmed efforts to align with its child protection requirements. Meta has stated its readiness to collaborate with the government to improve online safety for teenagers, TikTok has implemented safety measures for underage users, and Google has voiced support for compliance with the framework.
Digital policy experts have cautioned that effective implementation of PP Tunas is complex, given the multilevel risk profile, age categories, and the wide range of platforms involved. Wahyudi Djafar, executive director at Catalyst Policy Works, highlighted the challenge of balancing children’s right to access information with the need for safety, noting that social media platforms may adjust more easily than e-commerce or online gaming services, which may need separate systems for child and adult users. Experts also stress the importance of integrating parental guidance and digital literacy education in schools to complement platform obligations. Djafar noted that meaningful improvements will require coordinated efforts between government authorities, platform providers, educators, and parents, ensuring that children can safely benefit from digital technologies while minimizing exposure to risks.