The Economic Survey 2025–26 has reopened a sensitive but increasingly unavoidable debate: how should governments regulate children’s access to social media and digital advertising? By recommending age-based limits and tighter controls on platforms, the Survey signals that India may be moving towards a more interventionist approach to online child safety—one that could significantly affect global technology companies operating in their largest user market.
What the Economic Survey is proposing
The Survey calls for the government to consider age-based access limits for social media, especially for younger users who are more vulnerable to compulsive use and harmful content. It does not explicitly recommend a blanket ban, but stresses that platforms should be made responsible for:
- Robust age verification mechanisms
- Age-appropriate default settings
- Restrictions on auto-play features and addictive design
- Limits on targeted digital advertising to children
The underlying concern is “digital addiction”, driven by platform designs that maximise screen time and expose children to violent, sexual, gambling-related, or otherwise harmful content.
Why children’s online safety has become a policy concern
Children form one of the fastest-growing segments of social media users in India. Algorithm-driven feeds, infinite scroll, and targeted advertisements can shape behaviour at a formative age, affecting mental health, attention spans, and social development. International research and regulatory experience increasingly link excessive screen exposure with anxiety, cyberbullying, and reduced wellbeing among minors.
For India, the issue is magnified by scale: even marginal regulatory changes could affect tens of millions of young users and reshape platform business models.
India’s existing legal framework on children and data
India’s data protection framework already lays the groundwork for stronger regulation. It requires verifiable parental consent before the personal data of users below 18 years can be processed, and it explicitly prohibits behavioural tracking and targeted advertising directed at children. Although the framework has been notified, it is yet to be fully operationalised, leaving enforcement gaps that the Survey now seeks to address through clearer age-based rules.
States stepping ahead of the Centre
Interestingly, the push is not limited to the Union government. States such as Andhra Pradesh and Goa have publicly explored the idea of banning social media access for children. While such state-level bans raise constitutional and enforcement questions, they reflect growing political consensus that unchecked digital exposure poses real risks to child welfare.
The Australian precedent and why it matters
The Survey draws attention to Australia, which has become the first country to enforce a minimum age of 16 for social media use. Under its new law, platforms such as Meta’s Instagram, Google’s YouTube, and Snap are required to identify and deactivate accounts belonging to under-16 users and prevent new ones from being created.
The legislation obliges companies to take “reasonable steps” to prevent circumvention, while also providing mechanisms to correct errors so that legitimate users are not unfairly excluded. The Australian government has justified the law on the grounds of protecting young users from cyberbullying, addictive design features, and harmful content—concerns echoed in India’s Economic Survey.
Global trend towards tighter regulation
Australia’s move is already influencing debates elsewhere. Several countries in Europe, including France, are considering similar restrictions. If India follows even a partial version of this approach, it would add momentum to a global regulatory shift that challenges the long-held assumption that online platforms should self-regulate children’s safety.
Implications for technology companies
Age-based limits and advertising restrictions strike at the core of platform economics. Children and teenagers represent a critical user base that fuels engagement metrics and long-term brand loyalty. Mandatory age verification, default content restrictions, and bans on targeted ads would force companies to redesign products and potentially sacrifice revenue in markets like India, where user growth is central to their global strategy.
Challenges in enforcement and design
Implementing age limits is not straightforward. Verifying age reliably without collecting excessive personal data, safeguarding privacy, preventing circumvention through fake credentials, and preserving children’s access to education and social connection all remain unresolved challenges. The Survey’s emphasis on simpler devices, education-only tablets, content filters, and usage limits suggests a layered approach rather than outright prohibition.
What to note for Prelims?
- Economic Survey 2025–26 recommendation on age-based social media limits.
- Ban on behavioural tracking and targeted ads for children in India’s data protection framework.
- Australia’s minimum social media age of 16.
- Concept of “digital addiction” among children.
What to note for Mains?
- Discuss the need for regulating children’s access to social media in India.
- Examine the balance between child protection, privacy, and freedom of expression.
- Analyse challenges in enforcing age-based digital regulations.
- Evaluate the implications of such regulations for Big Tech and the digital economy.
