Childhood and Algorithms: India's Inaction
The problem is no longer theoretical
Social media’s harmful impact on children’s mental health, attention, sleep and emotional regulation is now well documented globally. The uncertainty that once justified regulatory delay has ended.
Platform design, not parental failure, drives harm
Infinite scroll, algorithmic feeds and engagement-based rewards are engineered to maximise attention and are especially damaging for children, whose cognitive control systems are still developing.
Australia signals a regulatory turning point
By barring under-16s from social media accounts and placing liability on platforms rather than parents or children, Australia has moved from digital fatalism to democratic responsibility.
India remains dangerously under-regulated
Despite widespread child exposure to social media, gaming addiction and ed-tech marketing excesses, India lacks enforceable age limits, platform accountability and meaningful safeguards.
Regulation is about timing, not prohibition
The case is not for banning technology, but for delaying exposure, protecting childhood from commercial exploitation and asserting the state’s duty to govern powerful digital systems.
DECEMBER 2025
Every serious attempt to regulate technology runs into a structural paradox first articulated by David Collingridge in his 1980 work The Social Control of Technology.
The Collingridge Dilemma holds that regulation is hardest precisely when it is most needed.
Early in a technology’s life cycle, its long-term effects are uncertain, making regulation appear premature.
Later, when harms are fully visible, the technology is already deeply entrenched, economically indispensable and politically defended, making intervention disruptive and costly.
Social media regulation today sits squarely inside this dilemma.
For much of the last two decades, governments treated digital platforms as neutral tools whose social consequences were unknowable in advance. That period of uncertainty has ended.
The effects of social media on children are no longer theoretical risks. They are documented, measured and lived.
What remains unresolved is the second half of the dilemma: whether democratic states have the political will to regulate a technology after it has become powerful.
Australia’s decision to bar children under 16 from holding social media accounts marks a significant moment in this global reckoning.
It is not flawless, but it represents something increasingly rare in technology policy: a willingness to act late, but decisively.
Childhood as an Asymmetrical Battlefield
The debate over the “right age” for social media is often framed as a matter of parental discretion or individual maturity.
This framing is comforting but deeply misleading.
It assumes a level playing field between children and platforms designed using behavioural psychology, data analytics and attention engineering.
Globally, children are entering social media ecosystems far earlier than platforms officially allow.
Despite minimum age requirements of 13, the average age at which children create their first social media account is between 10 and 11.
UNICEF estimates that one in three internet users worldwide is a child.
In India, industry estimates suggest over 175 million minors are active online.
This is not light or occasional use. According to the OECD, over 95 percent of teenagers in high-income countries use social media daily.
The average time spent exceeds three hours per day, with Indian urban adolescents often reporting even higher figures due to inexpensive data and smartphone access.
The consequences are measurable.
A 2023 study published in JAMA Pediatrics found that adolescents who spend more than three hours a day on social media face twice the risk of developing anxiety and depression.
The World Health Organization has repeatedly linked excessive screen exposure to sleep disruption, emotional dysregulation and declining attention spans.
Teenage girls are particularly vulnerable, owing to appearance-based comparison and the algorithmic amplification of appearance-driven content.
These outcomes are not accidental. They are produced by design.
Design as Policy Failure
Australia’s eSafety Commissioner has been explicit in identifying the source of harm. (India has no equivalent regulator.)
It is not only the content children encounter, but the structural features of the platforms themselves:
· infinite scroll,
· algorithmic feeds,
· engagement-driven ranking systems and
· feedback loops based on ‘likes’ and ‘shares’.
These are not neutral design choices. They are mechanisms optimised to maximise time spent and emotional arousal.
Former employees of Meta, Google and TikTok have testified that these systems are built using insights from behavioural science and neuroscience.
The 2021 disclosures by Facebook whistleblower Frances Haugen revealed internal research showing that Instagram worsened body image issues among teenage girls.
The company chose growth over reform.
Expecting children to resist such systems through individual self-control is unrealistic.
Neurologically, the prefrontal cortex responsible for impulse regulation and long-term judgment does not fully mature until the mid-twenties.
Treating children as fully autonomous digital actors is not empowerment. It is abandonment.
Australia’s Intervention and Its Significance
Australia’s under-16 social media restrictions, effective December 2025, apply to major platforms including:
· Facebook,
· Instagram,
· TikTok,
· YouTube,
· Snapchat,
· X,
· Reddit and
· Twitch.
Platforms are required to take reasonable steps to identify and deactivate underage accounts and prevent new ones from being created.
Notably, the law places no penalties on children or parents.
Liability rests entirely on companies, with fines reaching 50 million Australian dollars (roughly Rs. 275 crore) for serious or repeated violations.
This is a critical shift away from the long-failed model of parental consent and user self-certification.
Australia has also rejected the fiction that platforms can rely on honesty alone.
Companies are expected to deploy age assurance technologies including identity verification, facial or voice analysis and behavioural inference.
While privacy concerns are legitimate, the current alternative is unrestricted exposure to systems known to cause harm.
The law also draws a functional distinction between platforms.
Messaging services, educational tools and child-specific platforms such as WhatsApp, Google Classroom and YouTube Kids are excluded, recognising that not all digital interaction poses the same risks.
Importantly, the government has retained the power to revise this list as platform features evolve.
A Global Shift in Regulatory Mood
Australia’s move reflects a broader international recalibration.
France requires parental consent for users under 15.
Germany mandates parental approval for children aged 13 to 16.
Spain is raising its minimum age from 14 to 16.
Denmark’s Prime Minister has publicly stated that social media is stealing children’s childhoods.
In the United States, several states now require parental consent for under-16 accounts.
The U.S. Surgeon General has compared the mental health impact of social media on children to tobacco in the early twentieth century, widely consumed before its dangers were acknowledged.
The United Kingdom has taken a different route. Its Online Safety Act focuses on content regulation rather than age-based access.
Accordingly, platforms must prevent minors from accessing pornography, self-harm material and extreme violence, using age verification methods such as facial scans and ID checks.
Yet even in the UK, the government is consulting on whether a full under-16 ban is necessary, an implicit admission that content moderation alone may be insufficient.
India’s position is more troubling. There are no restrictions. If anything, the Prime Minister has encouraged the creation of social media reels as a source of employment.
India’s Regulatory Vacuum
In the Indian context, the information problem identified by Collingridge no longer exists.
Social media platforms have operated in India for over two decades. Their harms are neither abstract nor imported.
They are visible in schools, households and clinics.
Cyberbullying, online grooming, exposure to graphic content, algorithmic radicalisation and digital addiction are increasingly reported among Indian children.
Paediatricians and psychologists warn of rising anxiety, sleep disorders and attention deficits linked to excessive screen use.
And yet, India lacks a coherent regulatory framework for children’s social media use.
The Digital Personal Data Protection (DPDP) Act, 2023 provides only indirect protection. Section 9 requires verifiable parental consent for processing the personal data of individuals under 18.
Since social media platforms process extensive personal data, this provision theoretically restricts minors from using such platforms without parental approval.
Section 2 defines a child as anyone under 18, setting a high threshold for digital majority.
In practice, this protection is largely symbolic.
There is no mandatory age verification, and the provision operates without any enforcement mechanism.
There is no platform accountability for algorithmic harms.
Thus, Indian children remain exposed to persuasive systems without meaningful safeguards.
India Specific Harms: Education, Entertainment and Exploitation
India offers particularly stark examples of how platform driven ecosystems intersect with childhood vulnerability.
The collapse of BYJU’S is instructive. Once celebrated as an ed-tech unicorn, the company aggressively marketed educational content through Instagram Reels and YouTube Shorts.
The boundary between learning and engagement farming blurred.
Children were drawn into high pressure aspirational narratives, while parents were nudged toward expensive subscriptions through emotionally manipulative advertising.
What was presented as education increasingly resembled attention extraction.
Additionally, India has seen a surge in online gaming among minors, particularly real-money gaming and battle royale formats. Gaming addiction presents another crisis.
The Ministry of Electronics and Information Technology has acknowledged the addictive nature of these platforms, yet regulation remains fragmented.
Reports of children spending six to eight hours daily on games, suffering academic decline and psychological distress, are no longer exceptional.
Instagram Reels and short video platforms also deserve particular scrutiny.
Their algorithmic structure favours rapid emotional reward, constant novelty and social comparison.
For adolescents, this translates into compulsive use, body image anxiety and declining tolerance for sustained attention.
These are not incidental outcomes. They are the product presented to Indian children.
The Political Economy of Inaction
Why the delay? Part of the answer lies in political economy.
Social media platforms are deeply embedded in India’s digital public sphere.
They are tools of political mobilisation, commercial advertising and state communication. Regulating them risks confrontation with powerful corporate and domestic interests.
But delay carries its own costs.
A 2022 Lancet Commission warned that adolescent mental health is emerging as a global public health crisis with long term economic consequences.
Reduced educational attainment, increased healthcare burden and declining civic trust are not distant projections. They are foreseeable outcomes.
While there is no direct evidence from formal studies or surveys, it is plausible that the reading and writing skills of schoolchildren, often found to be well below their expected class levels, are weakened by distractions arising from social media use. Persistent engagement with social platforms diverts attention from academic work, potentially contributing to poorer literacy outcomes.
Each year of inaction by the government and relevant authorities deepens platform entrenchment, reinforcing the power problem at the heart of the Collingridge Dilemma.
Toward a Coherent Indian Response
The choice before India is no longer between innovation and regulation. It is between governance and neglect.
First, India must adopt clear age-based access rules for social media platforms, rather than relying solely on content moderation.
Second, platform liability must be explicit and proportionate to global revenue, not domestic turnover, to prevent regulatory arbitrage.
Third, privacy preserving age assurance mechanisms must be standardised, independently audited and legally mandated.
Fourth, digital literacy must be treated as a public good, embedded in school curricula and public health frameworks, not outsourced to platforms whose incentives are misaligned.
Finally, policy must acknowledge a moral truth long obscured by technological optimism: childhood is not a market.
Resolution
Australia’s decision will not eliminate teenage social media use. Nor will any single law. But it redraws the boundary between platforms and users, and that matters.
It recognises that the freedom to use social media without the capacity to resist its design is not freedom at all.
It accepts that democratic states have a duty not merely to enable markets, but to protect those who cannot yet protect themselves.
The most responsible response to powerful technology is not blind embrace or total rejection. It is the courage to tell children, ‘not yet’.
And that, increasingly, is what childhood requires.