Tech giants are putting profits ahead of the mental health of children, say multiple national health bodies.
Social media age restrictions may not be in the best interests of young people, the Australian Medical Association (AMA) has argued in a new submission urging the government to introduce more pragmatic regulations.
Established in May, the Joint Select Committee on Social Media and Australian Society is investigating the potential use of age verification and the way that corporate decision-making related to algorithms and recommender systems affects mental health.
The AMA said it was particularly concerned with the way children were targeted online with content promoting smoking, vaping, gambling, alcohol and junk food.
“A major public health concern is the ease of access to purchase e-cigarettes through platforms such as Snapchat, Instagram or Facebook,” the AMA submission read.
While acknowledging that social media usage was correlated with online harassment, poor sleep, low self-esteem and poor body image in young people, the submission stopped short of endorsing a minimum age of 16 for creating social media accounts.
“The [association] is concerned increasing age-verification moves the emphasis back on the individual as a consumer, rather than the social media platforms that have control over the content children are being exposed to, using algorithms and commercial tactics to push harmful content at consumers,” it said.
The association was also concerned that children would simply lie about their age to evade a ban and, as a consequence, be targeted with content intended for older audiences.
It also pointed out that a blanket age ban would ignore what children themselves wanted.
Youth-centred research from the eSafety Commissioner found children wanted better written guidelines and boundaries, improved recognition of the positives of online spaces and less exposure to unwanted content and contact.
“Perhaps most importantly, and in opposition to the age-verification reform suggestion, is a desire from children for improved monitoring, swift action, and accountability regarding online safety practices, rather than responsibility placed purely on the user,” the AMA said.
“Children also suggest instead of reforming age-verification laws, protections should be improved with stronger technology such as digital passwords and secure apps.”
A joint submission from mental health organisations Reach Out, Beyond Blue and Black Dog Institute also warned against introducing a minimum age requirement while agreeing that social media was harming young people.
Instead, the groups said, policy should focus on promoting healthy use of social media.
“Recent research has found that using social media to passively consume content is associated with higher symptoms of depression, anxiety, insomnia, and disordered eating in adolescents,” they said.
“By contrast, using social media to communicate with people adolescents know in real life is associated with lower symptoms of depression and anxiety.”
Common themes linked to negative mental health, the groups said, included upward social comparison, exposure to age-inappropriate or distressing content, and sleep disturbance.
“Adolescents are developmentally primed to seek social connections, making them particularly susceptible to the influences of digital technology,” the three organisations said.
“Technology companies leverage these developmental drives with algorithms designed to capture and hold adolescent attention, creating challenges for their developing self-identity and emotion regulation.”
Introducing mandatory safety-by-design principles, they argued, would protect users while preserving some of the positive aspects of social media.
Under these principles, tech companies would have to limit features such as infinite scroll and move away from prioritising time spent on a platform above all else.