
Children first see ‘unavoidable’ violent content at primary school, says Ofcom



Children first see violent online content at primary school and believe it is “unavoidable”, according to an Ofcom study.

All 247 children the watchdog spoke to said they had seen things such as adult-only video game content, fighting and verbal discrimination.

Social media and group chats were the most common way they came across the content, and many said they had seen it when they were below the sites’ minimum age.

The sharing of videos showing school fights was normal for many children, according to the study.

Others said they had seen more extreme violence – involving gangs for example – but far less often.

Some 10- to 14-year-olds said they felt pressure to watch violent content and find it “funny”, and feared being isolated if they didn’t.

Ofcom said teenage boys were the most likely to share such videos, and often did so to become more popular by attracting comments or likes – or simply to “fit in”.

Some children said they came across violence via strangers’ posts on their newsfeed or via what they called “the algorithm”.

Many felt they had little control over it and sometimes felt upset, scared or anxious.

Image: Molly Russell’s family have campaigned for better internet safety since her death in 2017. Her father has fought for sites to do more to protect young people. Pic: PA

Another study for Ofcom – by Ipsos and social research agency Tonic – said young people who had seen content about self-harm, suicide and eating disorders described it as “prolific” on social media.

The research said it amounted to a “collective normalisation and often desensitisation” of the issues.

Some children said it made their symptoms worse and others said they found out about other self-harm techniques.

A third study for Ofcom – by the National Centre for Social Research and City University – found cyberbullying occurred anywhere children interact online, with comment functions and direct messaging the main enablers.

Some children said they had been bullied in group chats after being added without their permission.

Ofcom said a key theme across all three studies was children’s lack of confidence and trust in reporting their concerns.

Those who did report said they often received only a generic response, while others found the reporting process too complex or worried about their anonymity.

Video: Esther Ghey’s message for tech firms

Campaigners have long been pushing for social media companies to do much more to prevent children from seeing harmful content.

Ian Russell has accused the firms of still “pushing out harmful content to literally millions of young people” six years after his daughter took her own life.

Molly, 14, killed herself after viewing posts related to suicide, depression and anxiety.

The mother of murdered teenager Brianna Ghey has also said mobile phones should be made specifically for children under 16 to protect them from online harms.

The Online Safety Act – which was passed last year – requires providers of online services to minimise the extent of illegal and harmful content.

However, a parliamentary committee said last month that the benefits may not be felt for some time, as full implementation of the law has been delayed until 2026.

Its report also pointed out that Ofcom would be unable to act on individual complaints and would only be able to step in if there are “systematic concerns” about a provider.

Read more:
What is the Online Safety Act and how will it be enforced?
Porn sites may have to use ID and card checks to protect kids

Reacting to the recent studies, Ofcom’s online safety group director, Gill Whitehead, said: “Children should not feel that seriously harmful content – including material depicting violence or promoting self-injury – is an inevitable or unavoidable part of their lives online.

“Today’s research sends a powerful message to tech firms that now is the time to act so they’re ready to meet their child protection duties under new online safety laws.

“Later this spring, we’ll consult on how we expect the industry to make sure that children can enjoy an age-appropriate, safer online experience.”


