All children interviewed by the media watchdog had viewed violent content on the internet

Research from the media watchdog reveals that violent online content is now “unavoidable” for children in the UK, with many first encountering it while still in primary school.

Every British child interviewed for the Ofcom study had viewed violent material on the internet, ranging from videos of local school and street fights shared in group chats to explicit and extremely graphic violence, including gang-related content.

Children were aware that even more extreme material was accessible in the deeper parts of the web but had not actively sought it out themselves, according to the report.

The findings led the NSPCC to accuse tech platforms of neglecting their duty of care to young users.

Rani Govender, the charity’s senior policy officer for child safety online, said: “It is deeply concerning that children are telling us that being unintentionally exposed to violent content has become a normal part of their online lives.

“It is unacceptable that algorithms are continuing to push out harmful content that we know can have devastating mental and emotional consequences for young people.”

The research, conducted by the Family, Kids and Youth agency, is part of Ofcom’s preparation for its new responsibilities under the Online Safety Act, passed last year. This act granted the regulator the authority to take action against social networks that fail to protect their users, especially children.

Gill Whitehead, Ofcom’s online safety group director, stated, “Children should not feel that seriously harmful content – including material depicting violence or promoting self-injury – is an inevitable or unavoidable part of their lives online.

“Today’s research sends a powerful message to tech firms that now is the time to act so they’re ready to meet their child protection duties under new online safety laws. Later this spring, we’ll consult on how we expect the industry to ensure that children can enjoy an age-appropriate, safer online experience.”

“There’s peer pressure to pretend it’s funny,” said one 11-year-old girl. “You feel uncomfortable on the inside, but pretend it’s funny on the outside.” Another 12-year-old girl described feeling “slightly traumatized” after being shown a video of animal cruelty: “Everyone was joking about it.”

Many older children in the study “appeared to have become desensitized to the violent content they were encountering.” Professionals also expressed particular concern about violent content normalizing violence offline, noting that children tended to laugh and joke about serious violent incidents.

On some social networks, exposure to graphic violence originates from the top. Recently, X, formerly known as Twitter before its acquisition by Elon Musk, removed a graphic clip showing sexual mutilation and cannibalism in Haiti after it had gone viral. Musk had reposted the clip himself, tweeting it at news channel NBC in response to a report by the channel that accused him and other right-wing influencers of spreading unverified claims about the chaos in the country.

While other social platforms offer tools to help children avoid violent content, the support they provide is limited. According to the researchers, many children as young as eight said they knew they could report content they didn’t want to see, but they did not trust the system to work.

Concerns about reporting private chats stemmed from fears of being labeled as “snitches,” which could lead to embarrassment or punishment from peers. Additionally, there was a lack of trust that platforms would effectively penalize those who posted violent content.

The emergence of powerful algorithmic timelines, such as those on TikTok and Instagram, added another layer of complexity. Children believed that if they spent any time on violent content (e.g., while reporting it), they would be more likely to receive recommendations for similar content.

Professionals in the study expressed worry about the impact of violent content on children’s mental health. In a separate report released on Thursday, the children’s commissioner for England revealed that more than 250,000 children and young people were awaiting mental health support after being referred to NHS services. This means that one in every 50 children in England is on the waiting list. For those children who accessed support, the average waiting time was 35 days. However, in the last year, nearly 40,000 children experienced a wait of more than two years.

A spokesperson from Snapchat stated, “There is absolutely no place for violent content or threatening behavior on Snapchat. When we discover this type of content, we act swiftly to remove it and take appropriate action on the account responsible.

“We provide easy-to-use, confidential, in-app reporting tools and collaborate with law enforcement to aid their investigations. We endorse the objectives of the Online Safety Act to safeguard individuals from online harms and remain actively engaged with Ofcom on the act’s implementation.”

Meta was approached for comment; X declined to comment.