Tech Firms Must Reform Algorithms to Protect Children or Face ‘Hefty Fines’: Ofcom

New guidance from the communications watchdog comes amid concerns that current moderation efforts are not enough to safeguard underage users.

Ofcom has warned that online sites, including popular social media platforms, will face hefty fines and enforcement action if they fail to reform algorithms that recommend harmful content to children.

Draft guidance published by the UK communications watchdog lists 40 safety measures, including robust age checks, to protect underage users.

In practice, that means popular social media sites such as Facebook, Instagram, and Snapchat will have to implement “highly effective age-checks.”

In some cases, this will mean preventing children from accessing the online service altogether.

“… platforms must introduce the kinds of age-checks young people experience in the real world and address algorithms which too readily mean they come across harmful material online,” said Technology Secretary Michelle Donelan.

Online platforms can verify a user’s age by accessing, with the user’s consent, the information their bank holds on record. Photo ID matching, facial age estimation, and credit card checks are also among the age assurance methods Ofcom considers highly effective.

The regulator also warned content providers that they should filter the most harmful content out of children’s feeds.

This can be done by configuring the algorithms that serve personalised recommendations to users.
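The draft codes do not prescribe how that configuration should work. As a purely illustrative sketch, a safety layer might sit between the ranking model and the served feed, suppressing items a separate harm classifier has flagged before they reach a child’s account. The item fields, harm scores, and threshold below are hypothetical stand-ins, not anything specified by Ofcom or any platform.

```python
from dataclasses import dataclass

@dataclass
class FeedItem:
    item_id: str
    relevance: float   # score from the recommender (hypothetical field)
    harm_score: float  # score from a separate harm classifier (hypothetical)

# Illustrative cut-off only; not a value specified in the draft codes.
HARM_THRESHOLD = 0.8

def filter_feed_for_minor(ranked_items: list[FeedItem]) -> list[FeedItem]:
    """Drop items flagged as likely harmful before serving a child's feed.

    A minimal sketch: real systems would combine classifier scores with
    human moderation signals and policy rules, not a single threshold.
    """
    return [item for item in ranked_items if item.harm_score < HARM_THRESHOLD]
```

In practice, platforms layer such filters with downranking, age-gating, and human review; a single threshold is only the simplest possible shape of the idea.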

Ofcom Chief Executive Dame Melanie Dawes said that the guidance goes “beyond current industry standards” and that the watchdog won’t hesitate to use the “full range of enforcement powers” to hold platforms accountable.

“In line with new online safety laws, our proposed codes firmly place the responsibility for keeping children safer on tech firms. They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that’s right for their age,” Dame Melanie said.

Enforcement

The draft Children’s Safety Codes of Practice come under the Online Safety Act, passed in October 2023. The act created a duty of care for online services to safeguard users from content that is illegal, or legal but “harmful.”

The chief executive of the NSPCC, Sir Peter Wanless, said that tech companies will be legally required to ensure the safety of their platforms for underage users.

Once approved by Parliament, the Codes will come into effect and Ofcom can begin enforcing the regime.

“To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines—step up to meet your responsibilities and act now,” said Dame Melanie.

Ofcom highlighted evidence of how children are influenced by harmful content online, suggesting that current moderation efforts are not enough.

According to research, 62 percent of children aged 13 to 17 report encountering online harm over a four-week period. Many underage users say that encountering harmful material, including content promoting suicide or self-harm, is an “unavoidable” part of their lives online.

Bereaved Families

Ian Russell, whose daughter Molly took her life aged 14 after viewing disturbing content on social media, said that more needs to be done to protect children.

“The regulator has proposed some important and welcome measures, but its overall set of proposals need to be more ambitious to prevent children encountering harmful content that cost Molly’s life,” Mr. Russell said.

Speaking on BBC “Breakfast” on Wednesday, Mr. Russell said that tech companies have been slow to move on stricter controls.

Another parent, whose 13-year-old son died after taking part in a dangerous social media challenge, argued that platforms like TikTok should not feed such content to under-18s.

Esther Ghey, mother of 16-year-old Brianna Ghey, who was murdered in 2023 in a premeditated attack by two teenagers, has been campaigning to ban social media apps for all under-16s.

One of her daughter’s killers had accessed a dark web app to watch torture and snuff videos in “red rooms.” Ms. Ghey told the BBC that she believes social media algorithms are “brainwashing” young people.

Ofcom plans to publish the final Children’s Safety Codes of Practice within a year.

“Services will then have three months to conduct their children’s risk assessments, taking account of our guidance, which we have published in draft today,” the regulator said.
