Websites and apps hosting potentially harmful content will be held responsible for UK age checks

New UK age verification measures to prevent children accessing harmful online content came into force on Friday, with campaigners hailing them as a "milestone" in their years-long battle for stronger regulations.

Under the new rules, to be enforced by Britain's media watchdog, websites and apps hosting potentially harmful content will be held responsible for age checks, using measures such as facial imagery and credit card details.

Around 6,000 pornography sites have agreed to implement the curbs, according to Melanie Dawes, chief executive of British regulator Ofcom.

Other platforms such as X, which is facing a dispute over similar restrictions in Ireland, must also protect children from illegal pornographic, hateful and violent content, she noted.

"We've done the work that no other regulator has done," Dawes told BBC Radio.

"These systems can work. We've researched that," she said.

Around 500,000 youngsters aged eight to 14 encountered pornography online last month, according to Ofcom.

The long-awaited new rules, which aim to prevent minors from encountering content relating to suicide, self-harm and eating disorders, as well as pornography, stem from the 2023 Online Safety Act.

It imposes legal responsibilities on tech companies to better safeguard children and adults online and mandates sanctions for those who fall short.

Rule-breakers face fines of up to £18 million ($23 million) or 10 percent of their worldwide revenue, "whichever is greater", according to the government.

Criminal action can also be taken against senior managers who fail to ensure their companies comply with Ofcom information requests.

The measures are coming into force now after the sector and the regulator were given time to prepare.

Children will "experience a different internet for the first time," technology secretary Peter Kyle told Sky News, adding he had "very high expectations" for the changes.

In an interview with parenting forum Mumsnet, he also said sorry to youngsters who had been exposed to harmful content.

"I want to apologise to any kid who's over 13 who has not had any of these protections," Kyle said.

Rani Govender, of the child protection charity NSPCC, said it was "a really important milestone that we're finally seeing tech companies having to take responsibility for making their services safe for children".

Children are frequently "stumbling across this harmful and dangerous content," she told BBC News.

"There will be loopholes," Govender noted, insisting it was still "right that we're introducing much stronger rules to make sure that that can't continue to happen".

Prime Minister Keir Starmer's government is also considering introducing a daily two-hour limit for children on social media apps.

Kyle said he would announce more plans for regulating the sector for under-16s "in the near future".