The UK is calling on search and social media companies to “tame toxic algorithms” that recommend harmful content to children, or risk billions in fines. On Wednesday, the UK’s media regulator Ofcom outlined more than 40 proposed requirements for tech giants under its Online Safety Act rules, including robust age checks and content moderation measures that aim to better protect minors online ahead of the law coming into force.
“Our proposed codes firmly place the responsibility for keeping children safer on tech firms,” said Ofcom chief executive Melanie Dawes. “They will need to tame aggressive algorithms that push harmful content to children in their personalized feeds and introduce age-checks so children get an experience that’s right…