
New Ofcom codes to protect children online branded 'overly cautious'

Social media firms could be hit with heavy fines - or even banned from the UK - under new measures to protect children from online harm.

Watchdog Ofcom has published the final version of its children's codes, part of the Online Safety Act, to set out what sites must do to protect young people. It says they are a "reset for children online" and will mean "safer social media feeds".

However, some campaigners believe they don't go far enough.

Ofcom says the codes set out an obligation to protect children from content that is misogynistic, violent, hateful or abusive - as well as safeguarding against online bullying, self-harm and dangerous challenges.

Firms must carry out a risk assessment, and from 25 July the regulator says it will be able to impose fines - and in very serious cases "apply for a court order to prevent the site or app from being available in the UK". More than 27,000 children and 13,000 parents took part in research to develop the codes.

Laying out more than 40 measures and covering social media, search and gaming, they include:

- Safer feeds - Algorithms must filter out harmful content
- Effective age checks - Ofcom says the "riskiest services must use highly effective age assurance"
- Fast action - All sites must have processes to review, assess and quickly tackle harmful content
- Easier reporting - There must be a straightforward way for children to report or complain about content
- More choice and support - Kids must be able to easily block or mute accounts, and disable comments on their posts

Ofcom boss Dame Melanie Dawes said it would mean "safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content".

Technology Secretary Peter Kyle called it a "watershed moment" that would help tackle "lawless, poisonous environments" and hold social media firms to account.

However, Ian Russell, whose 14-year-old daughter Molly took her own life after seeing harmful content, said the codes were weak and left too much control with social media firms.

"I am dismayed by the lack of ambition in today's codes. Instead of moving fast to fix things, the painful reality is that Ofcom's measures will fail to prevent more young deaths like my daughter Molly's," he said.

"Ofcom's risk-averse approach is a bitter pill for bereaved parents to swallow.

Their overly cautious codes put the bottom line of reckless tech companies ahead of tackling preventable harm."

Mr Russell, who now chairs the Molly Rose Foundation, urged the prime minister to personally intervene.

Hollie Dance, the mother of Archie Battersbee - who died accidentally in a "prank or experiment.

By Tnews, 24 Apr 2025, 5 mins read