Social media platforms like Facebook, X, TikTok and Instagram are about to go through some key changes as new online safety rules are set to be enforced. These have been triggered by the UK’s Online Safety Act and will impact everything from content moderation to how these platforms handle illegal and harmful material.
The new rules will change how you experience these platforms in several ways. With deadlines set for 2025, users can expect safety across social media to improve gradually, while companies that fail to comply could face significant fines.
Here’s what you need to know about the upcoming updates and how they might change your experience on social media.
New safety duties for social media companies
Under the Online Safety Act, tech firms are now legally required to take action to tackle illegal content such as hate speech, child sexual abuse, terrorism, and online fraud. Ofcom, the UK’s communications regulator, has published new codes of practice that social media platforms must follow, with the stated aim of making the online environment safer for everyone, especially children.
These changes are set to take effect over the next few months, with platforms required to complete risk assessments by March 2025.
Key changes to expect
- Senior accountability for safety: Each social media platform must appoint a senior person responsible for compliance with the new safety rules, ensuring that the company is held accountable for its actions in removing harmful content.
- Improved content moderation: Platforms will need to strengthen their moderation teams and improve the reporting process. This means quicker removal of harmful content, like illegal suicide material or cyberbullying, and better reporting tools for users.
- Child protection measures: Platforms will need to make changes to protect children from online predators, including blocking non-connected accounts from messaging children and hiding personal details from strangers. They will also have to use automated tools to detect and remove child sexual abuse material.
- Fraud and scam protection: Social media firms will be required to provide a dedicated reporting channel for fraud experts, enabling them to flag scams quickly. Platforms must take action against fraudulent accounts and activity.
- Tackling online harassment: Women and girls, in particular, will benefit from new protections. Social media sites must make it easier for users to block and mute harassers, and quickly take down non-consensual intimate images.
- Removal of terrorist accounts: Ofcom says it is “very likely” that posts generated, shared, or uploaded via accounts operated on behalf of terrorist organisations proscribed by the UK government will amount to an offence. Sites and apps will be expected to remove users and accounts in this category to combat the spread of terrorist content.
What’s next?
From March 2025, social media firms must start implementing the required safety measures, and Ofcom will be actively monitoring compliance. Companies that fail to meet the new regulations may face penalties, including fines of up to £18 million or 10% of their global revenue, whichever is greater.
The new rules are part of a broad package of safety measures and mark a significant step towards improving online safety. Social media users in the UK can expect a safer online experience, but it will be up to platforms to take immediate action to comply with these new rules.
Dame Melanie Dawes, Ofcom’s Chief Executive, said: “For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.
“The safety spotlight is now firmly on tech firms and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.
“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”