At a glance: Singapore's proposed rules to reduce online harm


SINGAPORE - New online safety measures that take aim at harmful content on social media platforms are set to be rolled out as early as 2023, with most respondents in a public consultation exercise supporting the two new codes of practice.

The measures - proposed under the Code of Practice for Online Safety and the Content Code for Social Media Services - aim to keep harmful content away from local users, especially the young. They will also grant the authorities powers to take action against platforms that fail to comply.

The new codes are expected to be added to the Broadcasting Act if they are passed in Parliament.

These are the key points from the proposals by the Ministry of Communications and Information (MCI):

1. Tools to protect young users

Under the proposed rules, social media services will have to provide tools that allow parents and guardians to manage the content that a young user can encounter online and limit any unwanted interactions.

The tools will prevent others from seeing young users' account profiles and the posts they upload, and limit who can interact with their accounts.

MCI proposed that the tools be activated by default on platforms that allow users below the age of 18 to sign up for an account. Platforms should also warn young users and their parents of the potential risks should they choose to weaken the settings.

Social media platforms should also provide safety information that is easy for young users to access and understand, including guidance on how young users can be protected from harmful content and unwanted interactions.

2. Platforms expected to sweep content for online harms

The platforms will be expected to moderate users' exposure to harmful content, or disable access to it, when users report it.

The reporting process should be easy to access and use, and platforms should assess reports and take action "in a timely and diligent manner".

Platforms will also be required to proactively detect and remove any content related to child sexual exploitation and abuse, and terrorism.

Tools for users to manage their own exposure to unwanted content and interactions should be implemented as well. These will allow users to hide unwanted comments on their feeds and limit their interactions with other users.

Safety information, such as details of local support centres, should be easily accessible to users. Helplines and counselling services should be pushed to users who search for high-risk content, such as material related to self-harm and suicide.

3. IMDA to be empowered to direct content removal

The proposals would grant the Infocomm Media Development Authority (IMDA) powers to direct any social media platform to disable access to certain content for users in Singapore, should such content slip through the cracks.

This could include content relating to public health and security, self-harm, or posts that risk stirring racial or religious tension.

Once the Broadcasting Act is amended to empower IMDA, the regulator will also be able to direct platforms to stop specified online accounts from interacting with local users.

4. Clamping down on online harms

The proposed rules require social media platforms to implement community standards for six types of content: sexual content, violent content, self-harm content, cyberbullying content, content that endangers public health, and content that facilitates vice and organised crime.

The platforms will be expected to moderate users' exposure to these types of content, or disable access to them, when users report them.
