Online safety law to protect Singapore users, especially children, from harmful content takes effect

The law was passed in Parliament in 2022 to safeguard users online.

The Online Safety (Miscellaneous Amendments) law officially took effect on Feb. 1, 2023. But what changes does it bring?

Announced and passed in 2022
The law was first introduced during the Committee of Supply (COS) debate in March 2022 and passed in Parliament on Nov. 9, 2022. Building upon the existing Broadcasting Act (BA), the new law holds social media platforms—referred to as Online Communication Services (OCS)—accountable if they fail to protect Singaporean users from harmful online content.

Platforms that fail to comply could face fines of up to S$1 million or have their services blocked in Singapore.

Targeting Online Communication Services
OCS includes electronic services that allow users to communicate or access content via the internet. Among these, platforms with “significant reach” are categorized as Regulated Online Communication Services (ROCS) and must follow a Code of Practice (COP), which sets measures to limit exposure to harmful content.

Social Media Services (SMS) built around user interaction, such as Facebook, Instagram, YouTube, and TikTok, also fall under this regulation.

Large groups under scrutiny
Private messaging groups with "very large memberships" on services such as WhatsApp or Facebook Messenger are also subject to the Infocomm Media Development Authority (IMDA)'s oversight. If such groups spread harmful content, the IMDA can direct the platforms to disable access to that content or block the accounts involved.

Examples of harmful content include advocacy of suicide, self-harm, sexual violence, terrorism, child exploitation, and anything that risks public health or racial and religious harmony in Singapore.

If platforms fail to comply, the IMDA can instruct internet service providers to block access to the non-compliant platforms, and impose fines of up to S$1 million for continued breaches.

Protecting young users
The law emphasizes safeguarding young Singaporeans by minimizing exposure to inappropriate content. Platforms will be required to offer differentiated accounts for children, with stricter default safety settings tailored to age-appropriate use.

These measures aim to empower Singaporean users with tools to manage their safety while holding online services accountable for harmful content. The government will intervene when content threatens racial and religious harmony or other societal values.
