Social Media Platforms to Remove Harmful Content and Add Safeguards for Young People Under S’pore Internet Rules

SINGAPORE: Social media platforms such as Facebook, TikTok and Twitter will soon be legally required to implement community standards and content moderation processes to minimize the risk of users being exposed to harmful online content, in accordance with Singapore’s new internet rules.

They will also need to provide additional safeguards for users under 18, including tools to help users or their parents minimize their exposure to inappropriate content and unwanted interactions.

Communications and Information Minister Josephine Teo announced some details of the proposed new rules in a Facebook post on Monday, June 20.

“There is a growing global movement pushing for better online safety, recognizing that harm comes with the good when people engage on social media,” she said.

“Many countries have enacted or are in the process of enacting laws to protect users from harm online.”

Teo said Singapore’s preferred approach is to strengthen its online regulations in a consultative and collaborative manner.

“That means learning from the experiences of other countries, engaging tech companies on the latest technological developments and innovations, and understanding the needs of our people.

“These will allow us to develop technologically feasible requirements that can be applied effectively and tailored to our purpose.”

The Ministry of Communications and Information (MCI) said on Monday that it has been consulting with the technology industry since the beginning of the month, and that public consultations will begin next month.

The new Code of Practice for Online Safety and the Content Code for Social Media Services aim to codify these standards into law and give the authorities the power to take action against platforms that fail to meet the requirements.

The codes are expected to be incorporated into the Broadcasting Act after the consultations.

The Infocomm Media Development Authority (IMDA) will have the power to order social media services to disable access to harmful online content for users in Singapore.

The platforms will also be required to produce annual accountability reports, which will be published on the IMDA website.

These reports should include metrics demonstrating the effectiveness of their systems and processes.

Asked about other consequences errant platforms could face, the ministry said it was too early to elaborate, as the details are still being worked out with the tech industry.

The codes were first mentioned during the debate on the ministry’s budget in March.

Teo told parliament the codes would focus on three areas: child safety, user reporting and platform accountability.

She also said the MCI was working with the Ministry of Home Affairs to provide Singaporeans with better protection against illegal activities conducted online.

This includes strengthening Singapore laws to combat illegal online content such as terrorist content, child pornography, scams and content inciting violence. – The Straits Times (Singapore)/Asia News Network
