6 in 10 young people exposed to unsolicited sensitive content online: Study

SINGAPORE - Six in 10 young people reported being exposed online to sensitive content they did not search for, with many suffering sustained emotional impact, according to a new study on online dangers and well-being.

These forms of sensitive content include body image-related content, graphic violence, nudity and sexual activity, unhealthy eating behaviour, and gender-based hateful or derogatory content. 

Findings from the study by non-profit SG Her Empowerment (SHE) and Global Shapers Singapore Hub, a network of youth leaders under the age of 30, were released at an online panel discussion on Feb 8.

A total of 500 Singapore citizens and permanent residents aged 16 to 35 were surveyed.

Speaking on the panel were Mr Eric Chua, Senior Parliamentary Secretary for Culture, Community and Youth, and for Social and Family Development; Meta safety policy manager Dr Priyanka Bhalla; and Global Shapers Singapore Hub vice-curator Calissa Man. The panel was moderated by SHE chief executive Simran Toor.

Aside from exposure to sensitive content, the Safeguarding Online Spaces Study covered young people's perceptions of cancel culture, generative artificial intelligence, and safety tools and avenues of recourse.

Of the respondents who said they had seen unsolicited sensitive images, 68 per cent said they were upset after viewing the content, and half of this group felt so for at least a few hours.

One male respondent recalled a live stream of two boys in China having their arms hacked off appearing on his feed, saying: “I don’t watch violent content on these platforms since I stick to funny, PG (Parental Guidance) topics most of the time. When this video was recommended in my feed, it was quite horrifying for me.”

Yet three in 10 did not take any action to address what they saw. About two-thirds said they blocked or restricted the account that shared the content, or reported the content or account to the platform concerned.

Meta’s Dr Bhalla said that community standards, which determine what stays on the platform, are not set in stone.

“It’s a very tricky balance for us because these are global standards, and so we want to make sure that there’s freedom of expression when it comes to young people’s voices, but at the same time, we want to make sure that there’s safety and respectful behaviour online. In addition, we have to take into account cultural context and nuance as well,” she said.

She added that Meta, which owns and operates Facebook and Instagram, tracks violations on its platforms through user reports and a team of some 40,000 human reviewers who act on breaches of the community standards. The company also uses machine learning to detect harmful images before users see them.

On the issue of young people being exposed to sexual images online without searching for them, Mr Chua said this is especially worrying, since the average age of children in Singapore getting their first Internet-connected device is eight, compared with the global average of 10.

“Our young ones start roaming around on the online world much younger and the chance occurrence, as the study has noted, whether you seek it (sexually explicit material) out or not, you will chance upon it and that’s almost a certainty,” he said to the 62 attendees, who included students, parents, educators and people in the tech industry.

Yet, Mr Chua noted that social media and technology are not inherently evil. He applied the saying “fire is a good servant but a bad master” to online and social media activity.

He said: “Our approach has got to be one of understanding the challenges and also the strengths, the assets that we have… and thereafter taking mindful action in response.”

Online harms have come under the spotlight amid various efforts by the authorities to clamp down on them. These include the Online Safety (Miscellaneous Amendments) Act, which took effect in February 2023 and gives the authorities the power to direct social media platforms here to remove egregious content such as posts advocating suicide, child sexual exploitation and terrorism.

The Protection from Harassment Act has also been amended to cover doxxing and make it easier for victims of online harms to claim remedies.

Harmful images were a key concern among young people, the study found.

Almost half, or 48 per cent, of respondents selected image-based sexual abuse as one of their top three online concerns. Among females, 57 per cent chose this option, making it their most commonly selected concern; among males, doxxing was the most common pick, chosen by 47 per cent.

Other options included impersonation or identity theft, defamation or falsehoods, hate speech and sexual harassment.

Respondents were also asked whether a woman should accept all forms of comments directed towards her, including disrespectful ones, if she uploads an image of herself.

Just over a quarter of respondents agreed with the statement, with males disproportionately represented: 37 per cent of male respondents agreed, compared with 16 per cent of females.

On cancel culture, four in 10 of the young people said they had cancelled others, primarily because they felt that what the cancelled person did was wrong.

Yet, seven in 10 said the fear of being cancelled shapes their behaviour online, for example, by causing them to refrain from sharing personal views.

A total of 86 per cent of those who had witnessed someone encouraging others to cancel a person did not report the instigator's actions. This was even as two-thirds of them felt it was wrong to encourage others to cancel someone.

The Safeguarding Online Spaces Study is available at https://she.org.sg/news/study-safeguarding-online-spaces
