Makes Sense co-founder Jo Robertson told RNZ that New Zealand was more than a decade behind other comparable countries.
“We are just watching and waiting, looking and seeing … and while we do, children all over the country are seeing horrific content.
“They are increasingly being groomed and exploited. This is affecting their mental health, their body image, their relationships … There are significant ramifications.”
Robertson said there was a lot of data on the damage caused.
“There has been a 550 percent increase in grooming reports on Snapchat alone, and we know that social media channels account for about 35 percent of how young people experience grooming behaviors, that is, adults seeking them out or seeking their content.
“We have a fast-growing, deeply problematic, in fact traumatizing, problem for our young people.”
Last month, the New Zealand Herald reported that police received 1549 referrals from Snapchat in 2024.
There were 617 in 2023, 275 in 2022 and 198 in 2021.
A Snapchat spokesperson told RNZ that it had introduced a new reporting mechanism last year, which accounted for the increase in referrals.
They said the platform now had “robust measures” in place.
“We use proactive detection to find and remove content that exploits minors and, if we are made aware of any such content – whether through our proactive detection technology or confidential in-app reporting tools – we remove it, block the violating account and report it to the authorities.”
A survey conducted by InternetNZ found that 71 percent of people were extremely or very concerned about children accessing harmful online content.
In March, InternetNZ chief executive Vivien Maidaborn told RNZ’s Morning Report that New Zealand was “falling seriously behind the rest of the world.”
“The government needs to prioritize changes to protect people,” said Maidaborn.
“The reason many of us are concerned about children accessing content online is because we know that our laws and processes are not fit for the online world.”
While TikTok announced new features on Thursday giving parents additional tools to manage their children’s use of the app, Robertson said social media platforms could not be trusted to keep children safe without regulation.
“My experience is that, curiously, they will say they are doing things, but that is not really the experience of people on the ground.
“I don’t know if their technology is a bit flawed or doing different things in different countries, but I have talked to Snapchat, as an example, or TikTok, you know, directly and said, ‘Hey, what’s going on here?’
“And they say, ‘Oh, we’ll fix it’ – but it is not fixed.”
Makes Sense co-founder Holly Brooker also told Morning Report that the new features were not enough.
“There is a lot of trust placed in these safety guardrails that are put in place on the platforms, and my experience is that they are just not good enough for our young people,” Brooker said.
“We have a huge systemic issue in New Zealand in how we approach online harm. We are really failing to address it.
“We do not have a legislative framework in place. We have no kind of enforcement mechanism.”