A Government agency is calling for the creation of a new industry-wide regulatory body to protect New Zealanders from harmful content on social media and other digital platforms.
“Child protection and consumer safety for media and online content is not as strong as it should be in Aotearoa. The existing regulatory system is decades old and predates media platforms such as social media,” said Suzanne Doig, general manager for policy at the Department of Internal Affairs (DIA).
The DIA says the new body would bring all content-sharing platforms into one cohesive framework with consistent standards.
The aim is to reduce exposure to harmful content among children and other vulnerable communities across New Zealand, who according to the DIA "are being exposed to harmful content and its wider impacts more than ever before".
A research report by the Classification Office in June 2022 found that 83% of respondents were concerned about harmful or inappropriate content on social media, video-sharing sites and other websites - all of which fall into a grey area where regulation is currently unenforceable.
“We need a modernised and consistent approach to the obligations of content providers and a much greater emphasis on the safety of children, young people, and vulnerable groups from illegal and unsafe content. We also need a system that is easier for users to navigate if they need to report harmful content.”
The new proposals would see all forms of media - from traditional broadcasters and publishers to social media giants and other online platforms - adhering to new codes of practice set out by the new regulator.
Currently, social media and other online platforms are not required under New Zealand law to meet any safety standards on their services, while other parts of the broader media industry rely on voluntary compliance with accepted standards.
Online platforms and companies wouldn't need to be based in New Zealand to fall under the regulator's remit; their inclusion would instead be based on their user or audience base here.
Any codes and standards would be designed in conjunction with industry groups, and backed up by legislation that would set clear expectations and objectives for platforms to meet, aligning with New Zealand values.
Doig said the new proposals aren't designed to limit anyone's freedom to create and post content, and won't be able to tackle the creation of harmful content at its source.
The proposal document states "it's not feasible to prevent all harmful content from being created, but it is possible to reduce unwanted exposure to harmful content and improve how we respond".
The new regulator wouldn't come at the expense of other consumer rights.
“It’s important to get these proposals right. We want to create safer platforms while preserving essential rights like freedom of expression, freedom of the press, and the benefits of media platforms," Doig said.
This proposal doesn't seek to change definitions around what is currently deemed illegal, or objectionable material, which are already subject to criminal and civil penalties.
But it does propose giving the regulator powers to deal with other illegal content, such as harassment or threats to kill.
It would also bring New Zealand in line with international standards set in the EU, the United Kingdom, Australia and Ireland, and better handle ever-evolving technology.
“Safer Online Services and Media Platforms would be a step towards building Aotearoa’s capacity to keep up with changing technologies. It’s time to create a system that better responds to and considers the needs of all New Zealanders," Doig said.
The proposal document is now open to public consultation and feedback.