Public Consultation: Safer Online Services and Media Platforms

Consultation is now open on a proposed approach to regulating online services and media platforms in New Zealand. You can read more about the background to this work here.

Read on to download the discussion document and provide your feedback on the Department’s proposal.

Submissions close Monday 31 July 2023 NZST.

Your feedback on the proposed approach will shape the next stage of this work

We are seeking feedback on the proposed changes. We want to know what the public thinks of the proposed approach and whether the proposals meet New Zealanders’ expectations for a safer, more effective regulatory system.

Discussion document and supporting information

Executive Summary

The executive summary provides a short overview of the discussion document.

Fact Sheet

The factsheet is an overview of the proposals in the full discussion document.

How can I give my feedback?

The discussion document has a list of focus questions. To make responding easier, the questions follow the structure of the document. You can have your say by reviewing the discussion document and making a submission through one of the options below.

You don’t have to answer all the questions or use our submission form, and you can share your feedback on any part of the document.

You can submit your feedback:

The closing date for submissions is Monday 31 July 2023 NZST.

Information Sessions

We’ll also hold information sessions in June and July 2023 to give people the opportunity to ask us questions about the proposals. These would be helpful for anyone who wants to clarify parts of the proposals before submitting feedback.

You can register to attend an information session through the links below.

Information session details

Next steps

What happens after you have sent your submission?

The Department of Internal Affairs will collect all submissions sent by the closing date.

These submissions will be analysed and will feed into high-level policy proposals for the Government to consider before it makes decisions on the detailed requirements.

We may contact submitters directly if we require clarification on their submission or would like further information from them.

Release of information

We will keep all submissions. We may publish the submissions we receive and provide a summary of them on our website, www.dia.govt.nz. This would include your name or the name of your group, but not your contact details.

Submissions may be subject to a request to us under the Official Information Act 1982. We can withhold personal details under this Act, including names and addresses. If you or your group do not want us to release any information contained in your submission, you need to make this clear in the submission and explain why. For example, you might want some information to remain confidential because it is commercially sensitive or personal. We will take your request into account.

The Privacy Act 2020 governs how we collect, hold, use and disclose personal information about submitters and their applications. Submitters have the right to access and correct personal information.

Why do we need a new approach?

Our main pieces of legislation that directly regulate content are over 30 years old. Their core features are still relevant (e.g., codes of practice, protecting children from age-inappropriate content and censoring the most abhorrent words and images). But the legislation has significant gaps: it doesn’t cover the wide range of harms people are experiencing across online services and media platforms. The system can’t keep up with new technologies, and we’ve been relying on slow, reactive interventions, which only take effect after people have already been harmed. It’s time to reset the system.

We are proposing to change how online services and media platforms are regulated. The most significant change concerns the way social media platforms are regulated.

Our objective is to design a framework for safer online and media experiences across all types of platforms. We think this will:

  • provide better consumer protection for all New Zealanders and their communities by setting safety-based outcomes and expectations for platforms
  • better protect children, young people, and other vulnerable New Zealanders
  • reduce risk and improve safety without detracting from essential rights like freedom of expression and freedom of the press
  • promote a safe and inclusive content environment while remaining consistent with the principles of a free, open, and secure internet.

Contact us

If you have any questions, please email us at sosmp_consultation@dia.govt.nz

You can subscribe to our mailing list to receive updates on this work’s progress (you can unsubscribe at any time). Subscribe here.

Frequently asked questions

The Executive Summary (PDF, 1.4MB) and Factsheet (PDF, 2.2MB) cover the main aspects of the proposals. The answers below address further questions you may have.

How will the proposals impact me?

Overall, the proposals aim to improve users’ experiences of content – especially online. You’ll likely have fewer experiences of unintentional exposure to the most harmful content, especially while browsing social media, and platforms would be less likely to recommend harmful content to you. You could see more warning labels and advice while you’re using media platforms. If something goes wrong, the new approach would provide you with better access to complaints processes.

For the most part, though, many of the changes in the proposals are focused on what happens ‘behind the scenes’: platforms’ processes and systems.

What about people who create content online, like content creators or influencers?

Under this proposed approach, people who share content on social media, like influencers and content creators, shouldn’t see much change while using online platforms. However, this will depend on the platform’s approach to reducing harm on their services. For example, content creators may see more prompts if a platform’s system thinks they are trying to post something that potentially breaches the platform’s community standards.

Content creators will still need to comply with the safety requirements that platforms set, and most major social media platforms already have these in place. While some of those settings might be adjusted, the regulator would not have powers over individual pieces of legal content.

Why is it important for us to regulate platforms?

Our proposal is to regulate platforms because they are the ones who control the experiences users have on their services. Platforms can determine what a safe experience on their service looks like for their users, including creators and sharers of content.

Recent events like the livestream of the Christchurch terror attacks have shown why it is important for us to get this right. While sharing the video of the attack was and will continue to be illegal, we know that this footage, and clips of this footage, had already been pushed to New Zealanders across multiple online platforms before it was taken down.

We also know that some easily available online material can have serious mental health impacts, for example suicide or self-harm related content on social media being promoted to those who are already vulnerable. This is reflected in what young people in New Zealand tell us about how algorithms will often actively push content that makes light of self-harm to them. These issues are not unique to New Zealand. A coroner in the United Kingdom recently ruled that negative effects of online content on Instagram and similar sites contributed to the death of a British teenager in 2017.

During the targeted engagement phase of the programme, we also repeatedly heard concerns about how social media is used to harass, bully, or otherwise harm people. This concern was consistent across our engagement with Māori, young people, advocates for children, women, older people, disabled people, LGBTQIA+ communities, and communities that have been or are targets of ethnic, racial and/or religious discrimination.

These are real harms that are happening to New Zealanders. That’s why we need to get these settings right.

What do you mean by ‘harm’? How will you define it in the regulatory framework?

When we talk about harmful content, we are talking about content that negatively impacts users’ physical, social, emotional, and mental wellbeing, or causes loss or damage to rights and property. We are not talking about feeling offended, although content that is harmful will often also cause offence.

We are not planning to define harm, or what ‘counts’ as harm, in the law. Instead, we intend to lift the bar for safety practices on platforms through clear objectives and expectations for platforms in codes of practice. This is because the types of harm that New Zealanders might experience from content are wide-ranging, so rather than create a catch-all definition, we want a framework that requires platforms to have policies, systems and processes in place that reduce the risk of harm on their services.

The exception to this is for illegal material – a threshold for harm already exists for this type of content, which includes terrorist content and child exploitation material, and we would carry this threshold into the new framework.

Why can’t we just get rid of everything that is harmful?

What’s harmful to one person is not necessarily harmful to another. This means that removing content is unlikely to be the best way to address potentially harmful content and maintain freedom of expression.

It’s also impractical to remove everything that is potentially harmful. The sheer amount of content that is made and the continued evolution of how it is accessed means that content must be managed at a system level by industry; it is not feasible to reduce harm by reacting to individual pieces and types of content, unless that content is, or could be, illegal (such as terrorist content or child exploitation material).
