Blog by Ben Creet - Policy Manager (on behalf of the policy team)
Last week Dr Ellen Strickland blogged about how InternetNZ has been engaging with the Christchurch Call. Here in the Policy team, we’re a part of that work and we wanted to follow up Ellen’s blog with an update on what we’ve heard from our members, and the things we’re thinking about when it comes to regulatory change here in New Zealand.
Our members understand the issues but are concerned about overreach
In the lead-up to the Christchurch Call summit, InternetNZ hosted conversations with our member community. Based on those conversations, our member community:
- is open to a level of responsibility for platforms hosting extremist content, but would want to see infrastructure and access services like ISPs and neutral CDNs excluded
- is open to a debate addressing free expression and online harms, but not to moves that damage the open Internet
- sees respecting human rights and enabling legitimate uses of the Internet as paramount
- sees a role for InternetNZ in helping people to understand and manage risks, either instead of or alongside potential regulation
- opposes the Australian approach to livestream legislation, based on criminal liability, and believes governments struggle to make good policy for the Internet.
Helping the NZ Government implement good Internet policy
Our team is focussing on how best to work with government agencies as they start to implement regulatory changes following Christchurch. A large part of that will be scoping workable solutions that respect all New Zealanders’ rights and make the changes that are needed, without regulatory overreach.
To avoid unintended consequences and over-reach, we think more work is needed to develop carefully scoped definitions for key terms in the Call and to monitor the way those terms are used in practice. In particular:
- “Online service provider” should include platforms which allow and control sharing of content on the Internet, but not services providing Internet access or infrastructure.
- “Terrorism” and “violent extremist content” need to cover new and current threats, but be defined in a way that avoids over-reach and compromises to human rights, including through government action. We’re not the experts here, but we’re very aware of the ways other governments across the world define terrorism in ways that suppress dissidents, journalists, activists and minority communities. We want New Zealand to uphold its place as a country that respects and protects human rights, and to prevent autocracies and demagogues from pointing to us as the rationale for their next crackdown.
- The commitment to “immediate and permanent removal” needs to be developed and monitored with civil society participation, to ensure adequate regard for transparency, appeal rights, and human rights. Use of automated upload filters is likely, and heightens these risks.
Beyond the Call itself, there are proposals that service providers should have a duty of care for harmful content (examples have been put forward by the UK Government and the Helen Clark Foundation). A general duty of care would risk broad over-blocking by providers seeking to avoid legal liability, with severe impacts on rights.
We are going to think carefully and seriously about these issues, and about how to ensure that good policy changes are made: the types of regulatory change that can make us all safer while keeping our society open and inclusive. I want all New Zealanders to be able to safely use their voices, and to seek out the information they want and need to grow and develop into the people they are meant to become.
This is one of our team’s most important roles. As stewards for the Internet we need to make sure that the interests of New Zealanders, and their ability to harness the power of the Internet to better themselves and their communities, are respected and not damaged. Domestically, we expect the New Zealand Government to respond to the Call by reviewing and proposing laws relating to harmful content online in the following areas.
| Area | Our priorities for sensible regulation |
| --- | --- |
| Online content classification | Make online classification easy. Focus on providing information to help NZers reduce harms from the content. Ensure that interventions related to objectionable material do not break the Internet’s end-to-end principle, or lead to over-regulation of online content. |
| Review of hate speech laws | Consider an effective process for notice and response online. Draw on and review the model under the Harmful Digital Communications Act. |
| Social media regulation? | Consider when it is fair and efficient for carriers of content online to bear responsibility, and what types of response they can offer consistent with human rights. |
Across all of these streams of work, both domestically and internationally, InternetNZ will be championing and working to enable the inclusion of academic, technology, business and civil society experts and perspectives.
Radicalisation and extremism are complex problems that are difficult to address. Doing justice to solving these social problems will require all of us to share and welcome diverse perspectives, offering our ideas and passion, and listening to the ideas and concerns offered by others.
Watch this space
We’re putting together our analysis and aim to publish policy products setting out our recommended approaches to issues like how to define online service providers and whether New Zealand should support a duty of care for platforms. We’re doing that thinking right now, so if you have any thoughts, recommendations or views, please get in touch by emailing firstname.lastname@example.org.