2019 in Internet policy: Facing extremism, Internet filtering, and platform responsibility in a very tough year for New Zealanders online
Kim Connolly-Stone
2019 has been a challenging year for the Internet on many fronts. InternetNZ has been reflecting on what it means to help New Zealanders harness the power of the Internet to work, play, connect, and learn. Our policy team works to understand and lead conversations on the big policy problems affecting the Internet in New Zealand.
Following the devastating attacks on Christchurch mosques on March 15, we have been at the forefront of advocating for an Internet that is free, open, safe and secure for all New Zealanders. As part of this, we have been working hard to develop policy guidance to help the Government achieve its goals in responding to the attacks.
The New Zealand Government announced a programme of domestic law reform to counter violent extremist content online. This will initially include targeted changes to the Films, Videos and Publications Classification Act 1993, which will be followed by a broader review of media content regulation. We are also expecting to see policy work on the issue of hate speech.
The InternetNZ policy team has already made a number of contributions to the domestic policy process. In addition to responding to government calls for submissions, we are thinking ahead about the issues and ideas likely to be in contention. We have hosted events and published discussion starters around these issues to help shape the emerging thinking. Our aim in doing this is to ensure that government policy and regulation fully consider the implications for the Internet.
We are working with government agencies on ways to help New Zealand respond to an event like the livestream of the Christchurch terror attack
We attended the Department of Internal Affairs (DIA) workshops on countering violent extremist content online and made a submission on the Government’s proposed changes to the Films, Videos and Publications Classification Act. Our understanding is that the proposed changes are intended to be non-controversial ones that can be made fairly quickly. But the problems around violent extremism online are complicated, and crafting effective solutions that New Zealanders can understand and trust requires time, thinking, and open discussion. We have concerns about the tight timeframes for this work and the lack of a more open process that allows the broader community to participate. We agree that the proposal for quicker interim decisions by the Censor makes sense and can be implemented quickly. But the proposals to include livestreaming in the definition of "publication", to create new content takedown processes, and to consider penalties and safe harbours for intermediaries all require more thinking.
We are urging the Government to approach Internet filtering with caution
Internet filtering is one of the policy options being considered by the Government, and it has been proposed not just to address violent extremist content online, but also for gambling and pornography. To help New Zealanders understand the issues and options, we released "To block or not to block", our guide for government policymakers on the technical and policy considerations of Internet filtering. The purpose of this document was to explain practically how Internet filtering works and to share our understanding of its effectiveness and its impact on Internet openness and security. We suggest that blocking at the ISP level is a blunt tool that is likely to have unintended effects on how most New Zealanders use the Internet while failing to effectively address the behaviours it targets.
The Government’s proposed change to the Films, Videos and Publications Classification Act includes a proposal for a web filter. At consultation meetings hosted by DIA, we heard concerns from a range of community groups about the idea of government Internet filtering on potentially controversial issues. Our submission to DIA talks about the risks of blocking legitimate content, human rights issues, the ease of circumvention, and the increased risk of security vulnerabilities that come from workarounds. We also highlight that the most severe concerns around extremist content online relate to content posted or shared on large social platforms, and these harms cannot be effectively addressed by a technical filter at the ISP level.
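To make the circumvention point concrete, here is a minimal sketch (not taken from our guide; it assumes the third-party dnspython library and a hypothetical domain name) of how a DNS-level filter applied at an ISP's resolver can be sidestepped simply by asking a different resolver for the same name:

```python
# A minimal sketch, assuming the third-party dnspython package and a
# hypothetical filtered domain ("blocked.example" is an illustration, not a
# real entry on any filter list). It shows why DNS-level filtering applied by
# an ISP's resolver is easy to circumvent: the same query can be sent to a
# resolver outside the ISP's control.
import dns.resolver

DOMAIN = "blocked.example"  # hypothetical domain on an ISP's filter list

# The system/ISP resolver: if the ISP filters at the DNS level, this query may
# return a block-page address, an NXDOMAIN, or a refusal.
isp_resolver = dns.resolver.Resolver()  # uses the nameservers configured by the OS

# A public resolver (Cloudflare's 1.1.1.1) that is not subject to the ISP's
# filter list, queried in exactly the same way.
public_resolver = dns.resolver.Resolver(configure=False)
public_resolver.nameservers = ["1.1.1.1"]

for label, resolver in [("ISP/system resolver", isp_resolver),
                        ("public resolver", public_resolver)]:
    try:
        answer = resolver.resolve(DOMAIN, "A")
        print(label, "->", [record.to_text() for record in answer])
    except Exception as exc:  # e.g. NXDOMAIN, a refusal, or a block response
        print(label, "->", type(exc).__name__)
```

Filters that work at other layers, such as IP blocking or deep packet inspection, have their own workarounds (VPNs, encrypted DNS), which is part of why we describe ISP-level blocking as a blunt tool.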
We are looking at how social media companies should be held responsible for harm on their platforms
In October, we followed up with a one-day workshop with government officials, legal experts, social media representatives, and other people interested in Internet filtering. We also talked about the idea of a duty of care for online services, which has been proposed in a number of quarters. We have reflected on the kōrero from that session and produced a conversation starter on duty of care: a collaborative Google Doc exploring what a duty of care is and how it might work. We are inviting contributions to the doc, and you can also email us with your views at policy@internetnz.net.nz. Next year we will evolve the document into a position paper on whether New Zealand should adopt a duty of care for online services.
We are exploring what Internet openness means in 2019 and beyond
Concepts of Internet openness are an important part of the conversation about online harms. We released a discussion starter, "Internet openness: What it is and why it matters". Openness is what has allowed the Internet to evolve. It allows new ways to connect, communicate and innovate, which are not controlled by any business or decision-maker. But with greater global uptake of the Internet, the growth of online services such as social media, and growing concern about emerging online harms, we need to talk about what openness means in this changing context. Our paper is a conversation starter, and we would love to hear your thoughts; you can send them to us at policy@internetnz.net.nz.
Looking ahead
We expect the domestic policy work on online harms will keep us busy for a long time yet. Coming up next year we will be:
- offering thoughts on the regulatory toolbox available to governments wishing to shape the actions or behaviours of online services
- presenting our position on duty of care
- thinking about the definition of “online service provider”. This term is used a lot at the moment but is not defined in our law
- engaging in the government work on media content regulation and hate speech
- pondering what an Internet for good might look like.
If you have views on any of these issues or ideas on how InternetNZ can better engage on them, we would love to hear from you.