InternetNZ is in the process of crafting our submission to the DIA’s discussion document, Safer Online Services and Media Platforms. After kōrero with the community and stakeholders, we are forming opinions on how this proposal could best impact Aotearoa. This page does not include our entire whakaaro but is a snapshot of some key positions we hold.
What we liked
Transparency reporting from platforms
More transparency from ‘regulated platforms’ (platforms that host content, such as social media) about how they manage their algorithms and how those algorithms amplify content could curb the mass proliferation of harmful content on larger platforms. There is currently a lack of transparency around how much large platforms push engagement with harmful content. For the public to build trust that platforms are not using algorithms or AI-based tools to spread harmful content, transparent and regular reporting on this particular issue is needed.
The potential for a prescriptive approach to regulation
The Safer Online Services and Media Platforms document asked for feedback on a supportive approach (focused on collaboration and partnership with industry) or a prescriptive approach (more directive, with stronger powers for the independent regulator). InternetNZ supports a prescriptive approach to this regulation instead of a supportive one. A supportive approach has been shown not to work in other countries, as platforms do not have enough incentive to implement policies that minimise harm. An approach that gives platforms more latitude to regulate themselves has proven ineffective, both abroad and here in Aotearoa, as evidenced by the Code of Practice for Online Safety and Harms. A prescriptive approach will give the independent regulator more power to ensure compliance. Jurisdictions that use a prescriptive approach include the European Union.
Expansion of takedown powers to material illegal under other New Zealand regimes
It is InternetNZ’s opinion that there is harmful content that does not, but should, meet the threshold to be classified as ‘objectionable’. We believe that if takedown powers were expanded to cover content illegal under other laws, such as the incitement provisions of the Human Rights Act, then the communities most affected by harmful content would be better protected. We don’t think all illegal content should be in this basket, however; some (such as copyright-infringing material) does not rise to the level necessitating this remedy.
Limited engagement so far
InternetNZ understands that for some, the submission period will not be long enough for adequate engagement. The proposals in the SOSMP document will have a huge impact on New Zealanders, and it is vital that communities are given the opportunity to have their say. InternetNZ would support the DIA either extending the current submission period or releasing a second discussion document with another, more targeted engagement for the groups most affected by harmful content.
InternetNZ has heard from Māori stakeholders that there is an expectation of co-design and co-governance in this process. InternetNZ supports in-depth engagement and co-design with Māori on any further development (e.g. any future workshops or planning), a minimum of 50% Māori representation on the board discussed in the discussion document, and a resourced advisory body (that includes Māori) as part of the proposed new regulatory structure.
Exclusion of most affected communities
InternetNZ is adamant that the communities most affected by harmful content must be actively involved in every aspect of this proposal's development, creation, implementation and maintenance. For example, the SOSMP document makes no mention of the LGBTTQIA+ community, despite clear evidence that trans people face increasing online harm both in Aotearoa and internationally. If these communities' involvement is not explicitly written into the legislation, there is no guarantee that their voices will be heard.
Structure of the regulator
InternetNZ is concerned that the proposed structure of the regulator will not include the voices of the communities most affected by harmful content, nor the diverse range of technical expertise and lived experience needed to give the regulator context for its mahi and decisions.
We would like to see a structure that includes embedded and resourced input from the communities most affected by harmful content, alongside legal, technical, and subject matter experts. We would also like to see a separate recourse entity to objectively assess the Regulator's decisions.