Marginalised and at-risk groups
InternetNZ acknowledges there are communities that are specifically targeted by online harm. Recently, there has been a notable increase in online abuse of particular communities, such as LGBTTQIA+, Muslim, Pasifika, Asian communities, women, and others. InternetNZ supports these communities having their whakaaro heard during the process of public engagement with the DIA’s proposal to regulate online media.
Discussion of marginalised groups in the SOSMP
The proposal's core focus is regulating content that is harmful to children. The document does acknowledge a Netsafe study on Asian, Māori, and Pasifika experiences of online hate speech (p. 17).
The document gives three examples of what the proposed regulation may look like in practice. Two relate to content harmful to children (content promoting dangerous disordered eating, and adult content in video games). The third discusses what may occur under the new regulations when women are harmed by violent misogynistic content (p. 75).
The Safer Online Services and Social Media discussion document does not mention the LGBTTQIA+ or Muslim communities, and there is no specifically defined role for LGBTTQIA+, Muslim, Asian, Pasifika, or other minority communities in either the development of the code of practice or the governance of the regulator. However, several sections commit the code of practice and the regulator to protecting vulnerable groups from harmful content (p. 19). There is also mention of both industry and the regulator needing to engage with ‘civil society’ groups, but those groups are not defined (p. 40).
What could change with the implementation of a new regulator?
The discussion document proposes no changes to the current legal definitions of illegal content, including the definition of ‘objectionable’ material. One of the regulator's objectives will be to ensure ‘freedom from discrimination’ (p. 9). The regulator would have no power to moderate or require platforms to remove legal content (p. 7).
Discrimination based on a person’s sexuality, gender, ethnicity, or religion may not be removable, because the current ‘takedown powers’ (the DIA’s ability to remove illegal content online) apply only to ‘objectionable material’. The current legal definition of ‘objectionable material’ excludes hatred and discrimination.
There is, however, a proposal to expand ‘takedown powers’ to other definitions of illegal content. For example, online death threats and harassment are included in the Harmful Digital Communications Act (p. 55). Should other laws be added to the ‘takedown powers’ of the DIA, the Chief Censor, or the regulator, then harassment motivated by a person’s sexuality, religious beliefs, or similar grounds would have to be removed by regulated platforms.
InternetNZ’s commitment to supporting marginalised communities
Decisions about these new regulations should be shaped by listening to the communities most impacted by harmful content online, such as Māori, Pasifika, Muslim, Asian, and LGBTTQIA+ communities. If you’d like to make a submission to the Department of Internal Affairs responding to the SOSMP discussion document, check out our page, ‘Make a submission’.