Children's rights to privacy in the era of Big Data
Ciara Arnot, Senior Community Advisor
Dr Caroline Keen is the Founder of Sociodigital Research. She is a member of the NZ Privacy Foundation Working Group for Children's Privacy, and a recipient of research grants through our community funding.
Dr Keen's 2019 research revealed that parents generally believed that institutions and companies collected a minimal amount of data about them, and that there were regulations to prevent the collection of sensitive data from their children without parental consent. They were unaware of the risks that commercial software services and in particular EdTech, now widely used within New Zealand schools, pose to children's privacy.
She sat down for a kōrero with Senior Community Advisor/Kaitohutohu Matua ā-Hāpori Ciara Arnot.
Qu: In a recent RNZ article, you discussed EdTech and children's data privacy. Can you start by explaining what you mean by "EdTech"?
Dr Caroline Keen:
EdTech, short for educational technology, refers to the hardware and software now used within schools to manage student data, as well as other software commonly used to enhance teaching, learning and assessment in classrooms. During the Covid-19 pandemic, the Government embraced EdTech, along with many other commercial software services, to facilitate remote teaching and learning. This uptake of EdTech software and additional commercial communication platforms has established digital technologies as a new norm within education.
But, this opens up all sorts of privacy risks.
The Privacy Foundation's recent report suggests that the authorities have not yet addressed the risks to children’s data privacy, and don’t appear to be taking these risks seriously.
Qu: Tell us more about these risks.
Dr Caroline Keen:
Many of the administrative, managerial, educational, measurement and after-hours activities of schools now use external software technologies and services, so student data is collected, managed, created, and shared in ways that parents and students may not be privy to. Apart from the growing amount of information now collected by schools, the use of external commercial software providers exposes children to another layer of data privacy risk. Here, it is almost impossible to fully ascertain what is collected from students, due to a lack of transparency from EdTech commercial software providers. We know that the data collected from students by EdTech companies and other commercial services in schools far exceeds any educational purpose, and that both schools and parents remain largely unaware of its commercial use. For instance, the data collected can be pieced together into a profile of a person's learning challenges, emotional and mental wellbeing, and behaviour, along with sensitive data such as religion, political views, personal interests, activities, and social networks. These profiles can then be repurposed by third parties beyond the educational environment.
If we don’t address this issue now, by the time today's children leave school, companies may have accumulated 13 years of personal data from their educational and developmental journey. Our chief concern is that these very detailed and sensitive profiles may follow them into adulthood, and may then have discriminatory impacts on their access to health, education, wealth, and wellbeing – without their knowledge or ability to dispute negative outcomes.
Qu: In 2019, you spoke with parents and children for your research. What did you find out?
Dr Caroline Keen:
Both parents and their kids want to control who can access their personal and sensitive information. They are motivated.
But their idea of privacy risk when using digital services centred on ensuring children's physical safety and preventing immediate reputational harms amongst their peers and community. Their focus was almost entirely on interpersonal communications, and both parents and teenagers were surprisingly adept at using privacy settings on social media to control who had access to their personal and family information.
It is important to note that when you are aware of the risks, you can try to manage those, and parents and teenagers managed interpersonal privacy risks quite well.
My research revealed that neither parents nor children were fully aware of the longer-term risks of children's data being collected, sold and reused once those children are adults.
Most people expect to be able to apply for a job or for university without having to disclose sensitive and personal information about their political views, religion, sexuality, upbringing, educational challenges, socioeconomic status, or lifestyle choices. However, the details now collected, generated and inferred by commercial digital services, including EdTech applications, can produce profiles from which those companies and other third parties then profit.
Despite this, my research found that parents and children didn't think companies were interested in collecting sensitive information about them, and if companies did collect their personal information, they could not visualise how this might harm them in the longer term. This is because they conceptualised privacy risk in terms of the potential for immediate harms to their reputation or physical safety. In fact, many parents believed regulation was in place to protect children’s personal information from commercial interests.
Qu: What did you find in terms of information shared with schools?
Dr Caroline Keen:
The study found that parents and children shared a minimum of information with schools on a ‘need to know’ basis. So, they expected to share things like their family contact information, payment information, grades, absences, and any special health and educational needs with schools. They believed that schools kept these student records secure and confidential, and certainly did not share this without their consent.
Parents and students were not aware of the amount of data that is now recorded and collected by schools, and they definitely were not aware that data is often collected by the external commercial companies providing EdTech services to those schools.
Students were also concerned about their sensitive information being made public, and felt they had little power when their privacy expectations were breached.
Another important finding was that many parents did not want negative behavioural data about their children to be permanently recorded or shared with other schools or teachers as they were concerned this would prejudice future teacher-student relations.
Most were unaware that some EdTech used by schools may be collecting behavioural data about their children. While many services collect behavioural metadata without their knowledge, there are even EdApps, ClassDojo for instance, that encourage teachers to record micro-behavioural data about students. These systems are likely to over-represent more vulnerable children, and we know that some parents have concerns about them. Whether parents or students have consented to such information being collected, have access to it, or are able to contest it, are questions we don't yet have answers to.
To sum it up, their perceptions and beliefs are not in sync with the scale and nature of personal data collected from children within education today, nor with the fact that personal data may be used for commercial purposes unrelated to the child's education.
Qu: What sort of long-term issues could this create for families?
Dr Caroline Keen:
Aside from children being subject to targeted advertising and the harms this can cause, there are concerns about the potential for discrimination later in life. Commercial data processing, repurposing and resale of children's personal data could result in some being excluded from life opportunities. For example, future employers, universities, banks, or insurers may decline their applications based on commercial data profiles, and the individual concerned will not have the ability to dispute such decisions.
The parents I spoke to in my research had not considered the risks that commercial behavioural and predictive profiling may pose for their children once they become adults. Nor were they aware of institutional monitoring of student behaviour through some EdTech, which can result in more sensitive data being recorded about our more vulnerable students.
Qu: You mentioned the assumption that regulation exists to protect children's data, is that not the case?
Dr Caroline Keen:
No. In the broadest sense, the current New Zealand Privacy Act does not acknowledge children's vulnerability to commercial data mining, whether within or outside education. While the Government's focus has been on developing ethical frameworks to inform its own collection and handling of personal data, it has not addressed the commercial exploitation of children's data arising from their use of digital hardware and software services, either as consumers or within education.
So there is a lack of regulation of the EdTech industry. As the investigations of the NZ Privacy Foundation working group for children's privacy show, there has been no investigation by our Government into the privacy impacts of EdTech and other commercial services used within schools.
Schools are given the responsibility but no guidance on how to audit or manage the privacy risks stemming from EdTech software. Many, if not most, New Zealand schools have not addressed the commercial privacy risks of EdTech.
Qu: What kind of solutions do you see?
Dr Caroline Keen:
New Zealand has insufficient regulation to protect students’ data privacy. Our laws do not yet fully reflect children’s digital rights, their rights to privacy, and to be able to learn in an environment free from commercial exploitation. There is much to be learned from emerging laws and regulation overseas that address commercial processing of children’s data. However, student privacy has been largely overlooked during the pandemic as governments and schools embraced EdTech and other software services to facilitate remote learning. Our lack of data protection regulation requires that we ask some critical questions about the privacy risks to students, their privacy rights, and what the role and responsibilities of the government and the EdTech industry should be in addressing these problems.
Ideally, New Zealand children should have the right to learn in an environment free from corporate surveillance and its long-term consequences, and not be forced into digital ecosystems that track them throughout their education and developmental years.
At the very least, we need to ensure children are not subject to corporate surveillance throughout their education. We need a child-specific data protection framework and to apply this to not just EdTech but all services that children are likely to access at school and at home.
There are many other issues to consider, but perhaps most urgent is that we need the EdTech industry to be transparent about their data collection practices so that resources might be developed to help government, schools, parents, and students understand and negotiate better safeguards for students’ personal data privacy.
Qu: Ngā mihi nui Caroline, it's always great to kōrero with you. What can you point to for people who want to dig into this further?
Dr Caroline Keen:
I've been out and about a lot lately, so if you check out my website, you will find a recent interview I did on TVNZ1 Breakfast, and a couple of news articles from RNZ. I will be adding more resources and writing articles that I will post on my blog.
They can check out the issues paper from the NZ Privacy Foundation too.
For organisations and schools interested in addressing these issues, I am able to provide expertise as a consultant. I am also very keen to continue my mahi through further research and consultation within schools.
Dr Caroline Keen
Sociodigital Research Limited
Hello@sociodigitalresearch.net
Cell: 027 2758585