Big data and privacy for students


Photo by Taylor Vick on Unsplash

This article takes a close look at big data and privacy: how they are connected, how they are changing, and what this might mean for students and schools.

Global data and information flows now have more impact on GDP than the global trade in goods, which is remarkable considering they were almost non-existent 15 years ago. Business can now be done globally, quickly and easily, from a laptop, and cross-border data flows are expected to increase roughly nine-fold over the next five years. People are trading, learning, growing networks and otherwise sharing information on a global scale. All of this highlights the trend of ‘Internetization’, which we have shared previously: while globalisation in its traditional sense (global goods trade, major capital flows) has slowed, huge multinational companies are being supplanted by millions of small and medium-sized ones, and that is where the growth is happening.

Opportunity abounds for those who have this information and understand how the world of work is changing. Are our schools and students having these conversations? Are our learning communities and parents aware of what’s happening? Do they understand how the use of data and the idea of opportunity are changing?


With the massive amount of surveillance and data gathering now going on in China, one might think that its citizens are less sensitive about what is being collected and how it’s being used. That, however, may not be true. Recent data leaks, both in China and internationally, have made people more aware of the need for data security, although there still appears to be some way to go before data protection becomes a real priority, particularly with regard to legal protection and access to personal data. One Chinese company that uses AI is finding it hard to expand and run its business model successfully outside China because of tighter data protection laws. We’ve talked about personal data security and protection before, and it’s something every human being needs to become aware of and conversant with. One to watch, we think.

Gathering student data and using it to inform decision-making about learning, resourcing and educational policy is already well established, and the practice is likely not only to continue but to expand massively. We have posted previously about wearable tech that tracks student attention spans and learning apps that follow and share learning, and schools in China have started using facial recognition technology – all of these generate huge amounts of data. So who is actually responsible for storing and protecting it?


This is a timely article from Saro Mohammed, Ph.D., who argues that stewardship of data is paramount, especially as data can be, and is, stored by multiple groups – schools, technology developers and governments, for example. Each group can have different criteria for the safe storage and use of data, which can be problematic, as the recent Cambridge Analytica scandal showed.

Saro Mohammed argues that while laws and ethical guidelines governing the use of data do exist, enforcement can be inconsistent, and we would not be wise to rely on these alone to protect student data. Instead, she argues for a community-minded approach in which all stakeholders, from parents and principals to policy makers, share a collective responsibility for protecting data – each needs to demand and see evidence that data is protected. If we don’t, the danger that others will use it for nefarious purposes only increases.

Continuing the student data thread, here’s a good example via the Hechinger Report of what can happen to student data if it’s not handled and stored properly. As we know, data about students collected by apps and other technologies used in schools can be stored in multiple locations under the authority of different stakeholders.

Despite laws intended to prevent the misuse of student data, the Fordham University report linked within the article (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3191436) identifies how data brokers can gain access to student data, and how their activities are not restricted by those laws. The report identifies 14 companies that market student data to commercial interests, with lists such as “Jewish Students in New York by Education Level,” “Rich Kids of America” and “The Awkward Years – High School Students” for sale. It’s unclear how accurate the information is, but it is clear that schools are not selling the data and that it is being gathered from various third parties. Particularly risky appear to be surveys that senior students complete with third-party companies promising to provide career pathways information.

As data-driven learning and decision-making become more embedded in our schools and education systems, institutions need to ensure that they have policies around what happens to the data that is collected, a clear understanding of which third parties control or have access to student data, and clarity about what is and is not allowed to happen to it.


Facial recognition technology is now in schools. It’s already being used to monitor students’ attention in China and to prevent school shootings in America – the software is even being offered to schools for free for that purpose.

The technology is capable of identifying or recognising a person from a digital image – facial recognition unlocks your phone when it ‘sees’ you, for example, and helps your phone organise its photos by recognising who appears in them. It is being used more and more by security systems (not just at airports anymore), and is slowly but surely making its way into our daily lives.
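To illustrate just how low the technical barrier has become, here is a minimal sketch using the open-source Python library face_recognition. The image file names are hypothetical, and this is an illustration of the general technique, not any vendor’s actual product: it compares the face in a known reference photo with the faces found in a new image.

```python
# A minimal sketch of face matching with the open-source `face_recognition`
# library (pip install face_recognition). The image file names below are
# hypothetical; this illustrates the general technique only.
import face_recognition

# Encode the face in a known reference photo (assumes one face is visible).
known_image = face_recognition.load_image_file("student_id_photo.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# Encode whatever faces appear in a new image (e.g. a frame from a camera).
new_image = face_recognition.load_image_file("classroom_frame.jpg")
new_encodings = face_recognition.face_encodings(new_image)

# Compare each detected face against the reference encoding.
for i, encoding in enumerate(new_encodings):
    match = face_recognition.compare_faces([known_encoding], encoding)[0]
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    print(f"Face {i}: match={match}, distance={distance:.2f}")
```

A handful of lines of freely available code is enough to match faces; the difficult questions are about consent, storage and who gets access to the results.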

The problem is that this technology is completely unregulated in America and many other countries. Facial recognition can be used for surveillance, the data and images collected can be shared and used without users’ consent, and ethical concerns are growing. Europe has passed laws that require those being recognised to give their explicit consent, and as the technology becomes more powerful, more voices are speaking out.

The voices are led by people who recognise just how advanced this technology is becoming: Microsoft president Bradford L. Smith has called for facial recognition regulations in the U.S., Google employees don’t want their tech being sold to defence companies, and Amazon employees don’t want their facial recognition software being used by U.S. border enforcement contractors.

The implications for schools are many. What happens to students’ privacy? Who owns and has access to the data that’s stored? How about the temptation to use the technology to monitor friendships and behaviour? Might there be any unintended consequences?


Privacy is a big theme in the conversation above, and it continues in this article, which looks at how phone companies are selling their users’ location data to companies that can track their phones in real time. These companies operate in a legal grey area, privacy laws are not keeping up, and anyone with a bit of money can pay to have any individual tracked down.

The data is sold in the first instance to ‘reputable’ companies (location aggregators) that use it for legal purposes on behalf of people such as bondsmen, car salesmen and others who need to find people as part of their jobs. However, these companies also on-sell the data to organisations about which far less is known, with the end result that anyone can easily find your phone without your knowledge or permission.

Our mobile phones are constantly communicating with cell phone towers, even when not in use, and this communication can be triangulated to provide a user’s real-time location (a simplified sketch of the maths involved appears after the extract below). In short, there is almost no regulation of this data ecosystem in the United States. An extract:

“Microbilt buys access to location data from an aggregator called Zumigo and then sells it to a dizzying number of sectors, including landlords to scope out potential renters; motor vehicle salesmen, and others who are conducting credit checks. Armed with just a phone number, Microbilt’s “Mobile Device Verify” product can return a target’s full name and address, geolocate a phone in an individual instance, or operate as a continuous tracking service.

“You can set up monitoring with control over the weeks, days and even hours that location on a device is checked as well as the start and end dates of monitoring,” a company brochure Motherboard found online reads.”
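To make the ‘triangulated’ claim above concrete, here is a minimal, hypothetical sketch of trilateration in Python: given made-up tower coordinates and estimated phone-to-tower distances, it solves for the phone’s position. Real carrier systems work from signal timing and strength across many towers, but the underlying geometry is roughly this simple.

```python
# A rough illustration (not any company's actual system) of estimating a
# phone's position from its distance to three cell towers ("trilateration").
# Tower positions and distances below are made up for the example.
import numpy as np

def trilaterate(towers, distances):
    """Estimate (x, y) from three or more tower positions and distances."""
    towers = np.asarray(towers, dtype=float)
    d = np.asarray(distances, dtype=float)
    x1, y1 = towers[0]
    d1 = d[0]
    # Subtracting the first circle equation from the others gives a linear
    # system A @ [x, y] = b, solved here by least squares.
    A = 2 * (towers[1:] - towers[0])
    b = (towers[1:, 0] ** 2 - x1 ** 2
         + towers[1:, 1] ** 2 - y1 ** 2
         - d[1:] ** 2 + d1 ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Three hypothetical towers (coordinates in metres) and the phone's
# estimated distance to each; the phone is actually near (1000, 1000).
towers = [(0, 0), (2000, 0), (0, 2000)]
distances = [1414.2, 1414.2, 1414.2]
print(trilaterate(towers, distances))  # approximately [1000. 1000.]
```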

The implications are serious and worthy of discussion: predators can know when someone is home alone, when houses are empty, and when workers in sensitive industries leave the office. Some phone companies claim to have terminated their agreements with location aggregators, but are now coming to new arrangements in which data is still shared. Is regulation needed?


More privacy provocations, and one that our teenagers need to grapple with – for their own sakes. Guess what the most heavily funded AI startup in the world is right now? The Chinese facial recognition company Sensetime. It’s working on technology that sources 100,000 real-time video and data streams to keep track of individuals using CCTV, facial recognition technology, mobile and Internet data, apps, location services – everything. Nervous? We should be. Privacy? There won’t be any. Have a look at Sensetime’s strategic partners, and take a good read through their website. Follow the money – this could be our future if we’re not careful, and we need to engage our young people in this discussion.

The research conducted and insights gained during the writing of this article have inspired the Indigo Schools Framework, the details of which can be found in the Primer on our Resources page. Send us an email at info@indigoschools.net or complete the form below if you’d like to learn more about how the Indigo Schools Framework can be successfully applied within your school. Also be sure to follow us on Facebook and LinkedIn for our latest updates.

Interested in transforming your school? Let’s start a conversation.
