The Nigerian aspect of Human Rights Watch’s report on the surveillance of children by 164 EdTech products

By : Chukwuyere Ebere Izuogu

On 25 May 2022, Human Rights Watch (HRW) released a report (the report) in which it conducted both a technical and policy analysis to show that 164 EdTech products used in 49 countries, including Nigeria, appeared to engage in data collection practices that risked or infringed on children’s rights. These EdTech products were endorsed by the governments of these 49 countries for children’s learning during the Covid-19 school closures in 2020. According to HRW, these EdTech products monitored or had the capacity to monitor children, in most cases secretly and without the consent of children or their parents, in many cases harvesting personal data such as who they are, where they are, what they do in the classroom, who their family and friends are, and what kind of device their families could afford for them to use.

This article provides an insight into the Nigerian dimension of the report, particularly with regard to the data protection analysis of these EdTech products and how their data collection practices may have infringed specific provisions of the Nigeria Data Protection Regulation 2019 (NDPR).

How these EdTech products trigger the application of the NDPR

The NDPR is triggered anytime personal data is processed. Art. 1.3 xix of the NDPR defines personal data as ‘any information relating to an identified or identifiable natural person (“data subject”)’. When a child uses an EdTech product, that child is tagged with a string of alphanumeric characters that acts as an identifier for the child. Through this identifier, both EdTech and Advertising Technology (AdTech) companies are able to collect data about how the child uses the EdTech product. AdTech companies also use this identifier to monitor the activities of the child while online to gain an insight into the child’s interests and characteristics. These insights can then be sold or shared with data brokers or other advertisers who may wish to target the child with personalised ads.

Under the NDPR, personal data includes unique identifiers such as a MAC address, IP address, IMEI number and IMSI. Thus, every time this identifier is used to monitor the child (a natural person) while the child uses the EdTech product or is otherwise online, the NDPR is triggered. In other words, the use of an identifier to monitor the activities of a child online constitutes (data) processing within the meaning of art. 1.3 xxi of the NDPR.

Ground rules for the lawful processing of personal data in Nigeria

As an initial matter, for the processing of personal data to be lawful and thus not prohibited in Nigeria, the person processing the data (or in charge of the data processing operation) needs to identify at least one of the legal bases provided for in the NDPR. In the context of the report, consent given by either the child or the parent presumably formed the legal basis for the data processing operation involving the use of these identifiers. In addition, every data processing operation must comply with all the principles governing the processing of personal data. These principles serve as the conditions that must be complied with every time personal data is processed and unlike the legal bases, they apply cumulatively. Some of these principles are stated in art. 2.1 (1) of the NDPR and are that personal data must be: collected and processed in accordance with specific, legitimate and lawful purpose consented to by the data subject; adequate, accurate and without prejudice to the dignity of human person; stored only for the period within which it is reasonably needed; and secured against all foreseeable hazards and breaches.

Specific data collection practices of some EdTech products that may trigger an infringement of the NDPR

  • Invalid consent (lack of choice)

According to the report, some of the EdTech products observed did not present users with the choice of declining to be tracked. Under art. 1.3 iii of the NDPR, one of the elements of a valid consent is that it must be ‘freely given’, that is the data subject must be able to exercise a real choice. Satisfying this element requires users to be presented with a genuine choice with regard to accepting the terms offered for using the EdTech product or declining them without suffering any detriment. However, in this case, users were unable to decline being tracked which fundamentally invalidated any consent that they may have given as the basis for using the EdTech product. This is a potential violation of the governing principles of data processing (art. 2.1 (1) a)) and lawful processing (art. 2.2 a)) under the NDPR.

The inability of users to make a choice is also seen in the report where persistent identifiers are used by EdTech products to access the Wi-Fi Media Access Control (MAC) address or the International Mobile Equipment Identity (IMEI) on children’s devices. These identifiers are persistent because they cannot be changed even by wiping the device clean with a factory reset. This is also a potential violation of art. 2.1 (1) a) and art. 2.2 a) of the NDPR. The EdTech products involved in these two scenarios in Nigeria are Edmodo and YouTube.

  • Invalid consent (lack of information) and lack of transparency

Another element of a validly given consent in accordance with art. 1.3 iii of the NDPR is that it must be informed, that is, adequate information must be provided to the data subject prior to data collection to enable them to make an informed decision on whether or not to consent to the data collection. Good practice requires that such information include the type of data collected and why it is collected. In addition, art. 2.5 b) of the NDPR requires the data controller and/or processor (the EdTech and AdTech companies in this case) to disclose in their privacy policy the type of data collected. Lastly, art. 3.1 (1) of the NDPR grants the data subject (especially a child) the right to relevant information relating to the data collection in a concise, transparent, intelligible and easily accessible form, using clear and plain language.

In analysing the EdTech products, HRW observed that some granted themselves the ability to collect precise location data, or GPS coordinates that can identify a child’s exact location to within 4.9 meters. Some of them also had the ability to collect the time of the device’s current location, as well as the last known location of the device, thereby revealing exactly where a child is, where they were before that, and how long they stayed at each place. Some of these EdTech products did not disclose this fact in their privacy policy. This failure to disclose the collection of location data potentially infringes arts. 1.3 iii, 2.5 b) and 3.1 (1) of the NDPR. The EdTech product implicated in this activity in Nigeria is Telegram.

  • Lack of data minimisation

One of the principles required for processing personal data under art. 2.1 (1) b) of the NDPR is that the personal data processed is adequate. This means that the data controller must collect and process only personal data that is relevant, necessary and adequate to accomplish the purpose pursued by the processing operation. This requirement is even more restrictive when taking into consideration the overall best interest of a child.

Given that, as explained above, Telegram collected near-precise location data of users, it is hard to explain how this type of personal data benefits the educational requirements of children, which is the primary purpose for using the EdTech product. Nor is it apparent why precise location data would be processed for the purpose of serving personalised ads to a child. In another case, EdTech products were observed to have the ability to collect information about their users’ friends, family, and other acquaintances by accessing the contacts list saved on users’ phones. The information collected is used for shadow profiling, that is, to develop profiles on people who have never used the EdTech products. The collection of information relating to third parties through shadow profiling is unnecessary and disproportionate having regard to the original purpose pursued by the processing activity of the EdTech product.

The failure to apply data minimisation in these scenarios is a potential infringement of art. 2.1 (1) of the NDPR. The EdTech product implicated in Nigeria in collecting near-precise location data is Telegram, while those implicated in shadow profiling are YouTube and Edmodo.

Conclusion

The foregoing highlights a few examples where these EdTech products may have infringed the NDPR. While this may call into question the data collection practices of the EdTech companies, it is important to note that in the report, some of them either denied collecting children’s data or clearly stated that their products were never intended for children’s use. Nonetheless, the report highlights the growing need for an enhanced level of protection for children when their personal data is processed online for the purposes of personalised ads. This is because children can easily be influenced by these ads and are less aware of the risks involved. In addition, regulatory guidance from other jurisdictions discourages the processing of children’s data for profiling because of a seeming lack of comprehension by the child, which in turn renders the processing operation unlawful under the applicable data protection rules.

This same standard should be encouraged by stakeholders in Nigeria, where there has been a significant adoption of EdTech products by children since the Covid-19 pandemic. In this regard, a child’s use of an EdTech product should never be conditioned on collecting more information than is reasonably required by the service. As a matter of fact, processing operations involving children’s data should be guided by the following considerations: serve the overall best interest of the child; be age appropriate; be transparent in how data is collected; collect the minimum amount of data; utilise parental controls; provide innovative tools that enable the child to exercise data subject access rights; regularly conduct data protection impact assessments; and discourage the profiling and geolocation of children. On a positive note, the draft National Strategy on Child Online Protection (NSCOP) policy document recently issued for stakeholders’ comments by the Federal Ministry of Communications and Digital Economy is commendable, as it recognises the need for stakeholders and policy-makers to work together to ensure that the privacy of children becomes an achievable reality.

Chukwuyere LL.M (Hannover) CIPP/E, is the head of the Telecoms, Media and Technology (TMT) practice at Streamsowers & Köhn and a Senior Research Fellow at the African Academic Network on Internet Policy.
