Where Intellectual Property Fails, Privacy Sails: Asake and the Policewoman’s Cry for “Epp”

By Olumide Babalola

Lawyers and researchers continue to decry the disruption of traditionally settled intellectual property (IP) principles by the characteristically unpredictable growth of AI and technology. From time immemorial, copyright in literary works, artistic works and sound recordings has been attributable to the author – a term with a clear definition generally recognising the ‘creator’ of a work – but AI has come to blur such lines.

While stakeholders are still struggling with the uncertainty of IP claims over AI-generated content, social media has introduced another twist: sound recordings are randomly or deliberately picked up by musical artistes who casually add such clips to their songs without recourse to the persons featured in them. Here, Asake’s song ‘Peace Be Unto You’ comes to mind. In the song, Asake sampled the voice of a distraught policewoman (Helen Utabo) who cried for help in a video while a motorist attempted to ferry her away.

From an IP perspective, Raphael Irenen argues that Utabo is not entitled to royalties since she does not qualify as an ‘author’ under section 28 of the Nigeria Copyright Act. Conversely, Stanley Alieke argues – without referencing any provision of law – that the woman is entitled to royalties on the ground that she is a ‘co-owner’ whose consent was not sought (see ‘The Woman In The Help Me, Help Me Vocals Is Entitled To Royalties From Asake’ <https://www.tekedia.com/the-woman-in-the-help-me-help-me-vocals-is-entitled-to-royalties-from-asake/>).

However, that is not why I am here! Without agreeing or joining issues with the commentators on the IP claims, I choose to identify privacy or data protection as a ‘less argumentative’ claim Utabo can make to her distinct voice in the sound recording.

“Epp me, epp me” – personal data?

I approach this with caution, bearing in mind, first, the controversial elasticity ascribed to the term ‘personal data’ in the respective data protection legislation and the often-unobserved qualifying adverbs in such definitions. The stretchy scope of personal data is made worse still by the unpredictability of technology, as some researchers have observed: “Even though the definition of personal information might seem straightforward, recent rapid advancements in technology have made it surprisingly difficult to characterise its scope.” (See Rahime Saglam et al, ‘Personal information: Perceptions, types and evolution’ (2020) 66 Journal of Information Security and Applications). Decrying the universally unclosed categories of information that could constitute personal data, Raphaël Gellert – an assistant professor of ICT at Radboud University, Netherlands – notes that “The EU debate—both at policy and academic level—around the notion of personal data has been a steadily ongoing one” (see Raphaël Gellert, ‘Personal data’s ever-expanding scope in smart environments and possible path(s) for regulating emerging digital technologies’ (2021) 11(2) International Data Privacy Law 196–208).

From a Nigerian perspective, Omotubora (alongside Basu) expresses her concerns about the expansive outlook of personal data thus: “This interpretation of personal data is inherently “expansionist” and regarded as fundamental to the protection of individuals” (see A. Omotubora and S. Basu, ‘Next Generation Privacy’ (2020) Information & Communications Technology Law, ISSN 1360-0834).

Regardless of the existing academic paroxysms against the amoebic definition legislatively given to personal data, the extant statutory concept protects any information that indirectly identifies a person, including distorted information. On this potentially problematic definition, once a piece of information can be remotely or closely linked to a person, it is his/her personal data.

Notwithstanding Auto-Tune or any other AI-enabled voice-tweaking tool, the policewoman’s now famous “Epp me, epp me, he dey carry me go where I no know” identifies her, or is at least linked to her; hence it constitutes her personal data under the Nigeria Data Protection Act, 2023 (NDPA). Elsewhere, some German researchers argue that: “In addition to the linguistic content of speech, a speaker’s voice characteristics and manner of expression may implicitly contain a rich array of personal information, including cues to a speaker’s biometric identity, personality, physical traits, geographical origin, emotions, level of intoxication and sleepiness, age, gender, and health condition. Even a person’s socioeconomic status can be reflected in certain speech patterns.” (See Jacob Leon Kröger et al, ‘Privacy Implications of Voice and Speech Analysis – Information Disclosure by Inference’ in Privacy and Identity Management. Data for Better Living: AI and Privacy (Privacy and Identity 2019) 242–258).

Her cry is on social media – still protected?

A ready-made moral defence instantly wielded by privacy violators is the publicly accessible nature of personal data, as if appearing on social media or in public completely erases a reasonable expectation of privacy. This should be approached from both privacy and data protection perspectives – a necessity that amplifies the distinction between the two closely related concepts. While some information proactively displayed on social media by data subjects may not enjoy a reasonable expectation of privacy, it is nevertheless protected by the relevant data protection legislation. As far as privacy is concerned, the Nigerian Court of Appeal confirmed Nigerians’ entitlement to a reasonable expectation of privacy in public in Nwali v EBSIEC (2014) LPELR–23682(CA) thus:

“The correct approach to the determination of the scope of the right of privacy vested by S. 37 of the 1999 Constitution is to first state the right as prescribed by the express provisions of the section without omitting any aspect of the right…Therefore, the privacy of his choice of that candidate and the privacy of his voting for that candidate constitute part of his “privacy” as a citizen… Therefore requiring or compelling him to vote openly in the public watch and knowledge by queuing in front of the poster carrying the portrait of the candidate he has decided to vote for intrudes into, interferes with, and invades the privacy of his said decision, choice and voting, completely removing that privacy, therefore amounting to a clear violation of his fundamental right to the privacy of a citizen guaranteed him and protected by S.37 of the 1999 Constitution.” (Emphasis mine)

The decision above speaks to the protection of privacy even when a person or his personal data is in the public domain. In the policewoman’s case, she was not the one who pushed the sound recording into the public space; hence she can claim protection from multiple angles and sources. As for data protection, many provisions of the NDPA – on the principles of data processing, the obligations of a controller, data subjects’ rights, proof of a lawful basis, etc. – bind anyone who processes personal data irrespective of where the data is harvested, found or received.

Voice-featured without authorisation or other lawful grounds

From whatever perspective this is viewed, the policewoman, or her (un)altered voice, was made to feature in Asake’s song reportedly without her consent. In data protection parlance, this activity qualifies as ‘processing’ by Asake and/or his record label as controllers or joint controllers. Under the current position of our data protection legal framework, such processing without any identifiable lawful basis constitutes an infraction of the policewoman’s data subject rights as well as of the artiste’s or record label’s obligations under the NDPA.

The woman’s voice was arguably collected from social media (a third-party platform), which activates her right to be informed as well as the artiste’s obligation to ensure that the processing was lawful on one of the grounds provided by law (consent, contract, public interest, vital interest, legitimate interest, etc.). None of these grounds can, however, be relied upon without the controller satisfying certain requirements – especially legitimate interest, which is often used as a shield by controllers to evade liability under the other grounds.


Unlike IP legislation, which requires a victim to satisfy onerous requirements to prove entitlement to statutory protection, for privacy or data protection it is much easier to prove that the victim is a living natural person, that the recording used identifies him/her, and that it was used without any lawful ground under the relevant data protection legislation. Once this is shown, the burden of proof shifts to the ‘user’, who must establish the existence of a lawful basis or exemption to avoid liability.
