University of New South Wales Law Journal Student Series

Separovic, Joss --- "Does Instagram Own My DNA?" [2022] UNSWLawJlStuS 23; (2022) UNSWLJ Student Series No 22-23


DOES INSTAGRAM OWN MY DNA?

JOSS SEPAROVIC

A NOTE FROM THE AUTHOR

We, as an emerging Information Society, are currently nestled, perhaps naively, between two colliding spheres. The first is a world where machine use is optional, something voluntary, an action performed by choice; let us dub this the world of yesterday. The second sphere is the world of tomorrow, where all human interactions generate ‘data’ and are governed, documented and archived by machines and their owners. This essay will argue that biometric data represents the turning point from yesterday to tomorrow. In the words that follow, and from the vantage point of the world of today, the writer will outline why biometric data is an important issue and make specific recommendations with respect to its use and regulation; it is up to us to ensure the peace and security of yesterday is not completely forgotten for the commerce of tomorrow.

I INTRODUCTION

This essay argues for the need for a global approach to the regulation of biometric data and calls for a shift in thinking about data generally. The essay strives to communicate two fundamental points: 1) ownership of biometric data should inalienably rest with the individual whose body was scanned to create that data; and 2) the biometrics of children require prompt and further consideration by the law.

These points permeate the paper’s structure, which is: firstly, an outline of the importance of biometric data and why traditional privacy concerns are inadequate; secondly, an exploration of emerging biometric developments in the US and how legal ownership could be improved; thirdly, an analysis of children’s relationship with data, including children with disabilities; and finally, specific recommendations regarding data protection.

But why is this important? Internet use is bordering on compulsory in our society; it is required for most jobs, it practically governs our social relationships, and even the writer of this paper is furiously web-searching for resources at this moment. When a human uses the internet, they inevitably leave a digital footprint and create data. From this perspective, data can be viewed as a natural by-product of being in the world – it is inescapable. As we will see, all around the world laws concerning biometrics are being generated in slightly different ways, creating a multiplicity of inconsistencies where uniformity is crucial. The unfortunate commonality among these laws is an absence of clarity with respect to who owns biometric data and in what capacity. Accordingly, the need for action with respect to children’s data is immediate. This is because unless consistent action is taken, a generation gap could be exploited, allowing for the capture of an entire generation of children’s data (if not several generations) for commercial use. If society is comfortable with this, then we must ensure adequate controls are in place to ensure the ethical treatment of captured data.

II THE LANDSCAPE OF TODAY – WHY OWNERSHIP IS WARRANTED

Traditionally, data has been viewed as a privacy issue, with nations enacting legal instruments protecting the digital privacy of their populations. While this is a good start, it ignores this simple fact: “data is now the world’s most valuable asset”[1] and, despite this, the creators of data (you and I) have little visibility over its use. Biometric systems can be defined as “a variety of technologies in which unique attributes of people are used for identification and authentication”.[2] These systems enable the capture of biometric data; that is, “unique behavioural or physiological attributes”[3] taken from individuals. Biometric data is therefore distinct from other forms of data in that it shares an indivisible relationship with the human body.
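
To make the identification/authentication distinction in this definition concrete, the following sketch (the writer’s illustration only; the toy feature vectors, threshold and function names are assumptions, not drawn from any cited system) contrasts a 1:1 authentication check against a 1:N identification search:

```python
import math

def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two toy biometric feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

THRESHOLD = 0.95  # assumed match threshold for illustration

def authenticate(probe: list[float], enrolled: list[float]) -> bool:
    """1:1 authentication: does the probe match the claimed identity's template?"""
    return similarity(probe, enrolled) >= THRESHOLD

def identify(probe: list[float], gallery: dict[str, list[float]]) -> str | None:
    """1:N identification: the best-matching identity in the gallery, if any."""
    if not gallery:
        return None
    best = max(gallery, key=lambda name: similarity(probe, gallery[name]))
    return best if similarity(probe, gallery[best]) >= THRESHOLD else None
```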

Despite this unique relationship, current conceptions of how this data should be characterised tend to revolve around traditional conceptions of privacy as opposed to ownership. This is because biometric data is still viewed as an extension of personal data, which is routinely utilised by organisations for authentication and identification. Privacy laws, then, typically map out what an individual can do with respect to their captured data while not considering it an asset owned privately by the person it was taken from. For example, in Australia, under the Privacy Act 1988[4], biometric data is classified as sensitive information, which is itself a subset of personal information; that is, it is not considered to be an asset or commodity belonging to the individual it was taken from, but shared information used to facilitate commercial or other professional transactions.

From a global perspective, we can see privacy itself is being forced to encompass an area so far beyond its initial scope that it is at breaking point. For example, Article 12 of the Universal Declaration of Human Rights (UDHR) addresses the concept of privacy directly:

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.[5]

Reviewing this language, it is clear the collation and commodification of biometric data was not a consideration when privacy became a human right. It is entirely possible, for example, that an individual’s biometric information could be obtained illegally and used for commercial purposes without that individual ever knowing it had occurred. This would not constitute interference or an attack on that person’s honour or reputation in any way. Biometric data has created a new problem for humanity and, therefore, a new approach is required to handle it.

Even the European General Data Protection Regulation (GDPR), which is perhaps the best example of an attempt to empower individuals with respect to data, does not explore a concept of ownership in detail, but instead maps out the relationships between those collecting the data and those surrendering it. Article 5 of the GDPR clearly outlines that data controllers (those who capture and use data) “shall be responsible for, and be able to demonstrate compliance”[6] with respect to data treatment. Reviewing this language, we can see data controllers still being thought of as owning valuable assets requiring care and maintenance. As our principal area of concern here is ownership, neither the GDPR nor the UDHR will be a primary focus for this paper.

If data can be construed as personal property, it would be absurd for individuals not to be the legal owners of their own biometrics. While corporations have been at liberty to convert other forms of personal data into profit for decades, an economy of biometrics is the digital equivalent of organ harvesting. It is therefore necessary to explore biometric data as both a privacy and an ownership problem.

III TRADITIONAL PRIVACY CONCERNS

There is clear public concern regarding the security of biometric data. This concern has been validated through real-life incidents such as the Biostar 2 event. In Biostar 2, the security company (ironically!) Suprema owned a database containing biometric data, including the fingerprints of over 1 million people. From 5 to 13 August 2019, the database was found to be publicly accessible, unprotected and mostly unencrypted after it was integrated into another system.[7] The leak mostly impacted people in the UK, but drew global attention to the dangers of corporations failing to adequately protect data. Because the bulk of the data was consistently being used to ensure company employees were appropriately identified and authenticated, Suprema still satisfied components of Article 5 of the GDPR, as the data was collected for a “legitimate purpose” and was not being kept for “longer than is necessary”.[8]

The ramifications that could stem from the Biostar 2 event are uncertain. While there have not been reports of the data being captured by third parties, the potential remains real. As biometric data is an electronic copy of a person’s unique attributes, once it is compromised, it cannot be changed like a password or destroyed like a credit card; it remains permanently connected to the person it was sourced from, with “counterfeiting and identity fraud [being] the most evident risks”.[9]

Another privacy concern is that technologies such as facial recognition could easily be used to identify individuals without their knowledge or consent.[10] This concern can be extended from corporations to law enforcement and, in Australia, that is exactly what happened. Clearview AI, which proudly presents itself as “the world’s largest facial network”,[11] provides facial recognition services to its law enforcement agency clients. The facial network was built by “collecting several billion publicly available images from the web, including from social media sites such as Facebook and YouTube.”[12] In 2019, Clearview AI offered a free trial to the Australian Federal Police (AFP) as well as several other Australian police agencies.[13]

The Australian Information Commissioner initiated investigations into both Clearview AI and the AFP. The first investigation determined Clearview AI failed to comply with Australian Privacy Principle (APP) 1.2 in Schedule 1 of the Privacy Act 1988. Fundamentally, Clearview AI’s approach to the collection of sensitive information was found to be unlawful as there was no provision of consent[14], the personal information was not collected by lawful or fair means[15], no reasonable steps were taken to notify the individuals whose data was collected[16] and no reasonable steps were taken to ensure the data was accurate, up-to-date, complete or relevant[17].

The investigation into the AFP[18] determined the law enforcement agency had interfered with the privacy of the persons whose personal information it shared with Clearview AI by failing to conduct a Privacy Impact Assessment for a high-risk privacy project[19] and failing to take reasonable steps to implement practices, procedures and systems relating to the entity’s functions or activities[20]. While it was determined that the AFP acted inappropriately, in the absence of any available guidelines with respect to the treatment of biometric data, perhaps any course of action taken by law enforcement would have been treated similarly.

Both Clearview AI and the AFP were ordered not to repeat or continue the acts and practices they engaged in, pursuant to s 52(1A) of the Privacy Act 1988. This involved Clearview AI ceasing the collection, and destroying the images already collected, of individuals from Australia. However, given Clearview AI is a US-based company and its database was gleaned from publicly available websites, it seems unlikely that the organisation could ever truly be prevented from collecting similar images in the future. Ultimately, the victory could prove illusory as it is practically unenforceable.

Looking at the matters of Biostar 2, Clearview AI and the AFP, we can see biometric data has not been safeguarded adequately and that, despite public concern and legal intervention, organisations remain at liberty to collect data for commercial purposes. Furthermore, without clear guidelines for law enforcement’s treatment of biometrics, what they can and cannot do when attempting to solve cases remains uncertain. So, what if biometric data was considered the property of the individuals from whom it was sampled? Presumably, Suprema would have kept its data much more secure if it were at risk of paying damages to over a million people. Similarly, Clearview AI’s business model would be less lucrative if it had to pay for the data it sourced for free. Let us now explore what ownership may look like using US law as an example.

IV CURRENT US LAW ADDRESSING BIOMETRICS

The first major piece of biometric legislation in the US was the Illinois Biometric Information Privacy Act 2008[21] (BIPA). BIPA was established in the wake of the bankruptcy of Pay By Touch, who were “operating the largest fingerprint scan system in Illinois”[22] at the time. With the company selling off its assets, legislators and the public suddenly realised there was no legal instrument preventing the sale of the biometric data it had collected to enable the payment services it offered. While BIPA is a landmark development of biometric law, we can see its origin stemmed from a position of reactive rather than proactive law making. This confirms that, as a society, we were unprepared for the advent of biometric data. BIPA is disappointingly silent when it comes to the biometrics of children, but we will return to this momentarily.

BIPA establishes a dichotomy between biometric identifiers and biometric information. Under the law, a biometric identifier is “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry” while biometric information means “any information, regardless of how it is captured, converted, stored, or shared, based on an individual's biometric identifier used to identify an individual”.[23] Instantly, we can see BIPA’s biometric identifiers do not attempt to catch emerging or yet to emerge biometric technologies, a common problem that can afflict reactive law making. This was addressed with the California Consumer Privacy Act 2018[24] (CCPA), which uses the phrase “includes but is not limited to”[25] when listing biometric information. This means in theory the CCPA would be more capable of protecting biometric data into the future than BIPA. (Semantically, biometric information under the CCPA appears to be the equivalent of biometric identifier under BIPA, indicating inconsistencies in language.)

BIPA grants any person aggrieved by a violation a right of action against the offending party[26]. In the case of Rosenbach v Six Flags[27] it was determined that aggrieved in the context of BIPA describes the situation that occurs “when a legal right is invaded by the act complained of”.[28] This means that no actual harm or injury is required in order for a cause of action to be initiated, only “improperly collected and stored biometric information”[29]. This is important as it hints at the possibility that biometric data belongs to an individual and, as such, must be treated with respect whether it impacts the livelihood of that individual or not. Similar to Article 5(f) of the GDPR[30], the CCPA strongly encourages corporations to secure biometric data in a manner that will prevent misuse or infiltration.

BIPA maps out how biometric identifiers and information must be permanently destroyed once the “purpose for collecting or obtaining such identifiers or information has been satisfied or within 3 years of the individual's last interaction with the private entity, whichever occurs first”[31]. Since the longer data is kept, the more susceptible it becomes to being compromised by hacking or identity theft, the placement of time constraints is appropriate. However, analysing the language here we identify a vulnerability: if a corporation used biometric data to identify an individual every time it spoke with them, all that would be required to keep the data in perpetuity would be to contact the individual every three years.
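
The loophole can be made concrete with a short sketch of the retention rule (a minimal model under the writer’s reading of the provision; the function name, the treatment of an “ongoing” purpose and the dates are assumptions for illustration):

```python
# Sketch of the BIPA destruction rule: destroy when the collection purpose
# is satisfied OR three years after the last interaction, whichever first.
from datetime import date

def destruction_deadline(purpose_satisfied: date | None,
                         last_interaction: date) -> date:
    """Earlier of purpose satisfaction and last interaction plus 3 years."""
    three_years_on = last_interaction.replace(year=last_interaction.year + 3)
    if purpose_satisfied is None:  # purpose framed as ongoing identification
        return three_years_on
    return min(purpose_satisfied, three_years_on)

# The vulnerability: with an ongoing purpose, each fresh "interaction"
# resets the clock, so the deadline rolls forward indefinitely.
print(destruction_deadline(None, date(2022, 1, 1)))  # 2025-01-01
print(destruction_deadline(None, date(2025, 1, 1)))  # 2028-01-01, and so on
```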

The CCPA can be seen as attempting to pick up where BIPA left off. Rather than concentrate on biometric identifiers, the CCPA defines personal information as “information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household”, with biometric information included under this definition.[32] Unlike the reactive and exhaustive approach of BIPA, the CCPA considers future technologies. This is an example of how laws created from a standpoint of proactivity can be more thorough than reactive laws.

Reminiscent of the relationship between Articles 26 and 29 of the GDPR[33], the CCPA establishes responsibility for companies with respect to both the biometric information they store and the data they share with other companies[34]. This means companies who have had information shared with them must delete the data if the first company is instructed to do so by the individual. Like BIPA, this hints at ownership. One concerning element of the CCPA, however, is the authorising of businesses to offer financial incentives for the collection of personal information[35]. While this reinforces the concept of ownership generally, it pulls the emphasis away from the individual, back towards the corporation. Further, by offering incentives for biometric information, corporations will be able to target specific groups based on their appeal to that incentive. For example, a financial incentive that involved a discount on children’s toys would result in parents offering their own and their children’s data.

Unlike BIPA, the CCPA does specifically address the biometrics of children; however, in the view of the writer this component of the law is underwhelming and inadequate. Under the CCPA, a business shall not sell the personal information of a consumer that is less than sixteen years of age “unless the consumer’s parent or guardian has affirmatively authorised the sale of the consumer’s personal information.”[36] Again, we see here transference of ownership of the data being a key theme in the CCPA. This mechanism also fails to recognise the value of data beyond its point of sale; for example, under the law, corporations can still target the collection of the biometrics of children, as long as they do not sell it. The data itself could be used to inform business decisions and lead to targeted advertising campaigns based on the physical characteristics of children who have no input into, or understanding of, how their bodies have been commodified.
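
The gap the writer identifies can be shown in a simplified sketch of the provision (the data model and names are hypothetical, and the statute’s finer age gradations are deliberately omitted): the sale of a child’s data is gated, but its collection and internal use are not.

```python
from dataclasses import dataclass

@dataclass
class Consumer:
    age: int
    affirmative_opt_in: bool = False  # parent/guardian authorisation of sale

def sale_permitted(c: Consumer) -> bool:
    """CCPA-style gate: under-16 data may not be sold without opt-in."""
    return c.age >= 16 or c.affirmative_opt_in

def collection_permitted(c: Consumer) -> bool:
    """The essay's criticism: this provision does not gate collection at all."""
    return True

child = Consumer(age=9)
print(sale_permitted(child))        # False -- sale blocked
print(collection_permitted(child))  # True -- targeting and profiling remain open
```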

It is clear that both BIPA and the CCPA have been drafted from different perspectives. As the first attempt at a comprehensive law on biometrics, BIPA, while detailed, lacks foresight. The CCPA, having sharpened the tools used to create BIPA, has been generated from a business-conscious perspective, tipping the balance towards the sale and purchase of biometric data and away from the security of the individual. While the CCPA covers a broader range of issues with respect to biometrics, its focus on the facilitation of commercial exchange is clear.

While both BIPA and CCPA touch on the ownership of data, they fail to recognise the nature of this ownership specifically. Both laws could be strengthened through the clarification of what type of ownership this is. It has been suggested that the individual’s rights over their biometric data could be construed as a property right[37]. Such an approach was elegantly proposed by Ghelardi:

Given individuals’ continued interests in their immutable biometric information, courts could interpret consent to collect, store, or use biometric information as a kind of lease...Not only would this label make interpretation simpler for courts, but it would also emphasize that the collected information still fundamentally belongs to the individual rather than the company.[38]

The leasing of data would certainly ensure companies like Suprema and Clearview AI re-examine their business models! This way of framing biometric data is an attempt to concretise the individual’s ownership of their biometric data and is welcomed by the writer – but does it go far enough? A lease would still grant a leaseholder licence to use the property as they please. The phenomenon of biometric data is so unique that a new form of conceptualising ownership is required. Perhaps the solution lies in thinking of data not just as a form of physical property, but also as a form of property of the intellect.

Intellectual property (IP) is a component of property law that grants ownership rights to the creator of an idea. Facets of IP differ from country to country; however, one prevalent component of IP is that of copyright. In the UK, for example, the Copyright Designs and Patents Act 1988[39] (CDPA 1988) provides the copyright owner with “property rights and the ability to prohibit certain uses of their works by others... preventing others from [doing] what only the copyright owner is authorised to do”.[40] Under ss 16(1)(ba) and (d) of the CDPA, the copyright owner has the exclusive right to rent the work to the public or to communicate the work to the public respectively.[41]

A striking paper by O’Connell and Bakina explored the potential for UK victims of “revenge pornography”[42] to use the exclusive rights afforded to them by copyright law to prevent images or videos that featured them from being disseminated or published[43]. In their analysis, the authors noted that joint ownership could be a problem in the context of “revenge pornography”, as the creator of the images may not be the person depicted in them, thus creating friction over who possesses the copyright. In the context of biometric data, there is no such stumbling block, as the creator of the biometric information is undoubtedly the person from whom the unique attributes were copied. (However, technically speaking, it should be noted that UK law demands a certain amount of creativity for copyright to be invoked[44], which would make strict application of copyright law to biometric data problematic.)

Now, what if we were to fuse copyright ownership as suggested by O’Connell and Bakina with the lease idea put forth by Ghelardi? Let us refer to this ownership fusion concept as bio-property. Bio-property would grant individuals ownership with respect to how their data is used and would also enable the leasing of data to organisations for specific purposes. This would concretise and demystify the nature of ownership.

The concept of bio-property could be fluid and move between a copyright formulation and a property right formulation depending on the nature of the relationship between the individual and the data collector. For example, a leasing or licensing agreement could be used for identification and authentication purposes. Upon expiration of the lease, the data would effectively be destroyed and returned to the individual. If the data was misused or distributed without consent, the individual could rely on the copyright aspect of bio-property and exercise their rights for removal or seek remedies for breach. This would shift the emphasis away from corporations offering financial incentives for data and towards individuals marketing their own data. Bio-property would therefore be akin to owning the copyright to an exclusive vehicle while having physical possession of it simultaneously. In a world where the economy of biometrics may be unavoidable, this is an ethical outcome.
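
As a thought experiment only, the writer’s bio-property concept might be modelled as a data structure (all names and fields are hypothetical; no existing legal or technical scheme is being described): a lease that expires and reverts to the owner, carrying a copyright-style takedown right for its duration.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BioPropertyLease:
    owner: str        # the individual the data was sampled from
    lessee: str       # the organisation holding the data
    purpose: str      # e.g. "workplace authentication"
    expires: date
    destroyed: bool = False

    def active(self, today: date) -> bool:
        """The lessee may use the data only within the agreed term."""
        return not self.destroyed and today <= self.expires

    def expire(self, today: date) -> None:
        """On expiry the data is destroyed and control reverts to the owner."""
        if today > self.expires:
            self.destroyed = True

    def takedown(self) -> None:
        """Copyright-style exclusive right: the owner may demand removal on misuse."""
        self.destroyed = True
```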

To be clear, this paper is not arguing that biometric data be forced into conformity with an existing conception of IP. Such an attempt would be to repeat the errors we have already seen with respect to traditional privacy conceptions. Rather, it confirms that viewing biometrics through the lenses of both privacy and ownership is fertile thinking ground. When developing a framework for the ownership of bio-property, policy makers would be well placed to work with IP law in conjunction with privacy law.

V THINK OF THE CHILDREN

Article 23 of the 1989 UN Convention on the Rights of the Child recognises that children with disabilities should enjoy a “full and decent life” with dignity, self-reliance and “active participation in the community”.[45]

We have previously noted that human beings are establishing relationships with technology at a younger age than ever before. Just a cursory glance around your local supermarket brings the sight of dozens of children operating mobile phones, electronic tablets and the like, many of whom use facial recognition and other biometrics to ‘unlock’ these devices. In the view of the writer, this has created a sub-economy of juvenile behavioural data, free flowing into the hands of actors unseen. Some may argue it is the responsibility of parents to manage children’s screen time. This argument should be dismissed: children take their devices everywhere, they use them on public transport, are constantly online at school and have regular, unsupervised access to the internet. Whether parents like it or not, children are free to share their personal data with algorithms with more reckless abandon than adults.

A unique study involving workshops with children aged 9–12 years was conducted in 2019 by Milkaite et al in Ghent, Belgium[46]. Guided by Article 12 of the GDPR, which specifically requires “concise, transparent, intelligible”, “easily accessible” communication in “clear and plain language” for “any information addressed specifically to a child”[47], the workshops involved actively engaging children in discussions relating to data processing and privacy. The activities took a co-design format where the children worked with moderators to discuss what types of communication they would best understand. It was determined that the children had an understanding of interpersonal privacy, in the sense that they were aware of what their peers, parents and strangers might see online and how this could impact them. However, it was “clear that commercial and governmental data processing is not something children know much about.”[48] This illustrates a lack of understanding of the ramifications of children sharing their biometric data. Consequently, the study called for children’s views to be considered by data actors when formulating policy, for the deresponsibilisation of parents and children, and for the responsibilisation of data controllers, concluding that data controllers must increase their efforts to specifically and clearly communicate appropriate data literacy to children.[49]

The Ghent study identified an absence of maturity-appropriate education for children with respect to privacy and data processing. In the view of the writer, the study itself exposed a potentially dangerous situation in the world of tomorrow, that is: who is responsible for educating children (or adults for that matter) with respect to this issue? As a society, we have become suspicious of how our data is captured and treated; children do not possess this intellectual luxury. Can we trust data collectors to educate impartially? As adults we can already see how corporations such as Google ask for more personal data under the guise of increasing security. While adding a mobile phone number to an email account does increase security in some way, it also does more than this – it increases the overall value of the dataset already shared with the organisation. Let us call this the ‘illusion of security argument’.

Returning specifically to biometrics, as children become familiar with technology at younger ages, the act of offering their biometric data is becoming normalised. Since digital platforms do what they can to maximise the amount of time spent using their services[50], they have a specific commercial interest in targeting children, who will spend an increasingly significant percentage of their lives online. This increased amount of time, plus their general ignorance with respect to data use, will render children disproportionately vulnerable to the illusion of security argument.

The susceptibility of children to data manipulation is particularly complex in the context of disability advocacy. Many children with disabilities rely on “digital content, formats, and platforms to undertake many aspects of their daily home life”.[51] These technologies may often require biometrics, such as voice recognition or movement sensing, to ensure basic household needs are met. Children with certain types of disability are therefore dependent on biometric data collection and processing by default. Furthermore, in many institutions, disability is a status requiring registration before accommodations are granted, which further propels these children into situations where data collection is required. Yet, despite this, children with disabilities have been overlooked in the broader discussion of privacy and data management[52].

Adding a further layer of complexity, we find the concept of sharenting. This phenomenon was discussed in a poignant article by Goggin and Ellis exploring sharenting in the context of privacy and human rights[53]. Sharenting occurs when parents deliberately share images or other personal information of their children with disabilities online[54], often with the intention of connecting with a community or seeking support. Given that social media encourages photographic and videographic content, parents inadvertently make public their children’s biometrics with a view to promoting empowerment. The challenge here is that parents often assume ownership of this private information and do not necessarily understand the broader privacy implications of sharing it. Furthermore, and perhaps ironically, this can be viewed as being in tension with Article 7 of the 2006 UN Convention on the Rights of Persons with Disabilities, which recognises that children with disabilities “have the right to express their views freely on all matters affecting them...on an equal basis with other children.”[55]

Of course, this essay is not criticising the actions of hopeful parents looking to improve and advocate for their children. Rather, what we are exposing here is an area of privacy and the law that requires careful consideration, especially if a situation is reached whereby children with disabilities grow into adults seeking legal recourse from parental posts “discovered by college admissions officers and future employers, friends and romantic prospects”.[56] The fact is that children with disabilities are compelled towards surrendering their biometric data more strongly than other children. To compound the matter, laws that allow the monetisation of children’s biometrics through parental consent further encourage sharenting as a commercial enterprise, and this area requires further legal consideration. A question ripe for consideration here, then, is: what percentage of the three billion images contained in Clearview AI’s database are taken from posts of well-meaning parents sharing images of their children with disabilities?

VI CONCLUSION AND RECOMMENDATIONS

Let us now bundle the points explored above into specific recommendations.

1) Altruistic or Medicinal Use of Biometrics

We have seen why biometric data presents such a difficult problem. The issue becomes increasingly complex when we look at how children participate in the Information Society. Children, as heirs to the world of tomorrow, must be a focal point of any law or policy that concerns biometrics. Much of the discussion around biometrics focuses on how data can be used to facilitate ethical commercial enterprise, and this need not be the case. A recent study used a dataset of children’s voice and heart rate data with a view to developing preventive measures regarding situations dangerous to children[57]. Measures like this, which focus on an altruistic or medicinal use of biometrics, could be used to inform the current discourse on data generally. Another example would be the use of a database such as Clearview AI’s to proactively source and remove images used without consent. Biometric systems will inevitably become more advanced in time, and there is the possibility that scanning programs will be able to detect the presence of physical infirmities or ailments. Put another way, “while biometric data is typically used to recognise individuals, it is possible to deduce other types of attributes of an individual from the same data.”[58] If this is the case, the law will need to map what responsible use of this technology would look like. Exploring this hypothetical scenario in depth is beyond the scope of this paper; however, the point that data captured to facilitate a commercial transaction can be used for altruistic or medicinal purposes is a salient one.

2) Uniformity of Biometrics

In our analysis we noted that BIPA and the CCPA approached biometrics from different perspectives. It must also be acknowledged that both are US state laws, applying to persons who reside in either Illinois or California. While other states are enacting similar legislation in the US and elsewhere, given the borderless nature of the internet and digital technology, federal action is required to ensure consistency. A Bill currently before Congress seeks to unify these developments in biometrics into a coherent whole and features both a cause of action and parental/guardian controls over the obtainment of data[59]. The Bill, then, can be thought of as a union of BIPA and the CCPA; however, as it is yet to be enacted, the path of tomorrow remains unclear. Given the global nature of data, national laws, while a step in the right direction, will still lead to inconsistencies in a borderless digital world. Therefore, there exists a need for a global understanding in the world of tomorrow. If biometric data is considered something as inalienable as a human right, achieving such international consistency will become easier.

3) Clarification on the Ownership Problem

We have observed laws handling biometrics that hint at individual ownership but struggle to conceptualise it as something separate from privacy. This paper has suggested a new form of property, bio-property, be conceptualised. This area requires the attention of lawyers and policy makers moving forward. Current laws do suggest individuals own their data, but what does this mean when the data is copied, stored, traded and exploited by a variety of parties without the knowledge or consent of the individual? Clear boundaries with respect to ownership should be the starting point from here, with commercial use of the data to follow – not the other way around.

With respect to children, the concept of ownership becomes murkier still. We observed the potential consequences of sharenting and its ramifications later in life for children with disabilities. This danger is also present for children who do not have disabilities. Laws such as the CCPA, while addressing the use of data, provide no guidance on what happens to the data of children once they come of age. Theoretically, they should then own this data, but how would this work if a child has had over a decade of data captured and exploited by corporations already? Perhaps situations like this could be remedied through the use of trusts; however, this could add a further layer of complexity that may not be required. As children will only become more entrenched in technology moving forward, new laws need to address ownership specifically in this context with urgency.

4) Appropriate Guidelines for Law Enforcement

The Clearview AI and AFP case exposed the need for law enforcement to take a measured approach when using biometrics to solve cases. There exists a very real need for checks and balances on law enforcement use of biometrics. Public concerns about mass surveillance appear warranted in this context. One solution to this problem would be to create a statutory body that governs when law enforcement is allowed to consider the use of biometric data. This body could establish criteria for an application or permit system for biometric use. A measure like this would have prevented the AFP from engaging Clearview AI in the first place.

The writer would also like to suggest that biometric data could be used by law enforcement to ban cybercriminals from using technology with which they have caused harm previously. This would work similarly to an alcohol interlock device that prevents a car from starting unless a blood alcohol concentration of zero is recorded. In the context of cybercrime, if a particular website were only accessible using biometrics, then cybercriminals could theoretically be permanently banned from entering the site.
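
A minimal sketch of how such an interlock might operate (entirely hypothetical; matching is reduced to exact template equality purely for illustration) is:

```python
# Hypothetical biometric interlock: a gated site checks each visitor's
# template against a register of banned offenders before granting access,
# much as an alcohol interlock tests the driver before ignition.
BANNED_TEMPLATES: set[bytes] = set()

def register_ban(template: bytes) -> None:
    """Record an offender's biometric template on the ban register."""
    BANNED_TEMPLATES.add(template)

def may_enter(presented: bytes) -> bool:
    """Deny access where the presented biometric matches a banned template."""
    return presented not in BANNED_TEMPLATES
```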

This paper has advocated for a new way of thinking about the ownership of biometric data and exposed the need for children to be legally protected when interacting online. The issue is important because traditional notions of privacy are ill-equipped to handle the phenomenon of biometrics, a phenomenon that will increase exponentially in the world of tomorrow.


[1] Kaiser, Brittany, Full Address and Q&A | Oxford Union (https://www.youtube.com/watch?v=AgBHfmf2JhQ)

[2] Advisory report on the Identity-matching Services Bill 2019 and the Australian Passports Amendment (Identity-matching Services) Bill 2019 (see: https://www.aph.gov.au/Parliamentary_Business/Committees/Joint/Intelligence_and_Security/Identity-Matching2019/Report)

[3] Biometrics Institute, Biometrics Institute Ltd <www.biometricsinstitute.org> at 5 May 2008; Organisation for Economic Co-operation and Development, Biometric-Based Technologies (2004), 10–11; Council of Europe, Progress Report on the Application of the Principles of Convention 108 to the Collection and Processing of Biometric Data (2005), [16].

[4] Privacy Act 1988 (Cth) No. 119, 1988 as amended

[5] Universal Declaration of Human Rights, Article 12

[6] Article 5(2), General Data Protection Regulation (EU) 2016/679 [hereafter GDPR]

[7] Taylor, Josh, Major breach found in biometrics system used by banks, UK police and defence firms (https://www.theguardian.com/technology/2019/aug/14/major-breach-found-in-biometrics-system-used-by-banks-uk-police-and-defence-firms)

[8] Articles 5(b) & (e), GDPR

[9] Montag et al, The Rise of Biometric Mass Surveillance in the EU, European Digital Rights (see edri.org)

[10] Organisation for Economic Co-operation and Development, Biometric-Based Technologies (2004), 12–13.

[11] According to their official website (see www.clearview.ai)

[12] Goldenfein, Jake, Australian police are using the Clearview AI facial recognition system with no accountability, The Conversation, 3 March 2020

[13] Australian Information Commissioner, Commissioner initiated investigation into Clearview AI, Inc. (Privacy) [2021] AICmr 54 (14 October 2021)

[14] APP 3.3 and 3.4

[15] APP 3.5

[16] APP 5

[17] APP 10.2

[18] Australian Information Commissioner, Commissioner initiated investigation into the Australian Federal Police (Privacy) [2021] AICmr 74 (26 November 2021)

[19] Clause 12 Privacy (Australian Government Agencies – Governance) APP Code 2017

[20] Ibid.

[21] Biometric Information Privacy Act 2008, 740 ILCS 14 [hereafter, BIPA]

[22] Insler, Charles N, How to Ride the Litigation Rollercoaster Driven by the Biometric Information Privacy Act, 43 S. ILL. U. L.J. 819 (2019).

[23] BIPA.

[24] SB-1121 California Consumer Privacy Act, Cal. Civil Code 1798 [hereafter, CCPA].

[25] Ibid. 1798.140(b)

[26] BIPA, 740 ILCS 14/20 Sec. 20.

[27] Rosenbach v. Six Flags Entertainment Corporation, 2019 IL 123186

[28] Ibid, 30.

[29] Ibid.

[30] Article 5(f) GDPR

[31] BIPA, 740 ILCS 14/15 Sec. 15(a).

[32] CCPA, Cal. Civil Code 1798.140(o)(1).

[33] Articles 26 & 29 GDPR

[34] CCPA, Cal. Civil Code 1798.105(c)

[35] CCPA, Cal. Civil Code 1798.125(b)(3)

[36] CCPA, Cal. Civil Code 1798.120(c)

[37] Ghelardi, Eva-Maria, ‘Closing the Data Gap: Protecting Biometric Information under the Biometric Information Privacy Act and the California Consumer Protection Act’ (2021) 94(3) St John’s Law Review 869

[38] Ibid.

[39] Copyright Designs and Patents Act 1988 (UK)

[40] O'Connell, Aislinn and Ksenia Bakina, ‘Using IP Rights to Protect Human Rights: Copyright for 'revenge Porn' Removal’ (2020) 40(3) Legal Studies 442

[41] Copyright Designs and Patents Act 1988, s 16

[42] A cultural phenomenon where one party, or parties, distribute(s) explicit material of another party without their consent.

[43] O'Connell, Aislinn and Ksenia Bakina, ‘Using IP Rights to Protect Human Rights: Copyright for 'revenge Porn' Removal’ (2020) 40(3) Legal Studies 442

[44] Painer v Standard Verlags GmbH [2012] Case C-145/10 ECDR 6 paras 120–124.

[45] United Nations Convention on the Rights of the Child, 1989, Article 23

[46] Milkaite, Ingrida et al, ‘Children’s Reflections on Privacy and the Protection of Their Personal Data: A Child-Centric Approach to Data Protection Information Formats’ (2021) 129 Children and Youth Services Review 106170

[47] Article 12, GDPR

[48] Milkaite et al (n 46).

[49] Ibid.

[50] Stewart, James B, Facebook Has 50 Minutes of Your Time Each Day. It Wants More. New York Times 6 May 2016 (see: https://www.nytimes.com/2016/05/06/business/facebook-bends-the-rules-of-audience-engagement-to-its-advantage.html)

[51] Goggin, Gerard and Katie Ellis, ‘Privacy and Digital Data of Children with Disabilities: Scenes from Social Media Sharenting’ (2020) 8(4) Media and Communication 218

[52] Ibid.

[53] Ibid.

[54] Ibid.

[55] United Nations Convention on the Rights of Persons with Disabilities, 2006, Article 7

[56] Kamenetz, Anya, The problem with ‘sharenting’, The New York Times, 5 June 2019 (https://www.nytimes.com/2019/06/05/opinion/children-internet-privacy.html)

[57] Kim, Tae-Yeun, Libor Měsíček and Sung-Hwan Kim, ‘Modeling of Child Stress-State Identification Based on Biometric Information in Mobile Environment’ (2021) 2021 Mobile Information Systems

[58] Dantcheva, Antitza, Petros Elia and Arun Ross, ‘What Else Does Your Biometric Data Reveal? A Survey on Soft Biometrics’ (2016) 11(3) IEEE Transactions on Information Forensics and Security 441

[59] S.440 National Biometric Information Privacy Act of 2020

