
University of New South Wales Law Journal Student Series



Dumas, Annabella --- "Somebody That I Used To Know: The Reform Implications Of Inserting A Right To Be Forgotten Into The Privacy Act 1988 (Cth)" [2023] UNSWLawJlStuS 3; (2023) UNSWLJ Student Series No 23-3


SOMEBODY THAT I USED TO KNOW: THE REFORM IMPLICATIONS OF INSERTING A RIGHT TO BE FORGOTTEN INTO THE PRIVACY ACT 1988 (CTH)

ANNABELLA DUMAS*

I INTRODUCTION

The right to be forgotten...like a balloon...appears grandiose at first, but is easily pierced when looked at more closely. It obtains meaning, only by what people make of it.[1]

- Jef Ausloos

The Australian Competition and Consumer Commission’s (‘ACCC’) proposal for a Right to Be Forgotten (‘RtBF’) attempts to restore the agency of data subjects against the rise of digital platforms.[2] The proposal is modelled on article 17 of the General Data Protection Regulation (‘GDPR’), which provides a ‘right to delete information about oneself, even if published by a third-party’.[3] Derived from European human rights discourse, article 17 conceives of the individual as a master of their own memory. It has been termed a ‘right to change one’s mind’ and perhaps most poignantly, a ‘right to repentance’,[4] reflecting underlying ideas about the dynamic nature of the relationship between public interest and privacy over the passage of time. This is in sharp contrast to the ‘data minimisation approach’ currently adopted by the Australian Privacy Act 1988 (Cth) (hereafter, the Privacy Act). The Privacy Act relies upon a more static interpretation of information that protects against objective factual inaccuracy and operative redundancy.[5] Problematically, the Australian regulatory framework is not aided by a codification of rights at Commonwealth level. In light of these complexities, in 2020 the Australian government published an Issues Paper calling for submissions on the proposal as part of a broader review of the Privacy Act.[6] The purpose of this paper is to identify and critically discuss the reform implications of introducing a RtBF as proposed by the ACCC.

Part II outlines the current legislative framework and the limitations of the existing Australian Privacy Principles (‘APPs’). It proceeds to discuss the nuances of the ACCC proposal, which would effectively transpose article 17 of the GDPR into the Privacy Act. Part III critically examines the reform implications of transposing such a right into the Australian legal context, including reform to complaint mechanisms under the Act; the scope of the Act’s objects clause; and definitional aspects of the journalism exemption. Part IV introduces a ‘minimalist’, consent-based alternative to the ACCC proposal. However, if the minimalist alternative is pursued and the RtBF is to be more than a ‘decision rule of near universal non-intervention’,[7] then this minimalist model will need to be complemented by comprehensive reform to the way consent is conceptualised under the Act. Ultimately, it is impossible to adjudicate upon the merits of a RtBF without first setting the parameters of broader reform to the Act, because the right derives its content from the legislative context in which it operates.

II DIGITAL SHADOW: THE AUSTRALIAN PRIVACY PRINCIPLES AND THE AUSTRALIAN COMPETITION AND CONSUMER COMMISSION’S PROPOSAL FOR A RIGHT TO BE FORGOTTEN

A Limitations of the Australian Privacy Principles

The Privacy Act partially implements into domestic law Australia’s obligations under article 17 of the International Covenant on Civil and Political Rights (‘ICCPR’), which protects individuals from ‘unlawful interferences’ with their privacy.[8] Reproduced in sch 1 of the Act, the Australian Privacy Principles (‘APPs’) regulate the ‘collection, use and disclosure of personal information’ by APP entities.[9] An APP entity refers to any federal agency or officeholder, or any private organisation.[10] Any information or opinion relating to a reasonably identifiable individual will constitute ‘personal information’.[11] In a recent Discussion Paper, the Law Council of Australia has stated that there is limited support amongst its membership for a RtBF,[12] because APP 13 already provides a ‘right to correction’,[13] and APP 11.2 places an obligation on APP entities to ‘destroy or de-identify...information’ that is no longer required for the purpose for which it was collected or disclosed.[14] The prominent Australian media organisation Nine News has made submissions to similar effect.[15] However, it is important to distinguish between claims that these principles provide adequate protection, and claims that these principles, taken together, provide protection analogous to the proposed RtBF. Whilst the former is open to debate, the latter cannot stand. APP 13 represents a more onerous regime than that which subsisted under the previous bifurcated system, made up of the Information Privacy Principles and National Privacy Principles.[16] Rather than requiring individuals to prove that the information is inaccurate or incomplete, APP 13 shifts the evidentiary onus onto the APP entity to ensure that information is accurate, complete and current with respect to the purpose for which it was collected.[17] Individuals may request rectification of information provided, which is complemented by the right to access personal information in APP 12.[18] In respect of APP 11.2, which imposes an obligation to destroy or de-identify information that is no longer necessary for the purpose/s for which it was collected, it is open to argue that in the new data economy ‘it is impossible to foresee all potential uses of personal data’.[19] At best, proposed use is increasingly ‘abstract, distant and uncertain’, which invariably impacts the protective potency of APP 11.2.[20] In contrast, the RtBF, as it has been proposed by the ACCC, is not exclusively concerned with information provided by the individual. Nor is it exclusively concerned with information rendered factually erroneous by virtue of inaccuracy, incompleteness or lack of contemporaneity, or with information which is no longer necessary for the purposes for which it was collected. Rather, the ACCC proposal is concerned with information that is accurate and complete, and where the putative collection purpose continues to exist, but where the public interest no longer justifies the invasion of privacy. The APPs, and in particular APPs 13 and 11.2, maintain Australia’s ‘data minimisation approach’ to the processing of personal data.[21] There no doubt exist many advocates for such an approach, but it is nonetheless important not to conflate, however convenient, the subjective adequacy of APPs 13 and 11.2 with their objective scope.

B The Australian Competition and Consumer Commission’s Proposal

In light of the relatively limited scope of APPs 13 and 11.2, and the growing asymmetry in bargaining power between individuals and data processors, in its 2019 Digital Platforms Inquiry – Final Report the ACCC recommended that a new APP be introduced to provide for a RtBF.[22] The ACCC’s recommendation ‘broadly aligns’ with article 17 of the GDPR,[23] which has its genesis in the earlier case of Google Spain.[24] In 1998, the Spanish newspaper La Vanguardia published a notice that Mario Costeja González’s property was being auctioned due to unpaid debts.[25] Costeja, who had since ceased to be a debtor, sought to invoke Data Protection Directive 95/46 to argue for the deletion of the notice from the La Vanguardia archive, and the deindexation of the notice from Google search results.[26] In May 2014, the Court of Justice of the European Union (‘CJEU’) handed down its judgment, ruling against Google and in favour of deindexation, but not disturbing the decision of the Spanish court at first instance not to alter the newspaper archive.[27] Google was deemed to be a ‘data controller’, and thus subject to Directive 95/46, because the indexing, storing and making available of data constituted ‘processing’.[28] Directive 95/46 provided a right to erasure where the information was ‘inadequate...no longer relevant...or excessive...in the light of the time that has elapsed’.[29] The Court held that the right must be exercised proportionately, requiring that a ‘fair balance’ be struck between protected rights.[30] Specifically, these are the rights provided for under article 8 (respect for private and family life) and article 10 (freedom of expression) of the European Convention on Human Rights (‘ECHR’). By extension, this balancing exercise invokes article 8 of the Charter of Fundamental Rights of the European Union, which is parasitic on article 8 of the ECHR and provides an autonomous right to data protection.[31] The Google Spain decision is in many ways the origin story of what commentators, and indeed legislators, have come to call the RtBF.

The GDPR came into force in May 2018.[32] Referred to as ‘a codification of the right to be forgotten’, article 17 provides for a right of erasure under which the data subject may object to the processing of personal information, unless the data controller can establish legitimate grounds for overriding the individual’s right to privacy.[33] If they cannot, they must erase the information without ‘undue delay’.[34] In reality, the GDPR expands the right to extend not merely to deindexation, but to ‘removal of the data itself’; although in practice, applicants overwhelmingly seek the former remedy.[35] This is because erasure of the data in its initial context would similarly need to represent a ‘fair balance’ between freedom of expression and the public interest, on the one hand, and the right to privacy, on the other. The GDPR makes explicit reference to freedom of expression; however, in the absence of any a priori hierarchy, and in a rather Dworkinian way, the content of a right is defined by the content of the right with which it conflicts.[36] Thus, the GDPR does little more than import the balancing exercise undertaken in Google Spain. Consequently, Edward George has referred to the RtBF as essentially ‘a restraint on how widespread information can disseminate’.[37] There are exceptions for ‘information used for journalistic purposes; academic, artistic or literary expression; or statistical, scientific or historical research’, and the right does not apply where the information is retained for contractual performance or by operation of law.[38] The ACCC proposal essentially mimics article 17 of the GDPR. In effect, the proposed RtBF requires ‘APP entities to erase...personal information...without undue delay on receiving a request from the consumer...unless the retention...is necessary for the performance of a contract...required under law, or...otherwise necessary for an overriding public interest reason’.[39] This is a vexed attempt to transpose legal developments derived from European rights discourse into the Australian legal context. The purpose of this article is not to adjudicate upon the merits of the proposal, but to critically consider the reform implications of such a transposition.

III THE RIPPLE EFFECT: MAPPING THE REFORM IMPLICATIONS OF INSERTING A RIGHT TO BE FORGOTTEN INTO THE PRIVACY ACT 1988 (CTH)

A The Balancing Exercise and the ‘Hybridisation of Governance’

NT1 and NT2 is ‘the most high-profile consideration of the right to be forgotten in a common law jurisdiction’.[40] It provides a recent illustration of the balancing exercise described in Part II, which must inform legislative change to the Privacy Act. Whilst the judgment precedes the introduction of the GDPR, it is considered an instructive example of how courts will decide delisting requests made under article 17 or, more broadly, under a GDPR-modelled RtBF.[41] NT1 and NT2 are the anonymisations of two businessmen who applied to the High Court of England and Wales to have Google search results, which revealed previous criminal convictions, delisted. Each objected on the basis that the information was no longer relevant, and that there was insufficient public interest to override their right to privacy. Justice Warby reiterated the need to balance competing rights, stating that ‘neither privacy nor freedom of expression has...precedence over the other’.[42] Handing down separate rulings, Justice Warby held that because NT1 had been convicted of a business crime and continued to conduct business activities, including lending money, the information remained of interest to that section of the public with whom he may do business.[43] This adopts a particularly broad definition of ‘public figure’, which extends to ‘business figures’ and anyone with ‘media exposure’.[44] It was persuasive that NT1 had failed to show contrition for his crime.[45] In contrast with NT2, the Court could not be satisfied that there was no ‘plausible risk of...[NT1] re-offending’.[46] Furthermore, the Court was not convinced that Google’s continued processing caused the harm NT1 alleged, postulating that the ‘interference with his privacy rights, would have occurred irrespective of Google’s actions’.[47] Citing Google Spain, the Court affirmed that where the claimant can show harm to their rights, this will be a persuasive factor in favour of granting a delisting order.[48] In respect of NT2, the conviction was not a dishonesty offence, he had been remorseful and pleaded guilty, and the conviction was unrelated to his current business activities.[49] Thus, the Court found that NT2 had established a compelling case of harm, which was not outweighed by the public interest.[50] This variation in outcome emphasises the fact-specific nature of any RtBF assessment. Judicial precedent has established a set of factors which may be persuasive, such as a compelling case of harm and, in the context of prior convictions, a demonstration of contrition. However, in and of themselves, none of these factors will resolve the interest conflict one way or another. The weight to be attributed to these factors will vary based on the facts of the case. Moreover, many fundamental ambiguities remain.
For example, if the definition of ‘public figure’ is to be so broad, then what distinction is to be made between ‘A-lister’ celebrities, politicians, and businessmen like NT1 and NT2?[51] Whilst such balancing exercises are not new to courts, or to the common law (most notably in the Australian legal system, under the implied freedom of political communication),[52] private organisations to whom RtBF applications are made under the GDPR lack the independence to undertake such a subjective exercise.[53] As Joanna Connolly writes, commercial data controllers like Google ‘are far more likely to be guided by their own commercial interests and err towards erasing material to avoid legal challenges and monetary damages’ – at the expense of freedom of expression.[54] Notwithstanding the administrative burden of processing RtBF requests in terms of economic cost, the requirement to act without ‘undue delay’ is unlikely to produce the careful consideration mandated by NT1 and NT2.[55] Consequently, the introduction of a GDPR-modelled RtBF requires an independent body to process requests at first instance, to avoid what has been lamented in Europe as a ‘hybridisation of governance’.[56]

Whilst the Privacy Act provides multiple complaint pathways, there is a requirement that the individual first direct their request to the relevant APP entity.[57] If an individual is not satisfied with the way their request has been handled, they may complain to the Office of the Australian Information Commissioner (‘OAIC’); however, the Information Commissioner may only investigate if the request was first made to the respondent.[58] It is manifest that this procedure would not be fit for purpose with respect to RtBF assessments. Even in Europe, where the decision of the data controller can be appealed directly to the CJEU, scholars have criticised the emergence of private actors as stakeholders in, rather than subordinates to, the definition of the ‘rules and procedures’ from which the RtBF derives its content.[59] The European experience provides a learning point for Australian legislators. Google itself submitted, in response to the Issues Paper, that requests should be made to an ‘independent judicial or regulatory authority’.[60] This authority would need to provide for ‘impartiality/neutrality, audi alteram partem (hearing both sides), and judicial review’.[61] The obvious body in Australia would be the OAIC, effectively transforming the OAIC into the ‘main centralised decision maker’.[62]

Under the current system, OAIC decisions can be appealed to the Federal Court or Federal Circuit Court if found to be legally incorrect, as well as being subject to merits review by the Administrative Appeals Tribunal.[63] However, given that Google received 12,000 deletion requests on the day Google Spain was handed down, this transformation into a centralised decision maker would impose an ‘enormous administrative burden’ on the OAIC.[64] Alternatively, Michel Reymond has suggested that a designated body play a ‘filtering’ role.[65] This body would ‘filter out’ and decide those requests that have ‘no prima facie case...[or which] address pure data protection [cf public interest] issues or...where the delisting criteria clearly point towards a solution’.[66] Complex cases would be forwarded to the Federal Court for determination. Whether the designated body is the OAIC or a body established purely for this purpose via a special legislative instrument, this so-called ‘filtering out’ would still, in absolute terms, attract a high administrative cost with respect to both time and resources. A less radical approach would be to remove the requirement to act without undue delay (generally interpreted as within one month of receiving the request), instead specifying a minimum time period, and to cap penalties for noncompliance at below the current maximum for breaches of the APPs.[67] These reforms may encourage more careful consideration by private actors that does not favour risk minimisation, but they are not a guarantee of neutrality. Notably, there would still be a lack of due process. As Jean-Marie Chenou and Roxana Radu so aptly observe, in digital markets there is no ‘transparent space for consultations between...actors’, and thus no real or meaningful opportunity for both sides to be heard, even if such a procedural requirement were to be mandated.[68] If Australia is to do justice to the balancing exercise the ACCC recommendation seeks to import, it must learn from the European experience and ensure that requests are dealt with by an impartial body at first instance, acting either as a centralised decision maker or assuming a filtering-out role. The latter may represent a compromise in the division of responsibility, and thus resource utilisation, between the executive and judicial levels. Regardless of how the administrative burden is divided between the arms of government, there are significant implications in terms of resourcing. Therefore, policymakers need to recognise an inherent tension between justice and managerialism – a recognition hitherto absent from the government response.

B Freedom of Expression and Reform to the Objects Clause

The balancing act undertaken by the CJEU in Google Spain, and later by the High Court of England and Wales in NT1 and NT2, finds a point of equilibrium between the right to privacy and the right to freedom of expression. In Europe, these rights are expressly protected under the ECHR, and given domestic effect in England and Wales through the Human Rights Act.[69] The transposition of a right which derives meaning from human rights frameworks is problematic, given that Australian law ‘only recognises...freedom of political communication as a constraint on legislative and executive power’.[70] Chief Justice Bathurst has noted extra-curially that, far from being strangers, privacy and freedom of expression are in fact unlikely bedfellows.[71] Privacy provides the space necessary to develop autonomous thoughts ‘free from observation and interference by Big Brother, or even by a liberal democratic state’.[72] Consistent with Chief Justice Bathurst’s unlikely bedfellows thesis, limiting ‘algorithmic search results’ (in other words, deindexation) does not necessarily limit freedom of expression, as this would be to conflate freedom of expression with the right to know.[73] However, as NT1 and NT2 demonstrates, there will be cases where the sort of substantive privacy rights that the proposed RtBF seeks to protect conflict with freedom of expression. It may be argued that, because there is no express protection of freedom of expression at the Commonwealth level in Australia, RtBF assessments could be reduced to balancing the privacy of the individual against the interests of the data processor. Indeed, the objects clause of the Privacy Act suggests that the statute is concerned with finding just such a balance: the application of the Act goes in search of a balance between personal privacy on the one hand, and expedience and convenience on the other.[74] The Issues Paper sought submissions on reform to the objects clause.[75] The Centre for Media Transition (‘the Centre’) recommended the insertion of an additional object, which reads as follows: ‘to promote the privacy and autonomy of individuals in accordance with Australia’s international obligations’.[76] In its submission, the Centre suggested that this would have the effect of requiring consideration of Australia’s obligations under international law, beyond the right to privacy contained in article 17 of the ICCPR, which is already alluded to in section 2A(h) of the objects clause.[77] Specifically, it would appear to invoke consideration of article 19(2) of the ICCPR, which contains a right to freedom of expression that applies to ‘all forms of communication’.[78] Although not absolute, the right is provided for in terms that bear a strong similarity to article 10 of the ECHR.[79] On a contextual or purposive reading, then, introducing the proposed object will provide some protection to freedom of expression consistent with public interest considerations, without necessitating broader reform to Australia’s legislative framework in the human rights arena.

C The Journalism Exemption

The journalism exemption under section 7B of the Privacy Act seeks to ensure the publication of information that is in the public interest and, in doing so, to balance freedom of expression with individual privacy.[80] The provision exempts ‘media organisations’ acting ‘in the course of journalism’ from the operation of the Privacy Act.[81] For the purposes of this article, it is the ‘in the course of journalism’ limb that requires the most immediate reconsideration. As discussed in Part II, what renders the RtBF so distinct from APPs 13 and 11.2 is that it recognises how the relationship between the public interest, freedom of expression and the personal right to privacy changes with the passage of time. The journalism exemption should therefore only apply to information that is in the public interest ‘at the time when the request is received’.[82] The CJEU has interpreted ‘journalistic purposes’ as denoting a ‘public watchdog role’, where public interest is assessed at the time of the request (cf the time of publication).[83] However, in Australia, the Privacy Act does not define ‘journalism’. There is a dearth of judicial consideration of the term, and recourse to the ordinary meaning of the word has proven controversial, the term being criticised as much for its potential breadth[84] as for its potential confinement.[85] In 2008, the Australian Law Reform Commission (‘ALRC’) proposed that ‘journalism’ be defined as:

the collection, preparation for dissemination or dissemination of the following material for the purpose of making it available to the public:

(a) material having the character of news, current affairs or a documentary;

(b) material consisting of commentary or opinion on, or analysis of, news, current affairs or a documentary; or

(c) material in respect of which the public interest in disclosure outweighs the public interest in maintaining the level of privacy protection afforded by the model Unified Privacy Principles [in contemporary terms, this would refer instead to the APPs].[86]

In terms of limbs (a) and (b), this formulation is preferable to the more recent proposal made by scholars at the University of Queensland, in which journalism includes, but is not limited to, news or current affairs.[87] This is because the ‘news’ aspect, as Nine News itself has admitted, limits the exemption to information that serves a public interest purpose, rather than a public entertainment purpose.[88] Of course, there may be overlap between these two purposes, but the public interest element is retained. In respect of limb (c), the use of the term ‘outweighs’ (present tense) suggests that the public interest needs to be balanced against privacy interests whenever the exemption is invoked. Thus, if media organisations were to seek to resist the erasure of data in its initial context by relying on the section 7B exemption, they would need to establish that there exists a public interest in the information at the time of the request, as opposed to asserting that, at the time of publication, the public interest outweighed the privacy rights of the individual. The language of the ALRC proposal indicates that public interest is always a contemporaneous assessment. This definitional reform is not a complete answer to the controversies which surround the exemption, and it is likely to attract the reproach of media organisations that seek a broadening of the term ‘journalism’. However, it is effectively a condition precedent to the introduction of a RtBF in the form proposed by the ACCC.

IV A MINIMALIST ALTERNATIVE: THE CONSENT BASED MODEL

It is clear from the preceding discussion that the ACCC’s proposal for a RtBF will have a ripple effect on other provisions and procedures under the Privacy Act. Some of the most onerous reforms have been largely overlooked by the Issues Paper and associated submissions. Others are natural complements that have been well formulated in the Issues Paper and are subject to a general consensus amongst respondents – such as reform to the objects clause. Others still are controversial and likely to attract reproach from vested interest groups. In this context, Part IV considers the implications of an alternative formulation of the RtBF, which is based on earlier ALRC proposals. It emerges that, whilst narrower in scope, this is a ‘minimalist alternative’ only in relative terms as far as mapping reform implications is concerned.

A The Australian Law Reform Commission’s Proposal

In its 2014 Discussion Paper, ‘Serious Invasions of Privacy in the Digital Era’, the ALRC proposed the insertion of a new privacy principle that would, on the request of the individual concerned, require the deletion of ‘personal information provided to the entity by the individual’.[89] The ALRC was at pains to distinguish this proposal from a RtBF.[90] However, as this article has sought to show, the RtBF does not attract a fixed content. The proposal sits comfortably within the first tier of the graduated approach to the right explicated by Peter Fleischer.[91] It only applies to personal information that the individual provided to the APP entity. In other words, the individual must have expressly or impliedly consented to the collection, disclosure or use of the data. The right to seek deletion of this data essentially manifests a withdrawal of that consent. Unlike the ACCC proposal, it does not extend to personal information posted by third parties.[92] The proposal is more consistent with Australia’s ‘data minimisation’ approach.[93] The potential ‘chilling effect’ on freedom of expression is considered ‘minimal’.[94] However, as Damian Clifford and Jeannie Paterson note, ‘as it currently stands, the Privacy Act has little to say about consent’.[95] Thus, absent further legislative change, the proposed right is reformative only in the strictest sense, because it will essentially give rise to a ‘decision rule of near universal non-intervention’.[96] The Law Council has gone so far as to suggest that if the right is only to apply to personal information collected on the basis of the individual’s consent, then ‘substantial questions about the role of consent as part of a broader regulatory structure need to be addressed’.[97] The next section attempts to define these questions and to consider the legislative reforms that may reasonably be proffered in response.

B Reform Implications for Managing Consent Under the Privacy Act 1988 (Cth)

Clifford and Paterson argue that the current definition of consent as ‘express or implied consent’ imports the ‘formal’ approach of contract law and provides a ‘low threshold’.[98] Against this legislative framework, the expansion of ‘clickwrap agreements, take-it-or-leave-it terms, and bundling of consents’ increasingly degrades the ‘legitimacy of user participation’.[99] The ACCC has again looked to the GDPR for a solution. It has suggested that consent be defined consistently with article 4(11) of the GDPR, to mean ‘a clear affirmative act that is freely given, specific, unambiguous and informed (including about the consequences of providing or withholding consent)’.[100] This has been termed a ‘qualitative standard’ that promotes control, awareness and choice on the part of the consumer.[101] However, pursuant to APP 6, entities are not required to obtain consent when personal information is being collected for a ‘primary purpose’.[102] This refers to the ‘particular purpose for which the information was collected’.[103] APP entities have significant discretion in defining the primary purpose of collection in their privacy policies.[104] As was mentioned in Part II, internationally, proposed use is becoming ‘abstract, distant and uncertain’.[105] If much of the information collected about individuals is not subject to a requirement of obtaining express or implied consent, then clearly this has implications for what information the ALRC proposal would apply to.

The ACCC has recommended that the primary purpose exception be replaced with a general requirement for consent, with more limited exceptions for data collected for the purposes of performing a contract, or pursuant to the operation of law.[106] It is worth asking whether this bifurcation between personal information collected for contractual and non-contractual purposes imposes a gratuitous complexity. Clifford and Paterson argue that the ‘delineation between contractual and non-contractual uses is not always clear’.[107] However, requiring consent for both contractual and non-contractual collection is likely to increase consent fatigue, compounded by individuals’ ‘limits in time and experience in reading terms with legal import’.[108] Here, layered disclosure emerges as a popular concept amongst commentators. This refers to multi-layered and potentially multimodal consent notices, in which the first layer provides a concise overview.[109] Detail increases with each layer.[110] The first layer could take the form of a concise ‘audio and[/or] video’ version, but it would need to include an outline of the ‘proposed uses...on-sharing...and how that data will be monetised’.[111] One way to implement layered notices is through a Privacy Code for digital platforms that are APP entities,[112] and in the interests of enforcement this should take the form of secondary legislation rather than merely a best practice pledge. The Code would need to be supplemented by a requirement, similar to that contained in article 25 of the GDPR, providing for ‘data protection by default and by design’ (emphasis added).[113] This is because ‘so-called dark patterns’, which manipulate the user interface to prompt particular responses from the consumer, are increasingly prevalent.[114] However, it is unlikely that any of these reforms will overcome the information asymmetries experienced by vulnerable groups, who are at risk of not ‘fully appreciating what providing consent means’.[115] California has sought to provide for the protection of children, as one such vulnerable group, by conferring a GDPR-styled RtBF on minors.[116] To this extent, the reforms discussed in Part III are still relevant to proponents of the ALRC model. In particular, reform to the objects clause of the Privacy Act to prompt consideration of international obligations remains pertinent. In the context of children, Australia ratified the Convention on the Rights of the Child in December 1990, which enshrines a child’s right to development.[117] As Chief Justice Bathurst has observed extra-curially, privacy provides ‘space within which to develop...identity’.[118] Notwithstanding the continuing need for a GDPR-styled RtBF in respect of vulnerable groups more broadly, if the ALRC model were the rule rather than the exception, it would likely impose a lower, but nevertheless significant, administrative cost on the OAIC because the right is narrower in scope. However, to have any real meaning, the proposal requires widespread reform to the way consent is conceptualised under the Act, and is thus minimalist only in relative terms.

V CONCLUSION

The ACCC’s proposal for a RtBF significantly extends the privacy protections afforded to individuals under APPs 13 and 11.2. However, it involves a balancing of the right to freedom of expression with the right to privacy, where freedom of expression is not expressly protected as a personal right under Australian law. Fidelity to the proposal therefore requires a series of reforms, including the introduction of an impartial body (such as the OAIC) to act as a centralised decision maker or in conjunction with courts to provide a filtering-out system. Reform to the objects clause reorientates the overarching spirit in which the Privacy Act should be applied, to encourage a broader consideration of Australia’s international human rights obligations, including freedom of expression. The proposed definitional changes to the journalism exemption allow for an assessment of public interest at the time of the request (cf publication), consistent with judicial precedent from the CJEU. However, such changes are perhaps the most controversial reform option being contemplated by the government in relation to that provision, from the perspective of vested interest groups. Whilst the consent-based model has been a popular footnote in industry submissions, in practice it necessitates comprehensive changes to consent under the Act. As the reformative potential of these changes is limited, an ACCC-styled RtBF in favour of vulnerable groups may still be required, which in turn compels consideration of each of the reforms discussed in Part III. Thus, this article has sought to show that it is redundant to debate the merits of a RtBF per se, because when viewed in isolation, it is a hollow right. The content of the right is dependent upon its interaction with other provisions that may at first glance seem unrelated. Reform to these provisions must therefore take logical precedence in terms of public policy. Only then will it be clear what a RtBF means. After all, it takes time to forgive and forget.


* BA (High Distinction) UWA. Juris Doctor student at UNSW.

[1] Jef Ausloos, The Right to Erasure in European Union Data Protection Law (Oxford University Press, 2020) 109.

[2] Australian Competition and Consumer Commission, Digital Platforms Inquiry: Final Report (Report, June 2019) 470, recommendation 16(d); Joanna Connolly, ‘The Right to Erasure: Comparative Perspectives on an Emerging Privacy Right’ (2021) 46(1) Alternative Law Journal 58, 60.

[3] Regulation (EU) No 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons With Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC [2016] OJ L 119, art 17 (GDPR); Jarrod Bayliss-McCulloch, ‘Does Australia Need a Right to Be Forgotten?’ (2014) 33(1) Communications Law Bulletin 7, 9.

[4] Cécile de Terwangne, ‘The Right to be Forgotten and Informational Autonomy in the Digital Environment’ in Alessia Ghezzi, Ângela Guimarães Pereira and Lucia Vesnić-Alujević (eds), The Ethics of Memory in a Digital Age (Palgrave Macmillan, 2014) 82, 88.

[5] Privacy Act 1988 (Cth) sch 1 (‘Australian Privacy Principles’); Bayliss-McCulloch (n 3) 8–9.

[6] Australian Government Attorney-General’s Department, Privacy Act Review (Issues Paper, October 2020) 2, 4, 11, 51–53 (‘Issues Paper’).

[7] Joshua Sinn, ‘Managing Nascent Digital Competition’ [2021] UNSWLawJl 33; (2021) 44(3) University of New South Wales Law Journal 919, 947.

[8] Privacy Act 1988 (Cth) s 2A(h) (‘Privacy Act’); International Covenant on Civil and Political Rights, opened for signature 19 December 1966, 999 UNTS 171 (entered into force 23 March 1976) art 17 (‘ICCPR’); Anna von Dietze and Anne-Marie Allgrove, ‘Australian Privacy Reforms – An Overhauled Data Protection Regime for Australia’ (2014) 4(4) International Data Privacy Law 326, 327.

[9] Australian Privacy Principles (n 5); Niloufer Selvadurai, ‘Protecting Online Information Privacy in a Converged Digital Environment – The Merits of the New Australian Privacy Principles’ (2013) 22(3) Information and Communications Technology Law 299, 303.

[10] Privacy Act (n 8) s 6.

[11] Ibid.

[12] Law Council of Australia, ‘Privacy Act Review’ (Discussion Paper, 27 January 2022) 16 (‘Discussion Paper’).

[13] Ibid; Australian Privacy Principles (n 5) pt 5 s 13.1(a)-(b).

[14] Discussion Paper (n 12); Australian Privacy Principles (n 5) pt 4 s 11.2(a)-(d).

[15] Nine News, Submission to Attorney-General’s Department, Privacy Act Review (10 December 2020) para 9.

[16] Selvadurai (n 9) 304, 307–308.

[17] Ibid.

[18] Australian Privacy Principles (n 5) pt 5 ss 13.1, 13.4–13.5, 12.1.

[19] Ausloos (n 1) 90–91, 104–105.

[20] Ibid 105.

[21] Bayliss-McCulloch (n 3) 8–9.

[22] Australian Competition and Consumer Commission (n 2).

[23] Ibid 471; Connolly (n 2) 59.

[24] Google Spain SL v Agencia Española de Protección de Datos (Court of Justice of the European Union, Case C-131/12, 13 May 2014) (‘Google Spain’).

[25] Jean-Marie Chenou and Roxana Radu, ‘The “Right to Be Forgotten”: Negotiating Public and Private Ordering in the European Union’ (2019) 58(1) Business and Society 74, 76.

[26] Ibid; Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals With Regard to the Processing of Personal Data and on the Free Movement of Such Data [1995] OJ L281/31 (‘Directive 95/46’).

[27] Google Spain (n 24) [16], [41], [59]–[60], [80], [98].

[28] Ibid [21].

[29] Ibid [94]; Directive 95/46 (n 26).

[30] Google Spain (n 24) [81].

[31] Ibid [3], [9]–[10], [69]; Sabine Jacques and Felix Hempel, ‘The Right to Be Forgotten in the UK: A Fragile Balance?’ in Frank Werro (ed), The Right to Be Forgotten: A Comparative Study of the Emergent Right's Evolution and Application in Europe, the Americas, and Asia (Springer, 2020) 195, 207; Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, 213 UNTS 221 (entered into force 3 September 1953), arts 8(1), 10(1) (‘ECHR’); Charter of Fundamental Rights of the European Union [2000] OJ C 364/1, art 8(1); De Terwangne (n 4) 86.

[32] Jacques and Hempel (n 31) 200.

[33] Connolly (n 2) 59; GDPR (n 3) arts 17(1)(c), 21(1).

[34] GDPR (n 3) art 17(1).

[35] Connolly (n 2) 59.

[36] De Terwangne (n 4) 90.

[37] Edward George, ‘The Pursuit of Happiness in the Digital Age: Using Bankruptcy and Copyright Law as a Blueprint for Implementing the Right to Be Forgotten in the United States’ (2018) 106(3) Georgetown Law Journal 905, 921.

[38] GDPR (n 3) art 17(3)(b), (d); Issues Paper (n 6) 52.

[39] Australian Competition and Consumer Commission (n 2); Issues Paper (n 6) 51.

[40] Jacques and Hempel (n 31) 196; NT1 and NT2 v Google and The Information Commissioner [2018] EWHC 799 (QB) (‘NT1 and NT2’).

[41] Jacques and Hempel (n 31) 213.

[42] NT1 and NT2 (n 40) [132] (Justice Warby).

[43] Ibid [140]; Jacques and Hempel (n 31) 215.

[44] Jacques and Hempel (n 31) 215.

[45] NT1 and NT2 (n 40) [170].

[46] Jacques and Hempel (n 31) 216.

[47] Ibid 215; NT1 and NT2 (n 40) [151].

[48] Jacques and Hempel (n 31) 215; NT1 and NT2 (n 40) [147].

[49] NT1 and NT2 (n 40) [203].

[50] Ibid [221]-[223].

[51] Jacques and Hempel (n 31) 217.

[52] Nationwide News Pty Ltd v Wills [1992] HCA 46; (1992) 177 CLR 1; Australian Capital Television Pty Ltd v The Commonwealth (1992) 177 CLR 106.

[53] Chenou and Radu (n 25) 93.

[54] Connolly (n 2) 62.

[55] Ibid.

[56] Chenou and Radu (n 25) 78, 85.

[57] Privacy Act (n 8) ss 36, 40(1A).

[58] Ibid.

[59] Chenou and Radu (n 25) 86.

[60] Google Australia, Submission to Attorney-General’s Department, Privacy Act Review (29 November 2020) 9-10.

[61] Chenou and Radu (n 25) 93-94.

[62] Michel José Reymond, ‘The Future of the European Union Right to Be Forgotten’ (2019) 2 Latin American Law Review 81, 94.

[63] Office of the Australian Information Commissioner, Your Complaint Review Rights (Web Page) <https://www.oaic.gov.au/privacy/privacy-complaints/your-complaint-review-rights>.

[64] Chenou and Radu (n 25) 82; Reymond (n 62).

[65] Reymond (n 62).

[66] Ibid.

[67] Connolly (n 2) 62; Privacy Act (n 8) ss 13(1)(a), 13G; Crimes Act 1914 (Cth) s 4AA.

[68] Chenou and Radu (n 25) 96.

[69] Human Rights Act 1998 (UK) sch 1; Jacques and Hempel (n 31) 198.

[70] Queensland Office of the Information Commissioner, Submission to Attorney-General’s Department, Privacy Act Review (10 January 2022) 3.

[71] Chief Justice Bathurst, ‘A Comparative Perspective on Privacy Law: The Australian Experience’ (Keynote Address, Indian National Bar Association Annual Conference, 25 November 2017) 5–6.

[72] Ibid.

[73] George (n 37) 921.

[74] Privacy Act (n 8) ss 2A(a)-(b).

[75] Issues Paper (n 6) 8, 15.

[76] Centre for Media Transition, Submission to Attorney-General’s Department, Privacy Act Review (28 November 2020) 5.

[77] Ibid.

[78] ICCPR (n 8) art 19(2); Azadeh Dastyari, ‘Vitalising International Human Rights Law as Legal Authority’ [2020] UNSWLawJl 30; (2020) 43(3) University of New South Wales Law Journal 827, 840.

[79] Dastyari (n 78) 844.

[80] Privacy Act (n 8) s 7B(4); Issues Paper (n 6) 35.

[81] Privacy Act (n 8) s 7B(4).

[82] Jacques and Hempel (n 31) 206.

[83] Ibid; Connolly (n 2) 62; Magyar Helsinki Bizottság v Hungary (European Court of Human Rights, Application No 18030/11, 8 November 2016) [168].

[84] Centre for Media Transition Submission (n 76) 12–13.

[85] Nine News Submission (n 15) [2].

[86] Australian Law Reform Commission, For Your Information: Australian Privacy Law and Practice (Report No 108, May 2008) vol 2, 1452, recommendation 42-1.

[87] Peter Greste, ‘Define Journalism; Not Journalists’ (Reform Briefing, Press Freedom Policy Papers, University of Queensland, March 2021) 2.

[88] Nine News Submission (n 15) [2].

[89] Australian Law Reform Commission, ‘Serious Invasions of Privacy in the Digital Era’ (Discussion Paper, March 2014) 223 (‘ALRC Discussion Paper’).

[90] Ibid.

[91] Bayliss-McCulloch (n 3) 9.

[92] Ibid; ALRC Discussion Paper (n 89).

[93] Bayliss-McCulloch (n 3) 9.

[94] Issues Paper (n 6) 52; Centre for Media Transition Submission (n 76) 19.

[95] Damian Clifford and Jeannie Paterson, ‘Consumer Privacy and Consent: Reform in the Light of Contract and Consumer Protection Law’ (2020) 94(10) Australian Law Journal 741.

[96] Sinn (n 7).

[97] Discussion Paper (n 12) 16.

[98] Clifford and Paterson (n 95) 743–746.

[99] Ibid 746; Australian Competition and Consumer Commission (n 2) 25.

[100] Australian Competition and Consumer Commission (n 2) 35; GDPR (n 3) art 4(11); Clifford and Paterson (n 95) 746.

[101] Clifford and Paterson (n 95) 746.

[102] Ibid 748; Australian Privacy Principles (n 5) pt 3 s 6.1.

[103] Clifford and Paterson (n 95) 748.

[104] Ibid.

[105] Ausloos (n 1) 104–105.

[106] Australian Competition and Consumer Commission (n 2) 24; Clifford and Paterson (n 95) 745–746, 749.

[107] Clifford and Paterson (n 95) 749.

[108] Ibid 747.

[109] Australian Competition and Consumer Commission (n 2) 36; Centre for Media Transition Submission (n 76) 16.

[110] Australian Competition and Consumer Commission (n 2) 36.

[111] Centre for Media Transition Submission (n 76) 16.

[112] Clifford and Paterson (n 95) 747.

[113] Ibid; GDPR (n 3) art 25(1)-(2).

[114] Clifford and Paterson (n 95) 747.

[115] Law Council of Australia, ‘Review of the Privacy Act 1988 (Cth)’ (Issues Paper, 17 December 2020) 18.

[116] California Business and Professions Code, CAL. BUS. & PROF. CODE § 22581(a)(1).

[117] Anna Bunn, ‘Children and the “Right to Be Forgotten”: What the Right to Erasure Means for European Children, and Why Australian Children Should be Afforded a Similar Right’ (2019) 170(1) Media International Australia 37, 39; Convention on the Rights of the Child, opened for signature 20 November 1989, 1577 UNTS 3 (entered into force 2 September 1990) art 6(1).

[118] Chief Justice Bathurst (n 71) 5.

