
University of New South Wales Law Journal Student Series



Lim, Dal --- "Smart Toys, Lagging Governments: A Critical Comparative Analysis Of The United States And Australian Governments' Responses To Smart Toys" [2019] UNSWLawJlStuS 8; (2019) UNSWLJ Student Series No 19-08


SMART TOYS, LAGGING GOVERNMENTS: A CRITICAL COMPARATIVE ANALYSIS OF THE UNITED STATES AND AUSTRALIAN GOVERNMENTS’ RESPONSES TO SMART TOYS

DAL LIM

Technological advancements have affected and permeated every aspect of society, even transforming children’s toys from simple physical objects to advanced devices with internet connectivity and embedded with sophisticated software and sensors. Toys with such capabilities are widely referred to as “smart toys”. The term is presumably derived from “smart device”, which refers to an electronic gadget capable of connecting to other networks or devices to remotely interact and share data, often autonomously; one only needs to consider smartphones, which have become ubiquitous. Smart toys are becoming increasingly popular, yet as with other smart devices, there are serious concerns about privacy, which has been subject to continuous erosion as technology develops. Genuine risks to the personal safety of children arise from poor data safeguards, third-party data sharing and personal information collection, location tracking, audio and video recording, and interactive communication. Troubling personal safety implications arise from the proven ‘hackability’ of smart toys: children’s tendency to trust their toys means they could follow a toy into danger or be psychologically harmed should a malicious hacker infiltrate the toy. Recent smart toy hackings affecting millions of children, such as those involving VTech and Hello Barbie, highlight real privacy concerns, as does German regulators’ classification of the interactive doll ‘My Friend Cayla’ as an illegal espionage device due to its inconspicuous surveillance capabilities.

Interference with the privacy of children is particularly disquieting due to their inherent innocence, immaturity and vulnerability, which is legally recognised through the enactment of protective laws worldwide recognising their legal incapacities. Children are also recognised as having their own rights in international law, as confirmed by the widespread ratification of the United Nations Convention on the Rights of the Child. Despite extensive recognition of the need for children to have special protections, governments worldwide have largely been inert in regulating smart devices to ensure that companies maintain adequate cybersecurity and limit and disclose their collection of personal information. Whilst adults have the capacity to understand privacy and consent to disclosure of their personal information, smart toys aimed at children require closer scrutiny due to children’s vulnerability and their legal incapacity to provide consent. Fears about stifling technological development and innovation are no longer satisfactory reasons for legislators to refrain from regulating technology from a children’s rights perspective. Globalisation means that smart toys can be quickly and easily distributed worldwide, and regulators must adopt a proactive approach to force smart toy companies to comply with children’s recognised privacy rights before toys become available to consumers.

I INTRODUCTION

This essay will examine the regulation and emergence of smart toys in the United States and Australia to argue that children’s online privacy requires stronger enforcement mechanisms and legislative protections. The US has successfully prosecuted smart toy companies under legislation protecting children’s online privacy, whereas such action is not possible in Australia. Contrasting the approaches to smart toy regulation of two common law jurisdictions highlights Australia’s failure to adhere to its international legal commitments of protecting children’s privacy and safety. The US’s proactive approach has not stifled technological innovation, and Australia should learn from the US’s prioritisation of children’s rights over business growth.

A definition of smart toys will first be provided, along with risks arising from their use and the beneficial features that are driving their continued growth. Children’s legal right to privacy under international conventions will then be established, and legislation in the US and Australia reflecting fulfilment of this obligation, or adherence to constitutional rights of privacy, will be critically examined. Having established this, existing smart toy regulation in the US and Australia through legislation, cases, fining power, and other mechanisms will be scrutinised. This will involve consideration of threats-based approaches, through guidance documents and warnings issued by relevant regulatory agencies in each country, compared to judicial action. Finally, arguments in favour of stricter regulation through technology-specific laws and the precautionary principle for smart toys will be presented, where active risk management through laws can be justified by the risks and dangers associated with smart toys.

II SMART TOYS: DEFINITION, RISKS AND BENEFITS, GROWTH

This section will establish a working definition of ‘smart toys’ and set out key risks and benefits, as well as examine the rapid growth of the industry. Several high-profile cases have highlighted how smart toys with poor security safeguards and problematic privacy features can endanger children, yet consumers are often inadequately informed about these dangers. Research shows that even shrewd consumers aware of possible risks give greater weight to the perceived benefits of educational and entertainment features. The strong growth of the smart toy market worldwide despite documented dangers reinforces this essay’s argument that greater scrutiny and regulation of these devices is necessary.

A Definition: Smart Toys

Children’s toys have been an intrinsic part of human experience for as long as humans have existed and can simply be understood as objects for play or education. As humans have developed, so too have toys: technological developments have advanced mere plush toys into devices with internet capabilities and sensors known as ‘smart toys’. Smart toys can be characterised as an ‘emerging’ field,[1] and fall under the purview of the internet of things (‘IoT’), which is defined as: ‘[E]veryday things, objects and devices that are connected to the internet... their connection to the Internet through sensors to record, process, store and transfer data, whether they communicate between themselves, with computers or with people.’[2]

As a subset of IoT, smart toys can contain sensors and have artificial intelligence capabilities, and ‘frequently have internet connectivity – directly or indirectly through companion apps – and collect information about their users and environments.’[3] Microphones, location-tracking, storage devices, cameras and voice recognition technology are also often key features. The constant collection and use of this data through internet connectivity frequently includes the collection and storage of children’s personal information. Three distinguishing characteristics typical of smart toys are: a conventional physical toy with integrated sensors and electronic components enabling network communication; a mobile device providing the physical toy with mobile services; and a mobile application to interact with and control the toy.[4] Technological advancements and artificial intelligence integration are likely to lead to increasingly sophisticated smart toys that can interact without external applications or devices.

B Smart Toy Risks and Benefits

The features of smart toys are undoubtedly exciting for consumers, as entertainment and educational features and the novelty of interactive toys can easily overshadow concerns about privacy policies and security vulnerabilities. Smart toys capable of providing children with new experiences of collaborative play which develop literacy, social and numeric skills and promote active learning are increasingly commonplace.[5] Location tracking can be viewed as advantageous for parents who want to track their children’s whereabouts in real time. Personalisation and individualisation through interactive media that adapts to an individual child’s needs and progress is undeniably appealing, ‘giving students choice in the pace, place, and mode of their learning’.[6] Research has found parents are more likely to think the benefits of smart toys outweigh the harms,[7] yet parents may not be the best decision-makers due to their own misunderstandings or lack of knowledge about smart toy privacy issues.[8]

The rapid growth of smart toys is concerning, where a recent spate of high-profile cases has highlighted inadequate security measures which can lead, and have led, to grave privacy leaks. Recent cases have included: ‘Rapid7’, where baby monitors could be hacked to view video and speak to children;[9] ‘CloudPets’, a teddy bear through which hackers could directly speak to children and access voice records;[10] MiSafes’ children’s smartwatch, which a security researcher found ‘easy to hack’ due to a lack of data encryption, enabling location-tracking, incognito eavesdropping and spoof calls to the watch;[11] and ‘Hello Barbie’, which automatically connected to unsecured Wi-Fi networks, thus enabling attackers to listen in on conversations.[12]

These cases highlight the genuine safety concerns arising from children’s inherent vulnerability, immaturity, innocence, and diminished capacity to make decisions and recognise danger.[13] It is common practice for companies to collect personal data and share it with third parties such as advertisers, yet children are a vulnerable population ignorant of invasions of privacy and the effects of targeted marketing.[14] Children are incapable of understanding the full extent of the risks of sharing their data and are consequently unable to provide meaningful consent. The collection and storage of personal data, which can include the child’s name, photos, videos, and voice recordings, can also intrude on children’s privacy. The normalisation and permeation of smart devices indicate that privacy is generally being eroded. A recent example is the exposé that Amazon, Apple and Google employ staff who listen to customer voice recordings from their smart speakers and apps to improve speech recognition.[15] This revelation is particularly disquieting given that many customers are unaware humans may be listening, and children are especially prone to disclosing personal information to smart devices that could be misused.

Consumers are inadequately informed of business practices governing the collection of their personal information, and the environment of smart toys has raised several legal issues regarding cybersecurity, the use and release of data and information, and obtaining meaningful consent from child consumers. The nature of toys themselves gives rise to greater risks, as research has found children tend to harbour trust in and attachment to their smart toys, and this trust can be exploited.[16] Global laws are largely failing to focus on prevention, and the veritable flood of unregulated, cheap smart toys with security susceptibilities has already impacted millions of users worldwide. Four key ways smart toys endanger children are: unsecured wireless connections; tracking children’s movements; poor data protections; and third-party data sharing and usage.[17] Private organisations and security researchers have been at the forefront of testing and identifying smart toys with problematic privacy features and policies, such as Mozilla’s buyers’ guide to children’s toys, which rates their privacy and security standards.[18]

There are numerous devices with similar issues and security flaws, and these vulnerabilities have predominantly been exposed by security researchers, prompting public outcry and extensive media coverage.[19] Grave possibilities arise from these flaws, such as immediate personal danger if a child’s location can be tracked, the sharing of identifying and confidential personal information, a hacker using the toy to lead children into physical danger, and psychological harm if a toy is maliciously hacked to generate violent or disturbing audio or video content, with a plethora of other plausible, dire outcomes. Children’s safety is clearly at risk, and regulation can promote strong data and privacy protections by mandating the adequate handling and safeguarding of personal information.

C Smart Toys’ Growth and Dominance of the Toy Market

Smart toys are becoming increasingly common in households with children, and there are predictions the smart toy market will reach US$69.16 billion by 2026.[20] More than one third of U.S. homes with children currently have at least one IoT toy,[21] with North America leading the smart toy market in 2017.[22] North America is expected to remain dominant, followed by Europe and the Asia Pacific in years to come.[23] This is attributable to the growing adoption of smart technologies in North America, where the connected toy market is at an ‘emerging’ stage and expected to rapidly expand worldwide in the coming years.[24]

This includes connected devices which are not toys in the traditional sense but are intended for children’s use, such as smartwatches and Amazon’s Echo Dot, a smart-home device aimed at children.[25] The increasing number of internet users, along with the growing popularity of IoT, is generally stimulating awareness and growth of the smart toy market worldwide.[26] The internet and IoT are dynamically changing all areas of life, and being constantly connected and monitored is rapidly becoming the new norm. This shift is being forced on consumers: companies that do disclose their data collection practices commonly restrict or ban the ‘user experience’ of consumers who opt out of data collection processes.[27] For example, users are blocked from accessing the companion app of the doll ‘Hello Barbie’ unless all terms and conditions are agreed to.[28] These terms and conditions include voice data capture and storage. The growth in popularity and production of smart toys strengthens the argument that regulation is necessary to ensure children can only interact with secure, privacy-compliant devices.

III CHILDREN’S PRIVACY RIGHTS

Intrusion into children’s privacy is central to arguments against smart toys which collect personal information and have inadequate safeguards. How privacy is legally defined will be established, where it is broadly recognised as a human right. Widely ratified treaties acknowledge both the right to privacy and, specifically, children’s privacy rights. This section will also compare the recognition of privacy rights in international law to national legislative, constitutional or regulatory protections in the US and Australia. This comparison exposes the shortcomings of national efforts to comply with treaty obligations and the lack of privacy protections Australians have compared to US citizens.

A Definition: Privacy

The right to privacy is recognised internationally as a fundamental human right, where the International Covenant on Civil and Political Rights, ratified by both the US and Australia, provides: ‘No one shall be subjected to arbitrary or unlawful interferences with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.’[29] Privacy is important for protecting vulnerable individuals from undue interference and for safeguarding personal freedom and safety from harassment.[30]

Technology, the internet, social media and now smart toys have distorted perceptions of privacy, with some commentators suggesting that the internet is limiting the law’s ability to protect privacy, and that ‘privacy is dead’.[31] Parents themselves are often complicit in infringing their children’s privacy by sharing images of their children on social media and purchasing smart toys for their children without understanding the privacy infringements involved. The recent influx of highly publicised and large-scale data breaches affecting tens of millions of users, such as Facebook[32] and Cambridge Analytica,[33] supports such arguments, particularly given the power and impact of such technology organisations in everyday life.

B Recognition of Children’s Privacy Rights in International Law

Children’s privacy rights are recognised in international law. The United Nations Convention on the Rights of the Child sets out the civil, political, economic, social and cultural rights children are entitled to, and echoes the ICCPR in identifying privacy as a right.[34] Article 16 of the UNCRC states: ‘No child shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence, nor to unlawful attacks on his or her honour and reputation.’ The UNCRC is the most widely ratified international human rights treaty in the world, and all UN Member States have ratified the Convention except for the United States, which is a signatory.

Whereas ratification indicates a State’s consent to be bound to a treaty and imposes obligations on national governments to enact the necessary legislation to give domestic effect to the treaty,[35] a signature does not establish consent to be bound, although States are obliged to act in good faith by refraining from acts which would defeat the treaty’s purpose.[36] The rapid adoption of the internet and IoT is distorting traditional perceptions of privacy, but States such as Australia which have ratified the UNCRC risk breaching their treaty obligations by failing to adapt their legislation accordingly. Although the US has not ratified the treaty, there is a constitutional right to privacy in the US. The following sections will examine the privacy laws giving effect to international obligations for Australia or constitutional rights in the US.

C The Right to Privacy in the United States of America

The right to privacy in the US is mainly concerned with government intervention, and the Supreme Court has held that individuals have a ‘reasonable’ right to privacy protected under the Fourth Amendment.[37] However, children’s online privacy is specifically recognised in the US under the Children’s Online Privacy Protection Act (COPPA).[38]

1 Legislation: COPPA and the COPPA Rule

The primary regulator of children’s online privacy and security in the US is the Federal Trade Commission (FTC), which is responsible for issuing regulations and enforcing COPPA. Congress enacted COPPA in 1998, prohibiting unfair or deceptive acts or practices in connection with the collection, use or disclosure of personal information from and about children online.[39] In accordance with its obligations to issue regulations giving effect to COPPA,[40] the FTC enacted the COPPA Rule.[41] The COPPA Rule clearly sets out what operators of websites and online services targeting children that utilise personal information must do to protect children’s online privacy and safety.[42]

The COPPA Rule requires that companies collecting personal information from children, who are defined as individuals under thirteen years of age,[43] follow steps to ensure that their information is protected.[44] This includes clearly disclosing to parents the information they collect and how it will be used, and obtaining verifiable parental consent.[45] Operators are obliged to provide notice of their information practices and privacy policies, which must be clearly and completely written.[46] Companies must also take reasonable steps to protect the confidentiality, security and integrity of the personal information they collect about children.[47] Parents have the right to review personal information provided by their children,[48] and data can only be retained for a ‘reasonably necessary’ time and must be protected against unauthorised access.[49]

COPPA and its regulations through the COPPA Rule have been in place for over 20 years, and several amendments in recent years have modernised the legislation to accommodate new technologies. These amendments have included updating the definition of personal information to include geo-location information, photos, videos and audio files.[50] The FTC has actively enforced COPPA, bringing 31 actions against operators for non-compliance with COPPA since its enactment[51] and issuing hefty fines, where civil penalties of up to US$42,530 per violation can be imposed.[52] COPPA has been criticised as ineffective for merely encouraging age fraud[53] and as violating children’s First Amendment rights to freedom of speech and self-expression.[54] However, COPPA is an impressive model for protecting children’s privacy in recognising internet usage risks, and the FTC has been vigilant with enforcement.

Further privacy protections are suggested by the Fourth Amendment to the US Constitution, which protects the: ‘right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures’.[55] Although this is primarily concerned with unlawful government searches, it has been interpreted as giving rise to an objective test on whether an individual has a reasonable expectation of privacy.[56] Justice Harlan of the US Supreme Court stated: ‘a man’s home is, for most purposes, a place where he expects privacy’,[57] and the use of toys primarily in private homes supports the argument for smart toy privacy regulations. However, COPPA and the COPPA Rule perhaps provide more protections by specifically recognising and protecting children’s right to privacy, although the constitutional right to privacy is also noteworthy. The US approach to protecting children’s privacy through targeted legislative recognition of children’s online privacy rights marks the US as a leading nation in this area, where few other countries appear to be cognizant of the genuine threat the internet poses to privacy.

D Australia

1 Legislation: Privacy Act 1988 (Cth) and ‘Australian Privacy Principles’

Australia’s key information privacy law, the Privacy Act 1988 (Cth), is concerned with the security of personal information held by certain entities rather than privacy generally.[58] An object of the Act is to promote the protection of the privacy of individuals,[59] yet this does not establish a general right to privacy. Children are not distinguished or mentioned in the Privacy Act. Companies must comply with the 13 Australian Privacy Principles (APPs) set out in the Act,[60] and APP entities are obliged to ‘manage personal information in an open and transparent way.’[61] The APPs outline how companies must handle, use and manage personal information. A breach of an APP is deemed to be an interference with an individual’s privacy,[62] and breaching entities face penalties of up to AU$420,000,[63] or five times that amount for body corporates.[64]

The Office of the Australian Information Commissioner (OAIC) is primarily responsible for handling complaints where there has been a suspected APP breach and can initiate investigations.[65] The OAIC can make determinations on privacy complaints[66] and in practice, monetary fines have never exceeded $20,000,[67] which is underwhelming compared to the FTC’s penalties. Large corporations are unlikely to be deterred by such small penalties.

Another serious flaw is that an APP entity is defined as an agency or organisation,[68] which does not include ‘small business operators’, or businesses with an annual turnover of AU$3 million or less.[69] Small businesses can consequently misuse personal information and breach the APPs with impunity, as the deterrent penalties are unenforceable against them and APP breaches are in effect allowed. A business guideline for start-up businesses on the OAIC website even states: ‘If your start-up is a small business (turnover of $3 million or less per annum) the Privacy Act may not apply to you yet. But ask yourself; do you plan for your business to stay small, or to grow?’[70] The Australian government’s fears about stifling technological innovation are evidently sacrificing individuals’ privacy and personal information, as this exemption prioritises the interests of entities in carrying out their functions over the privacy of individuals. This legislative loophole is contrary to the object of balancing entities’ interests with individuals’ privacy.[71]

The Privacy Act fails to comprehensively protect the privacy of Australians and Australian children, and there is no mention or recognition of the internet or online services. The definition of personal information is limited to information or opinion about an identified individual,[72] which, unlike COPPA, does not extend to photos, videos and other media. Failure to recognise children’s privacy rights in accordance with obligations created by Australia’s ratification of the UNCRC could result in countermeasures under the law of international responsibility.[73] Compared to COPPA, Australia’s children’s privacy protections are evidently lacking.

IV SMART TOY REGULATION IN THE UNITED STATES AND AUSTRALIA

Comparing smart toy regulations in the US and Australia highlights Australia’s regulatory inertia and failure to recognise the serious risks posed by smart toys. Whilst the United States has adopted specific legislation addressing the protection of children’s online privacy which has enabled breaching smart toy companies to be prosecuted, Australia has largely been inactive.

A Smart Toy Regulation in the US

1 COPPA Case: USA v VTech[74]

The recent case of USA v VTech involved electronic toymaker VTech and was the FTC’s first children’s privacy and security case specifically involving smart toys. The case demonstrates the effectiveness of the COPPA Rule due to its applicability to smart toys.

In this case, a major data breach occurred in 2015 involving the unauthorised access of VTech’s customer data database.[75] Personal information such as names, passwords and email addresses was stored on the database, along with the genders and birthdates of children.[76] VTech disclosed that a staggering 4,863,209 customer (parent) accounts and 6,368,509 related children’s profiles were affected.[77] Internet security expert Troy Hunt found there was a lack of cryptographic protection and a ‘total lack of care shown by VTech in securing this data’, where the exposed data could allow someone to link a child to their parent and locate the child’s physical address.[78] The District Court held that VTech violated the COPPA Rule and imposed a significant fine and a number of other orders.

The breach was one of many high-profile data breaches, alongside 2018 incidents involving Facebook, Google, and Marriott Starwood Hotels, reinforcing the large-scale impact of online privacy transgressions.[79] In this instance, the hacker was not malicious and in fact notified Motherboard, a multiplatform media publication, which then notified VTech of the unauthorised access and security flaws.[80] The hacker told Motherboard: ‘Frankly, it makes me sick that I was able to get all this stuff.’[81] Had the hacker had nefarious intentions, VTech clearly lacked the necessary security measures to even be aware of the breach, highlighting the aforementioned safety concerns for the children on the customer database.

Whilst investigating, the FTC found VTech’s data collection methods through their ‘Kid Connect’ app breached the COPPA Rule. Through ‘Kid Connect’, VTech gathered personal information without obtaining verifiable parental consent or informing users of the type of data being collected and how it would be used.[82] The District Court identified four key COPPA breaches: failing to make reasonable efforts to ensure parents received direct notice of VTech’s personal information practices;[83] failing to provide a ‘prominent and clearly labeled link to an online notice of its information practices with regard to children’ on any of their websites and online services; failing to ‘obtain verifiable parental consent’;[84] and failing to implement and ‘maintain reasonable procedures to protect the confidentiality, security, and integrity’ of the personal information and data they collected.[85]

VTech was fined $650,000 for the breaches,[86] and an injunction was imposed permanently preventing VTech from misrepresenting their data security and privacy practices and violating COPPA.[87] The Court also ordered VTech to establish, implement and maintain a ‘comprehensive information security program’,[88] and to conduct a risk assessment involving employee training and management, and prevention and detection of system attacks, among other things.[89] Biennial independent assessments of their data security programs were ordered for the next twenty years, to be obtained from a ‘qualified, objective, independent third-party professional’ and submitted to the FTC to ensure their continued adequacy.[90]

The VTech outcome shows that the US is taking children’s online privacy seriously and that COPPA can flexibly adapt to new technologies and devices such as smart toys. Additionally, on 27 February 2019 the FTC obtained its largest civil penalty of US$5.7 million for COPPA violations from video social networking app ‘Musical.ly’ for its failure to seek parental consent before collecting children’s personal information.[91] FTC Chairman Joe Simons stated: ‘This record penalty should be a reminder to all online services and websites that target children [...] we will not tolerate companies that flagrantly ignore the law.’[92] FTC Commissioners have indicated that ‘growth even at the expense of endangering children’ is ‘egregious conduct’ and the FTC intends to begin investigating and charging individuals responsible for decision-making.[93] The FTC’s approach can be described as proactive and precautionary, based on the US’s recognition of the real risks online activity can pose to children’s safety,[94] and COPPA can evidently protect children from the real dangers of smart toys.

2 Threats-Based Approaches: Guidance Letters, Compliance Plans and FBI Public Service Announcements

The VTech case was decided over four years after the 2013 COPPA amendments, during which time the FTC issued guidance letters, media releases and compliance plans, and the Federal Bureau of Investigation (FBI) made a Public Service Announcement (PSA). Tim Wu has argued that agencies regulating disruptive innovations such as smart toys should adopt a threats-based approach by issuing guidance documents and avoid issuing specific laws for areas of ‘high uncertainty’ such as newly invented technologies.[95] Wu argues a flexible ‘threats regime’ is better suited for dynamic industries, yet Cortez’s argument that regulatory threats are only suitable as ‘temporary stopgap[s]’ rather than long-term strategies is supported by the FTC’s response to smart toys.[96]

Although the COPPA Rule was amended proactively, contrary to Wu’s ideal post-threat creation of laws, the FTC did not enforce the COPPA Rule against smart toy companies until VTech. This lack of enforcement allows the FTC’s treatment of smart toys to be analysed with respect to Wu’s threats-based framework, which endorses the use of threats prior to, and to inform, lawmaking.[97] VTech was objectively given enough notice, or ‘threats’, and formal enforcement action by the FTC has both punished VTech’s non-compliance with COPPA and ensured it takes appropriate action to guarantee ongoing compliance. Large civil penalties are an effective punishment and deterrent of unlawful conduct, and mandatory security programs as ordered by the court are more effective than a solely threats-based approach.

This essay has established that, as a subset of IoT, smart toys are an emergent technological category that could be characterised as a dynamic industry due to its disruptive innovation.[98] The FTC adopted a threats-based approach in recognition of the rapid changes to the internet and technology which necessitated the 2013 amendments expanding the definition of ‘personal information’ to include photos, videos, geo-location and audio. Firstly, the FTC sent ‘educational letters’ to more than 90 businesses, both in the US and overseas, that could be affected by changes to the COPPA Rule.[99] These were sent six weeks before the amendments took effect to assist businesses in complying with the updated requirements. The letters were private threats,[100] although the FTC also issued a press release on their distribution.[101]

In June 2017 the FTC published on its website an updated guidance document advising businesses on compliance with the COPPA Rule: ‘The Children’s Online Privacy Protection Rule: A Six-Step Compliance Plan for Your Business’.[102] It is a step-by-step plan assisting companies in determining whether they are covered by COPPA and how to comply with the COPPA Rule, and it specifically states that connected toys and IoT devices are covered.[103] The update to the guidance materials was a clear warning to businesses that COPPA covers emerging IoT categories such as smart toys.

One month after this guidance was published, the FBI’s Internet Crime Complaint Center issued a PSA encouraging ‘consumers to consider cyber security prior to introducing smart, interactive, internet-connected toys into their homes or trusted environments’.[104] The FBI is the domestic national security and intelligence agency in the US, with both intelligence and law enforcement responsibilities.[105] The PSA identified the privacy risks discussed above, as well as other dangers such as child identity fraud and exploitation. It detailed specific issues consumers should be aware of, such as user agreement disclosures, where personal data is sent and stored, and security safeguards, and it provided ten recommendations for parents to consider prior to using internet-connected toys, as well as an online link for filing complaints about children’s toys.[106]

Since COPPA’s creation, the FTC has also authorised ‘safe harbors’ from enforcement action for operators that comply with an FTC-approved COPPA self-regulatory program.[107] There are currently seven ‘safe harbor’ programs operated by private companies that establish COPPA-compliant self-regulatory guidelines.[108] The COPPA Rule enables industry groups to apply for safe harbor status,[109] and safe harbor participants are subject to those guidelines and disciplinary procedures rather than official FTC protocol. The FTC stated the safe harbor option: ‘encourages industry self-regulation, which the Commission believes often can respond more quickly and flexibly than traditional statutory regulation to consumer needs, industry needs, and a dynamic marketplace.’[110] Safe harbors have been criticised for low industry participation and minimal regulatory flexibility,[111] yet they are an innovative mechanism for companies to comply with COPPA and mean more organisations are monitoring and regulating online activity,[112] thus boosting children’s online privacy protection.

VTech was ‘threatened’ in several ways and had the time, options and capability to comply with COPPA, yet it still failed to take adequate steps to protect its child users’ personal information and safety. A solely threats-based approach was clearly ineffective, particularly as COPPA was an enforceable legal regulation that strongly incentivised compliance with the guidance documents. Guidance is unenforceable, and the FTC would not have been able to allege violation of the guidance documents or initiate enforcement actions based solely on them.[113] The FTC attempted to mitigate unfair disturbance to online businesses affected by the 2013 amendments by delaying enforcement and issuing threats, thereby ‘softening’ traditional regulation without undermining it long-term.[114] Enforcing COPPA was ultimately the most effective way to handle VTech’s dubious data protection and collection practices, yet the mixture of a threats-based and regulatory approach reflects the US’s varied, considered and serious treatment of children’s privacy violations.

B Australia

Australia’s Privacy Act is paltry compared to the COPPA Rule, as best demonstrated by the absence of any cases involving children’s online privacy, let alone any smart toy cases or legislation. Australia’s approach can best be described as regulatory inertia in favour of businesses, as reflected by the inapplicability of the already-skeletal Privacy Act to small businesses and its minuscule fines.

A 2015 speech on big data and privacy by an OAIC employee, published on the OAIC website and endorsing organisations’ use of big data in Australia, specifically referenced smart toys:

[W]here organisations’ data collection practices impact more vulnerable people, like children or bystanders, attitudes can be different. On hearing about Hello Barbie — the new Barbie that can hear your children and reply to them (as well as collecting the data in the process) — the Campaign for a Commercial-Free Childhood started a petition to stop production and parents are signing it. Will it stop production? Maybe not. Will it lead to better privacy protections? We can hope.[115]

This is an abysmal articulation of regulatory oversight, highlighting the complacent, business-oriented approach the Australian government takes to children’s privacy. Acknowledging children’s vulnerability yet merely expressing hopefulness for the success of a private organisation’s petition is a wholly inadequate response from a regulator. Despite the acknowledgement of ‘Hello Barbie’s’ potential harm, this complacency, and its contrast with the FTC’s assertive threats-based and regulatory approach, is troubling for Australian children.[116] Regulatory inertia can be difficult to overcome without an immense failure or shock that draws attention to regulation,[117] such as a child being lured into danger through location tracking. Despite children’s safety being at risk, the Privacy Act’s small business loophole indicates the major role economics plays in government regulation.[118] The ALRC’s 2008 report recommended that the OAIC develop guidelines to determine the capacity of children to give consent[119] and that the Privacy Act distinguish between children and adults,[120] and it even analysed COPPA and suggested adopting a similar model[121] – these well-founded recommendations were evidently not adopted. The Australian Labor Party suggested ‘special protection for children’ based on COPPA, but this also failed.[122]

Additionally, the Office of the eSafety Commissioner (OEC) is responsible for promoting the online safety of Australians and is primarily concerned with educating Australians on the risks of being online.[123] OEC Commissioner Inman Grant said in response to a poll on parents purchasing web-enabled presents: ‘Ultimately, parents are the frontline of defence against any risk their children can be exposed to online, so remaining engaged in their online lives just as they are offline is integral.’[124] This statement places responsibility solely on parents as consumers to make determinations on smart toys, and would barely satisfy Wu’s definition of an agency threat.[125] The inadequacy of the Australian response to smart toys compared to the US despite real, recognised dangers reinforces arguments in favour of stricter smart toy regulations.

V ARGUMENTS IN FAVOUR OF STRICTER SMART TOY REGULATIONS

This essay has shown that smart toys can genuinely endanger children’s safety by encroaching on their privacy through the collection of personal information. The successful prosecution of VTech shows that the US government, through the FTC, COPPA and the COPPA Rule, has both the awareness and the willingness to prosecute companies that infringe children’s online privacy and egregiously mishandle their personal information. Although it is possible for Australian regulators to prosecute companies in breach of the Privacy Act, there is a clear unwillingness to do so; the focus is instead on encouraging business growth.

However, improvements and changes are possible. Three persuasive arguments in favour of stricter smart toy regulation, each providing stronger protections for children’s privacy rights, will be suggested. The first is for lagging nations such as Australia to take note of the US’s approach to the protection of children’s privacy, which can be characterised as embodying the precautionary principle. The second urges reconsideration of the assumption that parents are the best decision-makers for their children’s privacy, while the third contends that technology-specific laws in the US and Australia could better protect children. These diverse arguments are intended to provide several compelling justifications for, and to agitate for, greater governmental protection of children’s online privacy and safety.

A The Precautionary Principle

COPPA was created in 1998 to address privacy and safety risks created by children using the internet,[126] and is an example of the precautionary principle due to COPPA’s inception in response to threats of harm and uncertainty about the internet.[127] The precautionary principle recognises that inchoate, or emerging, technologies are unpredictable and that regulatory intervention is justified on the basis of risk to human safety.[128] The US response was well-founded due to children’s inherent immaturity,[129] and American children have subsequently benefitted from this early, clear regulation recognising a genuine and ongoing issue.[130] Australian children could still benefit from this approach given that the internet remains inchoate insofar as it is not completely developed or stable,[131] as evidenced by the recent spawning of IoT and smart toys.

Australian regulators are clearly reluctant to specifically regulate technology so as to encourage businesses,[132] yet the regulatory focus should be on the potential for harm rather than the presence of ‘technology’.[133] Additionally, shifting the emphasis to protecting children’s privacy rights, as COPPA did, would also be in accordance with Australia’s UNCRC obligations. Australia’s inertia seems to stem from worries about impacting technological innovation, where bans and regulations can prevent the development of positive technologies.[134] However, protection of children’s online privacy should be integral to any technology due to the proven safety risks. The North American smart toy industry’s global dominance and strong growth despite COPPA highlights that the US’s adoption of the precautionary principle has not hindered businesses or stifled innovation.[135] Instead, it is privacy issues that may hamper the smart toy market’s overall growth, as the dangers of smart toys can deter consumers.[136] Knowledge is a further dimension of regulation: attitudes to uncertainty depend on the amount of knowledge available, and the increased knowledge of smart toy risks should be motivating the Australian government to regulate.[137]

Failure to adopt the precautionary principle to protect children’s online privacy would be a major oversight, and the Australian government could jeopardise Australian businesses by failing to implement regulations that better protect consumers’ online privacy. The smart toy industry and other technological industries are rapidly growing, as are consumer awareness and mistrust of data collection and privacy policies – delaying intervention could be both expensive and detrimental in the long term for businesses.[138] Adoption of the precautionary principle for smart toy and children’s online privacy regulation is ideal, particularly as the US has demonstrated its effectiveness through COPPA and its thriving smart toy industry.

B Parents as Children’s Privacy Decision-Makers

COPPA centres on obtaining parental consent prior to handling children’s personal information,[139] yet parents may not be the best decision-makers for their children’s online privacy. Despite parental guidance and supervision being the main protections children have on the internet,[140] parents may lack the awareness of privacy risks necessary to properly protect their children. Studies have shown that users are likely to install free applications and place more value on user reviews than on privacy policies, and parents are likely to face difficulty understanding privacy policies.[141] Through social media, parents often post photos of private moments with their children without their children’s consent, and ‘even the most well-intentioned parent may be unknowingly compromising the autonomy of their child.’[142]

A general erosion of privacy is taking place in society, and parents can be complicit in, and unaware of, how smart devices and social media jeopardise their children’s privacy. Children growing up today are ‘the most watched over generation in memory’,[143] and the traditional role of parents as trustees of their children’s rights should be rethought due to the difficulty parents can have in identifying concerning data handling practices.[144] Although the guidance documents published by the FTC and FBI are useful, there is a need for private protection strategies, such as privacy quantification, that enable parents to understand and decide whether to share their children’s private data with smart toys.[145] Ideally, regulators should be more vigilant in preventing and regulating non-compliant smart toys before the toys become accessible to children on the market. The toy industry has regulations for toy safety, yet these fail to address privacy – recognising and addressing this gap would ensure that only authorised smart toys can be purchased, thereby shifting responsibility onto industry experts rather than solely onto inadequately informed parents.

C Technology-Specific Laws

Issues arising from parental control could be mitigated by technology-specific laws for smart toys. Neither the US nor Australia has laws for IoT devices or smart toys, and both could benefit from tech-specific laws narrowly targeting smart devices, rather than maintaining technology-neutral laws that broadly address general characteristics of privacy and the internet.[146] The collection of data by smart devices can be seen as surveillance by corporations, and tech neutrality in a surveillance context is deeply flawed due to uncertainty about how corporations are handling personal data.[147] Tech-specific laws would be the best way to balance the right to privacy with business interests. Australia’s open endorsement of tech-neutral laws ignores the benefits of tech-specificity,[148] such as enabling better protection of children’s personal safety through targeted risk management. Smart devices, as part of IoT, are not mere toys, so general toy manufacturing standards that recognise only the physical dangers of toys are unsuitable.[149] Tech-specific laws would allow for regulation of the unique aspects of smart toys, which would boost consumer confidence and mandate protection of children’s privacy.

VI CONCLUSION

This comparison of smart toy regulations has shown that protection of children’s privacy rights requires further regulatory attention and intervention to address the unique challenges arising from the internet and smart devices. COPPA is an excellent model Australia should learn from, yet stricter regulation through tech-specific laws could better protect children from genuine safety risks linked to smart toys. Although this essay has focused on smart toys, children’s online privacy concerns extend far beyond this IoT subset – it is disturbingly and increasingly commonplace for an array of devices or services to collect personal information, and there is significant uncertainty about what happens to this data and who has access to it.

The stark truth is that through smart devices such as smart toys, privacy is being undermined by the commodification of personal information by private corporations. Apprehension about how individuals, governments and other interested stakeholders may use personal information is also justified. Agitating for greater online privacy protections through regulations, legislation, treaties and other means is essential in a world that seems to be embracing the corrosion of privacy. Adults can meaningfully consent to losing this right, yet children’s privacy rights must be advocated for given their inherent vulnerability, inability to consent, and genuine personal safety risks proven to be possible from inadequate data protections and safeguards. Existing regulations are mostly inadequate in recognising digital privacy risks, and the public must exercise caution – cuddly toys are one of many ways corporations or anyone else might be listening, watching, and eroding privacy.

BIBLIOGRAPHY

A Articles/Books/Reports

Acute Market Reports, Global Connected Toys Market Size, Market Share, Application Analysis, Regional Outlook, Growth Trends, Key Players, Competitive Strategies and Forecasts, 2018 to 2026 (Report, January 2019)

Australian Law Reform Commission, For Your Information: Australian Privacy Law and Practice (Report, No 108, May 2008)

Australian Law Reform Commission, Serious Invasions of Privacy in the Digital Era (Discussion Paper, No 80, 2014) <http://www.alrc.gov.au/sites/default/files/pdfs/publications/whole_dp80.pdf>

Azaria, Danae, ‘Responses to Breaches under the Law of Treaties’ (2015) Oxford Monographs in International Law 139

Chung, Grace and Sara M. Grimes, ‘Data Mining the Kids: Surveillance and Market Research Strategies in Children’s Online Games’ (2006) 30(4) Canadian Journal of Communication 527

Cockfield, Arthur J, ‘Towards a Law and Technology Theory’ (2004) 30(3) Manitoba Law Journal 383

Cortez, Nathan, ‘Regulating Disruptive Innovation’ (2014) 29 Berkeley Technology Law Journal 175

Costin, Luke, ‘eSafety Commissioner warns parents about ‘smart’ Christmas gifts’, The New Daily (online) 6 December 2018 <https://thenewdaily.com.au/life/tech/2018/12/06/esafety-warning-smart-christmas-gifts/>

Eade, Lauren, ‘Legal Incapacity, Autonomy, and Children’s Rights’ [2001] NewcLawRw 16; (2001) 5 Newcastle Law Review 157

Family Online Safety Institute, ‘Connected Families: How Parents Think & Feel about Wearables, Toys, and the Internet of Things’ (Report, 2017) Hart Research <https://www.fosi.org/documents/231/HartReport_d7_full_report_WEB.pdf>

Federal Trade Commission, ‘Implementing the Children’s Online Privacy Protection Act: A Report to Congress’ (Report, 2007) <http://www.ftc.gov/sites/default/files/documents/reports/implementing-childrens-online-privacyprotection-act-federal-trade-commission-report-congress/07coppa_report_to_congress.pdf>

Gervais, Daniel J., ‘The Regulation of Inchoate Technologies’ (2010) 47(3) Houston Law Review 665

Golob, Brandon, ‘How Safe are Safe Harbors? The Difficulties of Self-Regulatory Children’s Online Privacy Protection Act Programs’ (2015) 9 International Journal of Communication 3469

Gonçalves de Carvalho, Luciano and Marcelo Medeiros Eler, ‘Security Requirements for Smart Toys’ (2017) 2 Proceedings on the 19th International Conference on Enterprise Information Systems (ICEIS) 144 <https://pdfs.semanticscholar.org/9cae/d7dc05c58ae1886aaad9e0f31f8ba3d36b67.pdf>

Gordon, Neil, ‘Flexible Pedagogies: Technology-Enhanced Learning’ (2014) Flexible Pedagogies: Preparing for the Future. The Higher Education Academy, January 1

Gowda, Dev, Kara Cook-Schultz, and Ed Mierzwinski, Trouble in Toyland: The 32nd Annual Survey of Toy Safety (Report No 32, November 2017) <https://uspirgedfund.org/sites/pirg/files/reports/USP%20Toyland%20Report%20Nov17%20Web.pdf>

Greenberg, Brad A., ‘Rethinking Technology Neutrality’ (2016) 100 Minnesota Law Review 1495

Hart Research Associates, Connected Families: How Parents Think & Feel about Wearables, Toys, and the Internet of Things (Report, 2017) <https://www.fosi.org/documents/231/HartReport_d7_full_report_WEB.pdf>

Holloway, Donell and Lelia Green, ‘The Internet of Toys’ (2016) 2(4) Communication Research and Practice 506

Kemp, Richard, ‘Legal Aspects of the Internet of Things’ (2017) Kemp IT Law <http://www.kempitlaw.com/wp-content/uploads/2017/06/Legal-Aspects-of-the-Internet-of-Things-KITL-20170610.pdf>

Koops, Bert-Jaap, ‘Ten Dimensions of Technology Regulation – Finding Your Bearings in the Research Space of an Emerging Discipline’ (2010) 15 Tilburg University Legal Studies Working Paper Series 311

Matecki, Lauren A., ‘Update: COPPA is Ineffective Legislation! Next Steps for Protecting Youth Privacy Rights in the Social Networking Era’ (2010) 5(2) Northwestern Journal of Law and Social Policy 369 <https://scholarlycommons.law.northwestern.edu/njlsp/vol5/iss2/7>

Office of the Privacy Commissioner of Canada, Connected toy manufacturer improves safeguards to adequately protect children’s information (Report, 8 January 2018) <https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2018/pipeda-2018-001/>

Ohm, Paul, ‘The Argument Against Technology-Neutral Surveillance Laws’ (2010) 88 Texas Law Review 1685

Shasha, Sharon, Moustafa Mahmoud, Mohammad Mannan, and Amr Youssef, ‘Playing With Danger: A Taxonomy and Evaluation of Threats to Smart Toys’ (2018) Internet of Things Journal 1 <https://arxiv.org/pdf/1809.05556.pdf>

Shmueli, Benjamin & Ayelet Blecher-Prigat, ‘Privacy for Children’ (2011) 42 Columbia Human Rights Law Review 759

Solove, Daniel J., The Digital Person: Technology and Privacy in the Information Age (NYU Press, 2004)

Sorensen, Shannon, ‘Protecting Children’s Right to Privacy in the Digital Age: Parents as Trustees of Children’s Rights’ (2016) 36(3) Children’s Legal Rights Journal 156

Taylor, Emmeline and Katina Michael, ‘Smart Toys that are the Stuff of Nightmares’ (March 2016) IEEE Technology and Society Magazine 7

Wu, Tim, ‘Agency Threats’ (2010) 60 Duke Law Journal 1841

B Cases

Katz v. United States [1967] USSC 262; 389 U.S. 347 (1967)

Pollock v Pollock, [1998] USCA6 306; 154 F 3d 601 (6th Cir, 1998)

United States of America v Musical.ly and Musical.ly Inc, (CD Cal, No 2-19-cv-1439, 27 February 2019) <https://www.ftc.gov/system/files/documents/cases/musical.ly_proposed_order_ecf_2-27-19.pdf>

United States of America v VTech Electronics Limited and VTech Electronics North America, LLC, (ND Ill, Civ No 1-18-cv-144, 8 January 2018) <https://www.ftc.gov/system/files/documents/cases/vtech_file_stamped_stip_order_1-8-18.pdf>

C Legislation

Children’s Online Privacy Protection Act 1998, 15 USC

Children’s Online Privacy Protection Rule 16 CFR

EU General Data Protection Regulation (GDPR) Regulation (EU) No 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), OJ 2016 L 119/1

Federal Trade Commission Act, 15 U.S.C.

Personal Information Protection and Electronic Documents Act 2000 S.C. 2000

Privacy Act 1988 (Cth)

Privacy Amendment (Private Sector) Bill 2000 (Cth).

Telekommunicationsgesetz [TKG] [Telecommunications Act] (Germany) 22 June 2004, BGBl I, 2004, 2473, s 90

United States Constitution

D Treaties

Convention on the Rights of the Child, opened for signature 20 December 1989, 1577 UNTS 3 (entered into force 2 September 1990)

International Covenant on Civil and Political Rights, opened for signature 16 December 1966, 999 UNTS 171 (entered into force 23 March 1976)

Vienna Convention on the Law of Treaties opened for signature 23 May 1969, 1155 UNTS 331 (entered into force 27 January 1980)

E Other

Amazon, My Friend Cayla (2018) <https://www.amazon.ca/Genesis-31837-My-Friend-Cayla/dp/B010T4JV5G>

BBC, ‘Smart Speaker Records Reviewed by Humans’, BBC (online) 11 April 2019 <https://www.bbc.com/news/technology-47893082>

BBC, ‘Toy Firm VTech fined $650,000 over data breach’, BBC (online) 9 January 2018 <https://www.bbc.com/news/technology-42620717>

Bundesnetzagentur, Press Release: Bundesnetzagentur removes children’s doll “Cayla” from the market (17 February 2017) <https://www.bundesnetzagentur.de/SharedDocs/Pressemitteilungen/EN/2017/17022017_cayla.html?nn=404422>

Chopra, Rohit and Rebecca Kelly Slaughter, ‘Joint Statement Of Commissioner Rohit Chopra And Commissioner Rebecca Kelly Slaughter In the Matter of Musical.ly Inc. (now known as TikTok) Commission File Number 1723004’, Public Statements (Web Page, 27 February 2019) <https://www.ftc.gov/system/files/documents/public_statements/1463167/chopra_and_slaughter_musically_tiktok_joint_statement_2-27-19_0.pdf>

Commonwealth of Australia, Parliamentary Debates, Senate, 30 November 2006, 20302

Consumer Federation of America, Press Release: Consumer and Privacy Groups Demand Action on Toys That Spy on Children: One Year After Complaint to the Federal Trade Commission, Dangerous Toys Are Still on the Market (Press Release, 18 December 2017) <https://consumerfed.org/press_release/consumer-privacy-groups-demand-action-toys-spy-children/>

Day, Matt, Giles Turner and Natalia Drozdiak, ‘Amazon Workers Are Listening to What You Tell Alexa’, Bloomberg (online) 11 April 2019 <https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio>

Dickson, Elisabeth, ‘Kids’ toys are the latest battleground in the online privacy wars’, Vox (online) 13 December 2018 <https://www.vox.com/the-goods/2018/11/21/18106917/kids-holiday-gifts-connected-toys>

Edison Research, ‘Moms and Media 2014’ (Web Page, 2014) <http://www.edisonresearch.com/wp-content/uploads/2014/05/Moms-and-Media-2014-FINAL-REPORT.pdf>

eSafety Commissioner, Role of the Office (Web Page, 2019) <https://www.esafety.gov.au/about-the-office/role-of-the-office>

Federal Bureau of Investigation, About (Web Page) <https://www.fbi.gov/about>

Federal Bureau of Investigation, ‘Consumer Notice: Internet-Connected Toys Could Present Privacy and Contact Concerns for Children’ Public Service Announcement (Public Service Announcement, 27 July 2017) <https://www.ic3.gov/media/2017/170717.aspx>

Federal Trade Commission, ‘2017.06.21 Response to Senator Warner Letter’ (Web Page, 22 June 2017) <https://www.scribd.com/document/352278126/2017-06-21-Response-to-Senator-Warner-Letter>

Federal Trade Commission, COPPA Safe Harbor Program (Web Page) <https://www.ftc.gov/safe-harbor-program>

Federal Trade Commission, ‘FTC Publishes Inflation-Adjusted Civil Penalty Amounts’ Press Releases (Press Release, 1 March 2019) <https://www.ftc.gov/news-events/press-releases/2019/03/ftc-publishes-inflation-adjusted-civil-penalty-amounts>

Federal Trade Commission, ‘FTC Sends Educational Letters to Businesses to Help Them Prepare for COPPA Update’ Press Releases (Press Release, 15 May 2013) <https://www.ftc.gov/news-events/press-releases/2013/05/ftc-sends-educational-letters-businesses-help-them-prepare-coppa>

Federal Trade Commission, Legal Resources (Website) <https://www.ftc.gov/tips-advice/business-center/legal-resources?type=case&field_consumer_protection_topics_tid=246>

Federal Trade Commission, ‘The Children’s Online Privacy Protection Rule: A Six-Step Compliance Plan for Your Business’ Guidance (Web Page, June 2017) <https://www.ftc.gov/tips-advice/business-center/guidance/childrens-online-privacy-protection-rule-six-step-compliance>

Federal Trade Commission, ‘Video Social Networking App Musical.ly Agrees to settle FTC Allegations That it Violated Children’s Privacy Law’, Press Releases (Press Release, 27 February 2019) <https://www.ftc.gov/news-events/press-releases/2019/02/video-social-networking-app-musically-agrees-settle-ftc>

Franceschi-Bicchierai, Lorenzo, ‘Hacker Obtained Children’s Headshots and Chatlogs from Toymaker VTech’, Motherboard/VICE (online) 30 November 2015 <https://motherboard.vice.com/en_us/article/yp3zev/hacker-obtained-childrens-headshots-and-chatlogs-from-toymaker-vtech>

Genesis, Privacy Policy: My Friend Cayla (23 February 2015) <https://www.myfriendcayla.com/privacy-policy>

Gibbs, Samuel, ‘Hackers can hijack Wi-Fi Hello Barbie to spy on your children’, The Guardian (online) 26 November 2015 <https://www.theguardian.com/technology/2015/nov/26/hackers-can-hijack-wi-fi-hello-barbie-to-spy-on-your-children>

Hung, Patrick C. K., Marcelo Fantinato and Laura Rafferty, ‘A Study of Privacy Requirements for Smart Toys’ (Conference Paper, June 2016)

Hunt, Troy, ‘When children are breached – inside VTech hack’, Troy Hunt (Blog Page, 28 November 2015) <https://www.troyhunt.com/when-children-are-breached-inside/>

Internet Crime Complaint Center (IC3), ‘Frequently Asked Questions’ (Web Page) <https://www.ic3.gov/faq/default.aspx>

Kanso, Heba, ‘Smart Toys help kids prepare for high-tech future’, CBS News (online), 18 February 2016 <https://www.cbsnews.com/news/smart-toys-kids-high-tech-future/>

Kelion, Leo, ‘MiSafes’ child-tracking smartwatches are ‘easy to hack’’, BBC (online) 15 November 2018 <https://www.bbc.com/news/technology-46195189>

Laughlin, Andrew, ‘Watch as the voice of this child’s toy cat is taken over by hackers’, Which (online), 23 June 2017 <https://www.which.co.uk/news/2017/06/watch-as-the-voice-of-this-childs-toy-cat-is-taken-over-by-hackers/>

Leskin, Paige, ‘The 21 biggest data breaches of 2018’, Business Insider (online) 11 December 2018 <https://www.businessinsider.com/data-hacks-breaches-biggest-of-2018-2018-12>

Lieber, Chavie, ‘Amazon Hopes Parents Will Trust Its New Gadgets – And we Probably Will’, Racked (online) 3 May 2018 <https://www.racked.com/2018/5/3/17314982/amazon-echo-dot-kids-privacy>

Magid, Larry, ‘Magid: Protecting Children Online Needs to Allow for Their Right to Free Speech’, The Mercury News (online) 29 August 2014 <https://www.mercurynews.com/2014/08/29/magid-protecting-children-online-needs-to-allow-for-their-right-to-free-speech/>

Maras, Marie-Helen, ‘4 ways ‘internet of things’ toys endanger children’, The Conversation (online) 10 May 2018 <https://theconversation.com/4-ways-internet-of-things-toys-endanger-children-94092>

Matsakis, Louise and Issie Lapowsky, ‘Everything We Know About Facebook’s Massive Security Breach’, Wired (online), 28 September, 2018 <https://www.wired.com/story/facebook-security-breach-50-million-accounts/>

Mozilla, ‘Toys & Games’, Privacy Not Included (Web Page) <https://foundation.mozilla.org/en/privacynotincluded/categories/toys-games/>

Office of the Australian Information Commissioner, Determinations (Web Page, 2018) <https://www.oaic.gov.au/privacy-law/determinations/>

Office of Australian Privacy Commissioner, ‘Privacy Business resource 18: Privacy and Start-Up Businesses’ Business Resources (Web Page, December 2017) <https://www.oaic.gov.au/agencies-and-organisations/business-resources/privacy-business-resource-18>

Office of the Privacy Commissioner, ‘Guidelines for Obtaining Meaningful Consent’, Collecting Personal Information (Web Page, May 2018) <https://www.priv.gc.ca/en/privacy-topics/collecting-personalinformation/consent/gl_omc_201805/#_consent>

Office of the Privacy Commissioner of Canada, ‘News Release: VTech breach investigation highlights security failures’, 8 January 2018 <https://www.priv.gc.ca/en/opc-news/news-and-announcements/2018/nr-c_180108/>

Pilgrim, Timothy, ‘Big Data and Privacy: A Regulator’s Perspective’ (Speech, International Conference on Big Data from a Privacy Perspective, Hong Kong, 10 June 2015) <https://www.oaic.gov.au/media-and-speeches/speeches/big-data-and-privacy-a-regulators-perspective>

Puckett, Jennifer M., ‘Insider insights on COPPA’, Emoderation (online) 13 May 2013 <https://web.archive.org/web/20161118232637/http://www.emoderation.com/insider-insights-coppa/>

SBS, ‘Tech Giants Face Fines upwards of $100 Million under Changes to Australia’s Privacy Act’, SBS (online) 24 March 2019 <https://www.sbs.com.au/news/tech-giants-face-fines-upwards-of-100-million-under-changes-to-australia-s-privacy-act>

Stanislav, Mark and Tod Beardsley, ‘Hacking IoT: A Case Study on Baby Monitor Exposures and Vulnerabilities’, Rapid7 (online) 29 September 2015 <https://www.rapid7.com/docs/Hacking-IoT-A-Case-Study-on-Baby-Monitor-Exposures-and-Vulnerabilities.pdf>

Therrien, Daniel, ‘Letter to the Standing Committee on Access to Information, Privacy and Ethics on the 2018-2019 Main Estimates’, Office of the Privacy Commissioner of Canada, 29 May 2018 <https://www.priv.gc.ca/en/opc-actions-and-decisions/advice-to-parliament/2018/parl_sub_180529/>

Valdez, Andrea, ‘Everything You Need to Know About Facebook and Cambridge Analytica’, Wired (online) 23 March 2018 <https://www.wired.com/story/wired-facebook-cambridge-analytica-coverage/>

Vlahos, James, ‘Smart Talking: Are Our Devices Threatening Our Privacy?’, The Guardian (online) 26 March 2019 <https://www.theguardian.com/technology/2019/mar/26/smart-talking-are-our-devices-threatening-our-privacy>

VTech, ‘FAQ about Cyber Attack on VTech Learning Lodge’ (Web Page, 9 January 2018) <https://www.vtech.com/en/press_release/2018/faq-about-cyber-attack-on-vtech-learning-lodge/>

VTech, ‘Press Releases: Data Breach on VTech Learning Lodge (update)’, Press Releases (Web Page, 30 November 2015) <https://www.vtech.com/en/press_release/2015/data-breach-on-vtech-learning-lodge-update/>

Walsh, Michael, ‘My Friend Cayla doll banned in Germany over surveillance concerns’, ABC (online) 17 February 2017 <https://www.abc.net.au/news/2017-02-18/my-friend-cayla-doll-banned-germany-over-surveillance-concerns/8282508>


[1] Bert-Jaap Koops, ‘Ten Dimensions of Technology Regulation – Finding Your Bearings in the Research Space of an Emerging Discipline’ (2010) 15 Tilburg University Legal Studies Working Paper Series 311, 315.

[2] Richard Kemp, ‘Legal Aspects of the Internet of Things’ (2017) Kemp IT Law <http://www.kempitlaw.com/wp-content/uploads/2017/06/Legal-Aspects-of-the-Internet-of-Things-KITL-20170610.pdf> 1.

[3] Sharon Shasha et al, ‘Playing with Danger: A Taxonomy and Evaluation of Threats to Smart Toys’ (2018) Internet of Things Journal 1, 1 <https://arxiv.org/pdf/1809.05556.pdf>.

[4] Luciano Gonçalves de Carvalho and Marcelo Medeiros Eler, ‘Security Requirements for Smart Toys’ (2017) 2 Proceedings of the 19th International Conference on Enterprise Information Systems (ICEIS) 144, 144 <https://pdfs.semanticscholar.org/9cae/d7dc05c58ae1886aaad9e0f31f8ba3d36b67.pdf>.

[5] Heba Kanso, ‘Smart Toys Help Kids Prepare for High-Tech Future’, CBS News (online, 18 February 2016) <https://www.cbsnews.com/news/smart-toys-kids-high-tech-future/>.

[6] Neil Gordon, ‘Flexible Pedagogies: Technology-Enhanced Learning’ (Report, The Higher Education Academy, January 2014) 1, 3.

[7] Family Online Safety Institute, ‘Connected Families: How Parents Think & Feel about Wearables, Toys, and the Internet of Things’ (Report, Hart Research, 2017) 1, 11 <https://www.fosi.org/documents/231/HartReport_d7_full_report_WEB.pdf> (‘FOSI Report’).

[8] See, eg, Shannon Sorensen, ‘Protecting Children’s Right to Privacy in the Digital Age: Parents as Trustees of Children’s Rights’ (2016) 36(3) Children’s Legal Rights Journal 156.

[9] Mark Stanislav and Tod Beardsley, ‘Hacking IoT: A Case Study on Baby Monitor Exposures and Vulnerabilities’, Rapid7 (online, 29 September 2015) <https://www.rapid7.com/docs/Hacking-IoT-A-Case-Study-on-Baby-Monitor-Exposures-and-Vulnerabilities.pdf>.

[10] Andrew Laughlin, ‘Watch as the Voice of This Child’s Toy Cat Is Taken Over by Hackers’, Which (online, 23 June 2017) <https://www.which.co.uk/news/2017/06/watch-as-the-voice-of-this-childs-toy-cat-is-taken-over-by-hackers/>.

[11] Leo Kelion, ‘MiSafes’ Child-Tracking Smartwatches Are “Easy to Hack”’, BBC (online, 15 November 2018) <https://www.bbc.com/news/technology-46195189>.

[12] Samuel Gibbs, ‘Hackers Can Hijack Wi-Fi Hello Barbie to Spy on Your Children’, The Guardian (online, 26 November 2015) <https://www.theguardian.com/technology/2015/nov/26/hackers-can-hijack-wi-fi-hello-barbie-to-spy-on-your-children>.

[13] Lauren Eade, ‘Legal Incapacity, Autonomy, and Children’s Rights’ [2001] NewcLawRw 16; (2001) 5 Newcastle Law Review 157, 157.

[14] Grace Chung and Sara M. Grimes, ‘Data Mining the Kids: Surveillance and Market Research Strategies in Children’s Online Games’ (2006) 30(4) Canadian Journal of Communication 527, 531.

[15] BBC, ‘Smart Speaker Records Reviewed by Humans’, BBC (online, 11 April 2019) <https://www.bbc.com/news/technology-47893082>; Matt Day, Giles Turner and Natalia Drozdiak, ‘Amazon Workers Are Listening to What You Tell Alexa’, Bloomberg (online, 11 April 2019) <https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio>.

[16] Shasha et al (n 3) 1.

[17] Marie-Helen Maras, ‘4 Ways ‘Internet of Things’ Toys Endanger Children’, The Conversation (online, 10 May 2018) <https://theconversation.com/4-ways-internet-of-things-toys-endanger-children-94092>.

[18] Mozilla, ‘Toys & Games’, Privacy Not Included (Web Page) <https://foundation.mozilla.org/en/privacynotincluded/categories/toys-games/>.

[19] See, eg, James Vlahos, ‘Smart Talking: Are Our Devices Threatening Our Privacy?’, The Guardian (online, 26 March 2019) <https://www.theguardian.com/technology/2019/mar/26/smart-talking-are-our-devices-threatening-our-privacy>; Elisabeth Dickson, ‘Kids’ Toys Are the Latest Battleground in the Online Privacy Wars’, Vox (online, 13 December 2018) <https://www.vox.com/the-goods/2018/11/21/18106917/kids-holiday-gifts-connected-toys>.

[20] Acute Market Reports, ‘Global Connected Toys Market Size, Market Share, Application Analysis, Regional Outlook, Growth Trends, Key Players, Competitive Strategies and Forecasts, 2018 to 2026’ (Report, January 2019) (‘Global Connected Toys Report’).

[21] FOSI Report (n 7) 4.

[22] Global Connected Toys Report (n 20).

[23] Ibid.

[24] Ibid.

[25] Chavie Lieber, ‘Amazon Hopes Parents Will Trust Its New Gadgets – And We Probably Will’, Racked (online, 3 May 2018) <https://www.racked.com/2018/5/3/17314982/amazon-echo-dot-kids-privacy>.

[26] Global Connected Toys Report (n 20).

[27] Donell Holloway and Lelia Green, ‘The Internet of Toys’ (2016) 2(4) Communication Research and Practice 506, 509–10.

[28] Ibid.

[29] International Covenant on Civil and Political Rights, opened for signature 16 December 1966, 999 UNTS 171 (entered into force 23 March 1976) art 17 (‘ICCPR’).

[30] Australian Law Reform Commission, Serious Invasions of Privacy in the Digital Era (Discussion Paper, No 80, 2014) 29 <http://www.alrc.gov.au/sites/default/files/pdfs/publications/whole_dp80.pdf>.

[31] Daniel J Solove, The Digital Person: Technology and Privacy in the Information Age (NYU Press, 2004) 52.

[32] Louise Matsakis and Issie Lapowsky, ‘Everything We Know About Facebook’s Massive Security Breach’, Wired (online, 28 September 2018) <https://www.wired.com/story/facebook-security-breach-50-million-accounts/>.

[33] Andrea Valdez, ‘Everything You Need to Know About Facebook and Cambridge Analytica’, Wired (online, 23 March 2018) <https://www.wired.com/story/wired-facebook-cambridge-analytica-coverage/>.

[34] United Nations Convention on the Rights of the Child, opened for signature 20 December 1989, 1577 UNTS 3 (entered into force 2 September 1990) (‘UNCRC’).

[35] Vienna Convention on the Law of Treaties opened for signature 23 May 1969, 1155 UNTS 331 (entered into force 27 January 1980) arts 2(1)(b), 14(1), 16.

[36] Ibid arts 10, 18.

[37] United States Constitution amend IV; Katz v. United States [1967] USSC 262; 389 U.S. 347, 361 (1967).

[38] Children’s Online Privacy Protection Act 1998 15 USC § 6501–6505 (‘COPPA’).

[39] Ibid § 6502(a)(1).

[40] Ibid § 6502(b)(1).

[41] Children’s Online Privacy Protection Rule 16 CFR § 312.2 (‘COPPA Rule’).

[42] COPPA Rule § 312.

[43] Ibid § 312.2.

[44] Ibid § 312.3.

[45] Ibid §§ 312.3(a)–(b), 312.5.

[46] Ibid §§ 312.4, 312.4(a).

[47] Ibid §§ 312.3(e), 312.8.

[48] Ibid § 312.6.

[49] Ibid § 312.10.

[50] Ibid § 312.2.

[51] Federal Trade Commission, Legal Resources (Website) <https://www.ftc.gov/tips-advice/business-center/legal-resources?type=case&field_consumer_protection_topics_tid=246> (Filtering the FTC Search Engine to Type: Case and Topic: Children’s Privacy brings up 31 cases).

[52] Federal Trade Commission, ‘FTC Publishes Inflation-Adjusted Civil Penalty Amounts’, Press Releases (Press Release, 1 March 2019) <https://www.ftc.gov/news-events/press-releases/2019/03/ftc-publishes-inflation-adjusted-civil-penalty-amounts>.

[53] Lauren A Matecki, ‘Update: COPPA is Ineffective Legislation! Next Steps for Protecting Youth Privacy Rights in the Social Networking Era’ (2010) 5(2) North Western Journal of Law and Social Policy 369, 370 <https://scholarlycommons.law.northwestern.edu/njlsp/vol5/iss2/7>.

[54] Larry Magid, ‘Magid: Protecting Children Online Needs to Allow for Their Right to Free Speech’, The Mercury News (online, 29 August 2014) <https://www.mercurynews.com/2014/08/29/magid-protecting-children-online-needs-to-allow-for-their-right-to-free-speech/>.

[55] United States Constitution amend IV.

[56] Katz v. United States [1967] USSC 262; 389 U.S. 347, 360–1 (1967).

[57] Ibid.

[58] Privacy Act 1988 (Cth) (‘Privacy Act’).

[59] Ibid s 2A(a).

[60] Ibid s 15, sch 1.

[61] Ibid sch 1.

[62] Ibid s 13(1)(a).

[63] Ibid s 13G (Civil penalty of 2,000 penalty units; the current Commonwealth penalty unit is $210).

[64] Ibid s 80W(5); The Morrison Government is discussing the introduction of tougher penalties to protect Australians’ online privacy, but the legislation has not yet been drafted. See SBS, ‘Tech Giants Face Fines upwards of $100 Million under Changes to Australia’s Privacy Act’, SBS (online, 24 March 2019) <https://www.sbs.com.au/news/tech-giants-face-fines-upwards-of-100-million-under-changes-to-australia-s-privacy-act>.

[65] Ibid pt V, ss 36, 40.

[66] Ibid s 52.

[67] See Office of the Australian Information Commissioner, Determinations (Web Page, 2018) <https://www.oaic.gov.au/privacy-law/determinations/>. A table with summary details of privacy determinations made since 1 November 2010 is provided.

[68] Privacy Act s 6.

[69] Ibid s 6D(1).

[70] Office of the Australian Information Commissioner, ‘Privacy Business Resource 18: Privacy and Start-Up Businesses’, Business Resources (Web Page, December 2017) <https://www.oaic.gov.au/agencies-and-organisations/business-resources/privacy-business-resource-18>.

[71] Privacy Act s 2A(b).

[72] Ibid s 6(1).

[73] Danae Azaria, ‘Responses to Breaches under the Law of Treaties’ (2015) Oxford Monographs in International Law 139, 144.

[74] United States of America v VTech Electronics Limited and VTech Electronics North America, LLC (ND Ill, Civ No 1-18-cv-144, 8 January 2018) <https://www.ftc.gov/system/files/documents/cases/vtech_file_stamped_stip_order_1-8-18.pdf> (‘VTech’).

[75] VTech, ‘Press Releases: Data Breach on VTech Learning Lodge (update)’, Press Releases (Web Page, 30 November 2015) <https://www.vtech.com/en/press_release/2015/data-breach-on-vtech-learning-lodge-update/>.

[76] Ibid.

[77] VTech, ‘FAQ about Cyber Attack on VTech Learning Lodge’ (Web Page, 9 January 2018) <https://www.vtech.com/en/press_release/2018/faq-about-cyber-attack-on-vtech-learning-lodge/>.

[78] Troy Hunt, ‘When Children Are Breached – Inside Vtech Hack’, Troy Hunt (Blog Page, 28 November 2015) <https://www.troyhunt.com/when-children-are-breached-inside/>.

[79] Paige Leskin, ‘The 21 Biggest Data Breaches of 2018’, Business Insider (online, 11 December 2018) <https://www.businessinsider.com/data-hacks-breaches-biggest-of-2018-2018-12>.

[80] Lorenzo Franceschi-Bicchierai, ‘Hacker Obtained Children’s Headshots and Chatlogs from Toymaker VTech’, Motherboard/VICE (online, 30 November 2015) <https://motherboard.vice.com/en_us/article/yp3zev/hacker-obtained-childrens-headshots-and-chatlogs-from-toymaker-vtech>.

[81] Ibid.

[82] BBC, ‘Toy Firm VTech Fined $650,000 over Data Breach’, BBC (online, 9 January 2018) <https://www.bbc.com/news/technology-42620717>.

[83] VTech (n 74) 46.

[84] Ibid.

[85] Ibid.

[86] Ibid.

[87] Ibid 47–8.

[88] Ibid 48.

[89] Ibid.

[90] Ibid 49–51.

[91] United States of America v Musical.ly and Musical.ly Inc (CD Cal, No 2-19-cv-1439, 27 February 2019) <https://www.ftc.gov/system/files/documents/cases/musical.ly_proposed_order_ecf_2-27-19.pdf>.

[92] Federal Trade Commission, ‘Video Social Networking App Musical.ly Agrees to Settle FTC Allegations That It Violated Children’s Privacy Law’, Press Releases (Press Release, 27 February 2019) <https://www.ftc.gov/news-events/press-releases/2019/02/video-social-networking-app-musically-agrees-settle-ftc>.

[93] Rohit Chopra and Rebecca Kelly Slaughter, ‘Joint Statement of Commissioner Rohit Chopra and Commissioner Rebecca Kelly Slaughter in the Matter of Musical.ly Inc. (now known as TikTok) Commission File Number 1723004’, Public Statements (Web Page, 27 February 2019) <https://www.ftc.gov/system/files/documents/public_statements/1463167/chopra_and_slaughter_musically_tiktok_joint_statement_2-27-19_0.pdf>.

[94] Daniel J Gervais, ‘The Regulation of Inchoate Technologies’ (2010) 47(3) Houston Law Review 665, 699.

[95] Tim Wu, ‘Agency Threats’ (2010) 60 Duke Law Journal 1841, 1842–51.

[96] Nathan Cortez, ‘Regulating Disruptive Innovation’ (2014) 29 Berkeley Technology Law Journal 175, 179.

[97] Wu (n 95) 1851.

[98] Ibid 1848.

[99] Federal Trade Commission, ‘FTC Sends Educational Letters to Businesses to Help Them Prepare for COPPA Update’, Press Releases (Press Release, 15 May 2013) <https://www.ftc.gov/news-events/press-releases/2013/05/ftc-sends-educational-letters-businesses-help-them-prepare-coppa>.

[100] Wu (n 95) 1844.

[101] Ibid.

[102] Federal Trade Commission, ‘The Children’s Online Privacy Protection Rule: A Six-Step Compliance Plan for Your Business’ Guidance (Web Page, June 2017) <https://www.ftc.gov/tips-advice/business-center/guidance/childrens-online-privacy-protection-rule-six-step-compliance>.

[103] Ibid (‘In addition to standard websites, examples of others covered by the Rule include: ... connected toys or other Internet of Things devices’).

[104] Federal Bureau of Investigation, ‘Consumer Notice: Internet-Connected Toys Could Present Privacy and Contact Concerns for Children’ (Public Service Announcement, 27 July 2017) <https://www.ic3.gov/media/2017/170717.aspx>.

[105] Federal Bureau of Investigation, About (Web Page) <https://www.fbi.gov/about>.

[106] Internet Crime Complaint Center (IC3), ‘Frequently Asked Questions’ (Web Page) <https://www.ic3.gov/faq/default.aspx>.

[107] COPPA Rule § 312.11; Federal Trade Commission, ‘Implementing the Children’s Online Privacy Protection Act: A Report to Congress’ (Report, 2007) 1 <http://www.ftc.gov/sites/default/files/documents/reports/implementing-childrens-online-privacyprotection-act-federal-trade-commission-report-congress/07coppa_report_to_congress.pdf> (‘FTC Congress Report’).

[108] See Federal Trade Commission, COPPA Safe Harbor Program (Web Page) <https://www.ftc.gov/safe-harbor-program>. The seven safe harbor programs are administered by Aristotle International Inc; the Children’s Advertising Review Unit (CARU); the Entertainment Software Rating Board (ESRB); iKeepSafe; kidSAFE; Privacy Vaults Online, Inc (d/b/a PRIVO); and TRUSTe.

[109] COPPA Rule § 312.11(a).

[110] FTC Congress Report (n 107).

[111] Brandon Golob, ‘How Safe Are Safe Harbors? The Difficulties of Self-Regulatory Children’s Online Privacy Protection Act Programs’ (2015) 9 International Journal of Communication 3469, 3470–1.

[112] Ibid 3470.

[113] Cortez (n 96) 211.

[114] Ibid 179.

[115] Timothy Pilgrim, ‘Big Data and Privacy: A Regulator’s Perspective’ (Speech, International Conference on Big Data from a Privacy Perspective, Hong Kong, 10 June 2015) <https://www.oaic.gov.au/media-and-speeches/speeches/big-data-and-privacy-a-regulators-perspective>.

[116] See Emmeline Taylor and Katina Michael, ‘Smart Toys that are the Stuff of Nightmares’ (March 2016) IEEE Technology and Society Magazine 7, 7–8.

[117] Cortez (n 96) 175.

[118] Ibid 203.

[119] Australian Law Reform Commission, For Your Information: Australian Privacy Law and Practice (Report, No 108, May 2008) 79.

[120] Ibid 118.

[121] Ibid [69], [69.22]–[69.23].

[122] Commonwealth of Australia, Parliamentary Debates, Senate, 30 November 2006, 20302 (N Bolkus); Privacy Amendment (Private Sector) Bill 2000 (Cth).

[123] eSafety Commissioner, Role of the Office (Web Page, 2019) <https://www.esafety.gov.au/about-the-office/role-of-the-office>.

[124] Luke Costin, ‘eSafety Commissioner Warns Parents about ‘Smart’ Christmas Gifts’, The New Daily (online, 6 December 2018) <https://thenewdaily.com.au/life/tech/2018/12/06/esafety-warning-smart-christmas-gifts/>.

[125] Wu (n 95).

[126] FTC Congress Report (n 107) 1.

[127] Gervais (n 94) 697.

[128] Ibid 699.

[129] Eade (n 13) 157.

[130] Cortez (n 96) 204.

[131] Gervais (n 94) 672.

[132] See Privacy Act.

[133] Lyria Bennett Moses, ‘Regulating in the Face of Socio-Technical Change’ in Roger Brownsword, Eloise Scotford and Karen Yeung (eds), The Oxford Handbook of the Law and Regulation of Technology (Oxford University Press, 2017) 577, 583.

[134] Gervais (n 94) 697.

[135] See Global Connected Toys Report (n 20).

[136] Ibid.

[137] Koops (n 1) 320.

[138] Ibid 317.

[139] COPPA Rule §§ 312.5–312.6.

[140] Patrick C K Hung, Marcelo Fantinato and Laura Rafferty, ‘A Study of Privacy Requirements for Smart Toys’ (Conference Paper, June 2016) 4.

[141] Ibid 5.

[142] Sorensen (n 8) 156.

[143] Benjamin Shmueli and Ayelet Blecher-Prigat, ‘Privacy for Children’ (2011) 42 Columbia Human Rights Law Review 759, 761–2.

[144] Hung et al (n 140) 4–5.

[145] Ibid.

[146] Brad A Greenberg, ‘Rethinking Technology Neutrality’ (2016) 100 Minnesota Law Review 1495, 1495.

[147] Paul Ohm, ‘The Argument Against Technology-Neutral Surveillance Laws’ (2010) 88 Texas Law Review 1685, 1686.

[148] See Pilgrim (n 115).

[149] Gervais (n 94) 681.


URL: http://www.austlii.edu.au/au/journals/UNSWLawJlStuS/2019/8.html