(Part 2 of a two-part series written in November 2019, originally published on Medium: https://medium.com/@emmadaylaw/childrens-connected-toys-part-2-14474ababa4e)
COPPA has extraterritorial reach beyond the US: it defines a ‘child’ as “an individual under 13”, and an “operator” as anyone who operates a web-based service within the US, or between the US and any State or foreign nation. This means that American toy companies operating supporting software for their toys in foreign markets must still comply with COPPA, and foreign companies operating in the US must comply as well. Further, the higher standards of data protection for children under EU law may be having an impact on the US and raising the bar across both the EU and US markets. However, children in the “Global South” and “Global East” are left outside these protective legal frameworks, and may be susceptible to commercial exploitation through data mining by Western companies.
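The jurisdictional logic described above can be sketched as a small decision function. This is an illustrative simplification, not legal advice: the `Service` fields and the `coppa_applies` helper are hypothetical names, and a real COPPA analysis involves further factors (such as whether a service is directed to children, or the operator's actual knowledge) that are omitted here.

```python
from dataclasses import dataclass

@dataclass
class Service:
    """Simplified model of a connected-toy web service (fields are illustrative)."""
    operated_in_us: bool       # service operated within the US
    us_foreign_traffic: bool   # operated between the US and any State or foreign nation
    collects_child_data: bool  # collects personal information from users under 13

def coppa_applies(svc: Service) -> bool:
    """Rough sketch of COPPA's 'operator' reach as characterised above:
    the statute covers services operated within the US, or between the US
    and any State or foreign nation, that collect data from under-13s."""
    return svc.collects_child_data and (svc.operated_in_us or svc.us_foreign_traffic)

# An American toymaker serving foreign markets from US-operated software stays in scope:
print(coppa_applies(Service(operated_in_us=True, us_foreign_traffic=True, collects_child_data=True)))    # True
# A service operated solely outside the US, with no US traffic, falls outside the definition:
print(coppa_applies(Service(operated_in_us=False, us_foreign_traffic=False, collects_child_data=True)))  # False
```

This mirrors the point made below about Hong Kong: moving the product development stage entirely offshore flips the second case and takes the service outside COPPA's reach.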
As well as EU legislation, regulatory and court decisions from European countries may have some influence in the US. Even before the GDPR came into force, Germany’s Federal Network Agency (Bundesnetzagentur) banned My Friend Cayla in 2017, on the grounds that the doll contained a “concealed surveillance device” in violation of federal privacy regulations; German parents were instructed to destroy the toy. The German decision was quoted both in the complaint filed with the FTC and by Senator Warner in his letter to the FTC requesting action on smart toy regulation, so foreign decisions do appear to carry some influence in the US. However, the influence of a foreign decision also depends on the geopolitical power of the country issuing it: a similar decision from a smaller country with lower GDP, situated outside North America or the European Union, would likely attract far less attention.
Companies operating between the US, Hong Kong, and China
It is often said that the US and China are leading the world in the race to produce the best AI technology. At a recent conference at Stanford University on ‘AI Ethics, Policy & Governance’, Michael Kratsios, Chief Technology Officer of the United States, remarked that although he believes the US is currently leading the world in AI, the Chinese are trying to catch up by the end of the next decade, and are particularly challenging the US in the area of machine vision because they have access to so much data under their ‘surveillance state’. At the same conference, Marietje Schaake, former Dutch Member of the European Parliament, commented that although many American companies warn US lawmakers about the threat of Chinese companies winning the “AI race”, many of the same American companies are sending data to China themselves for commercial reasons.
Schaake’s point seems to be reflected in the AI-powered toy market, which the US and China currently dominate, and in which roboticists from Hong Kong and the US appear to be working in close collaboration rather than in separate ecosystems underpinned by different value systems. One example is VTech, the first robotic toy company to be fined by the FTC, mentioned above. Another is Hanson Robotics, one of the leading robotics companies, which is based in Hong Kong. Hanson Robotics is famous for creating ‘Sophia the Robot’, who received international acclaim and was given citizenship of Saudi Arabia. One of the company’s marketing videos features interviews with Chinese and American roboticists clearly working in close collaboration on ‘Little Sophia’, the children’s commercial version of ‘Sophia the Robot’. Little Sophia is still in development; its features include facial tracking and recognition, and it is capable of interactive chat with children, with the company claiming it teaches STEM, coding and AI.

Hanson Robotics has offices in both Hong Kong and the US, as does another leading AI toymaker, ‘Roybi’, which also has a third office in Shenzhen in mainland China. The Roybi robot, also still under development, is designed to teach children up to seven languages using AI technology. It uses face and voice recognition to recognise and interact with child users, and claims to “detect how your child is feeling and offer them support and encouragement”; the company also claims that the robot can help children with speech delays. It is worth noting that Roybi states that it is not currently GDPR compliant and is not intended for use in the European market, which implies that Roybi interprets US law as more favourable to its commercial use of children’s data than European law.
Exploitation of children in foreign countries: the product development phase in the Global East & Global South
It could be that Hong Kong is commercially preferable for the product development stage because there are fewer restrictions on the collection and use of children’s data. In Hong Kong there is no need to obtain parental consent to collect children’s personal information at any age, and there is little or no accountability for breaching the general guidance on data privacy contained in the Personal Data (Privacy) Ordinance, Laws of Hong Kong (Cap 486) (PDPO). If a web-based service is operated solely in Hong Kong during the product development stage, rather than between the US and Hong Kong, it would appear to fall outside the definition of an ‘operator’ under COPPA. In my view, companies should not be permitted to collect data from children in countries with weaker privacy protections in order to perfect their algorithms during the product development stage, and then sell the resulting products on an American market where children are given a higher standard of privacy protection. This represents a kind of digital colonialism that it is incumbent upon American lawmakers to prevent.
Perhaps in part due to concerns about foreign companies exploiting Chinese children’s data, China recently enacted its own child online privacy legislation, the Measures on Online Protection of Children’s Personal Data, which came into effect on October 1st, 2019. China’s law appears to be more protective of children’s privacy rights than COPPA: it defines children as those under 14, as opposed to under 13, and is reportedly more in line with the substantial protections contained in the GDPR. The Measures require parental permission for the collection and use of a child’s personal information, and for any subsequent repurposing of the data collected, though they do not appear to make clear what level of verification is required to prove that consent has been given by the child’s parent or guardian. It remains to be seen to what extent this law will be enforced against companies in China, and the regulations do not have extraterritorial application.
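The differing thresholds discussed so far can be summarised in a small lookup table. This is an illustrative simplification: the `CHILD_PRIVACY_REGIMES` mapping and the `needs_parental_consent` helper are hypothetical names, and the entries reflect only the rules as characterised in this article (COPPA: under 13, parental consent required; China’s 2019 Measures: under 14, parental consent required, no extraterritorial reach; Hong Kong’s PDPO: no age-based parental-consent requirement).

```python
# Illustrative summary of the parental-consent rules discussed in this article.
# 'extraterritorial' records whether the regime reaches operators abroad.
CHILD_PRIVACY_REGIMES = {
    "US (COPPA)":            {"child_under": 13,   "parental_consent": True,  "extraterritorial": True},
    "China (2019 Measures)": {"child_under": 14,   "parental_consent": True,  "extraterritorial": False},
    "Hong Kong (PDPO)":      {"child_under": None, "parental_consent": False, "extraterritorial": False},
}

def needs_parental_consent(jurisdiction: str, age: int) -> bool:
    """Return True if collecting this child's data requires parental consent
    under the (simplified) rules of the given regime."""
    regime = CHILD_PRIVACY_REGIMES[jurisdiction]
    return (regime["parental_consent"]
            and regime["child_under"] is not None
            and age < regime["child_under"])

print(needs_parental_consent("US (COPPA)", 13))             # False: 13 is not under 13
print(needs_parental_consent("China (2019 Measures)", 13))  # True: under 14
print(needs_parental_consent("Hong Kong (PDPO)", 13))       # False: no age-based consent rule
```

The 13-year-old example makes the gap concrete: the same child is protected under China’s Measures but falls outside COPPA, and has no consent protection at all under the PDPO.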
Data mining from the Global South
Beyond the battle between the US and China for global dominance in AI technology, there are emerging markets in other regions which may become players in their own right, or possibly part of the supply chain for AI toys produced elsewhere. PricewaterhouseCoopers has identified emerging markets for AI in Africa, Latin America, and Asia. Kigali, for example, has been dubbed the “Silicon Valley of Africa”.
In the World Economic Forum’s white paper on “AI Governance: A Holistic Approach to Implement Ethics into AI”, there is discussion of the regulation of AI toys for children, with reference to the German regulatory authority’s decision to ban My Friend Cayla (discussed earlier in this paper). The WEF suggests that “…an ethical evaluation may be different from the perspective of developing countries. In many of them, being able to speed up and increase access to education is believed by most economists as the best way to close the gap between the developed and developing world.” One possible conclusion from this economic analysis is that developing countries may want to ‘relax’ their data protection laws for children in order to promote economic development, on the argument that as a country’s GDP increases, everyone’s quality of life increases, ultimately benefiting children too. A rights-based perspective leads to the exact opposite conclusion: children in lower-income countries are likely to be the most vulnerable to commercial exploitation by tech companies, and arguably need even higher standards of protection for both legal and ethical reasons.
Scholars working at the intersection of aid, international development and technology have observed a ‘progress narrative’ that justifies technology initiatives aimed at solving some of the big development problems without considering the necessary due diligence processes. This is coupled with the commercial interests of the companies behind tech with a social purpose, such as EdTech and AidTech, which leads to massive data harvesting from the Global South and East and reproduces and exacerbates patterns of structural inequality. People in low-income countries have been found to readily agree to give away their data in exchange for access to information, education, or even entertainment.
I recommend that the FTC provide and enforce guidance for American companies producing technology aimed at children in low- and middle-income countries, to ensure that children in countries with weaker data protection laws are not exploited to further American commercial interests. This guidance should have regard to the particular harms children in low-income countries may be exposed to, which go beyond commercial exploitation and include the risk that their data may be accessed by armed actors or authoritarian governments, especially where the children come from marginalised or persecuted minorities.
I further recommend that jurisdictions such as Hong Kong, and many other countries in the Global East and South, implement strong privacy protection laws for children, so that they do not fall prey to what has been termed a kind of ‘digital colonialism’ and become the testing ground for the product development stage of products designed for use in the Western world.
 Regulation (EU) 2016/679 (General Data Protection Regulation)
 Id. Article 5
 Verbal comments by Michael Kratsios, Chief Technology Officer of the United States, “Conversation with Michael Kratsios and Eileen Donohue”, Stanford HAI Fall Conference 2019, ‘AI Ethics, Policy & Governance’, October 29, 2019.
 See for example the marketing video of Hanson Robotics. Hanson Robotics Limited, Perspectives About Little Sophia — Meet the Hanson Robotics Team, YouTube, (Dec. 9, 2019), https://youtu.be/AlUfhdnuHgg
 Hong Kong Lawyer, EU GDPR and HK PDPO: What’s the Difference?, The Official Journal of the Law Society of Hong Kong, Industry Insights, (Jun. 2018), http://www.hk-lawyer.org/content/eu-gdpr-and-hk-pdpo-what%E2%80%99s-difference
 Takudzwa Hillary Chiwanza, The Construction of Africa’s Own Silicon Valley Has Started in Kigali, Rwanda, African Exponent, (Nov. 15, 2018), https://www.africanexponent.com/post/9392-kigali-innovation-city-is-set-to-be-africas-own-silicon-valley
 World Economic Forum, White Paper: AI Governance. A Holistic Approach to Implement Ethics into AI, World Economic Forum, (2019), https://weforum.my.salesforce.com/sfc/p/#b0000000GycE/a/0X000000cPl1/i.8ZWL2HIR_kAnvckyqVA.nVVgrWIS4LCM1ueGy.gBc
 Id. at p. 12
 Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, PublicAffairs, New York (2019), p. 172
 Michael Kwet, Digital colonialism is threatening the Global South, Al Jazeera Opinion, Science & Technology, (Mar. 13, 2019), https://www.aljazeera.com/indepth/opinion/digital-colonialism-threatening-global-south-190129140828809.html