A deep dive into children’s health data with a focus on biomedical research, development, and humanitarian contexts

(Originally published on Medium, Sep 3, 2021: https://medium.com/@emmadaylaw/a-deep-dive-into-childrens-health-data-with-a-focus-on-biomedical-research-development-and-2fb563b36e32)

It is now possible for a midwife working for an international NGO in Geneva to remotely monitor a baby in Kenya using an app, while the baby is still inside his mother’s womb. She can collect data about the baby’s heartbeat and any congenital defects and share those data with the government in Kenya, and with researchers from a leading university in Boston. Once the baby is born, his mother can fit him with a wearable device that will alert her if he stops breathing or rolls onto his belly in the night, and that will collect other data related to his heartbeat, temperature, and breathing patterns. As her child grows older, the mother can use her cell phone to access telehealth visits from her own home, saving her a day-long journey to the nearest clinic that she cannot afford to take.

Photo by Christian Bowen on Unsplash

The health tech used for all of these purposes is provided by different private companies to the mother’s government and to the NGO supporting her, and each of these actors has rights to collect and process the baby’s data. The health tech company may decide to store these data with third party cloud storage companies, and may wish to sell these data as aggregated data sets to interested pharmaceutical companies.

Children’s health data are collected across the life cycle, and for many different purposes. This piece will focus on two main areas:

1) Collection of health data from children for biomedical research

2) Children’s digital health in an international development context

The use of children’s health data in each of these fast-evolving sectors raises significant questions regarding technology’s capacity to deliver compelling health outcomes, compliance with data privacy laws and regulations, and the ethics of collecting health data from children.

When tackling public health issues at a national, regional, and even global level, there can be major benefits to sharing as much data as possible between sectors and across agencies. Often this involves data sharing between the public and private sector as well. One core regulatory challenge is achieving the right balance between data protection and quality without stifling innovation.[1] In some ways there is an ethical imperative to explore the use of technology to combat some of the world’s most pressing health needs for children, and at the same time there is a responsibility to address ethical risks and data ownership implications posed by mass deployment of any new technology[2].

Before focusing on each of these sectors, it is important to clarify what exactly health data means and why it is treated as especially sensitive. Equally important is to understand the broad global, regional, and national regulatory frameworks as they apply to children’s health data.

What is health data and why is it treated as especially sensitive data?

For the purposes of this discussion, children’s health data includes medical data generated in a professional medical context; data generated in a non-medical context that can nonetheless be used to draw inferences about the actual health of the child; and seemingly unrelated data that can indicate the likelihood of the child’s risks for health problems. This could include notes from visits to a doctor or hospital, and data about the health of the child’s family members, as well as data about the child’s socio-economic group, ethnicity, home address, activity level, and diet. Health data is classified under the GDPR[3] as special category or sensitive data[4], and in the US some kinds of health data come under their own special legislation due to the recognised sensitivity of this kind of information[5].

Health data are sensitive because when a person’s mental or physical health status becomes public knowledge this can cause stigma and discrimination. This applies particularly in the context of HIV, STDs, and mental health conditions. Health data are also protected because, if discovered by insurance companies, childcare centres, or schools, knowledge of the child’s mental or physical health status can lead to discrimination, such as higher insurance premiums or assumptions by teachers about the child’s educational capacity, and once exposed these data can follow children into adulthood. Health data are also sensitive because they generally provide the most comprehensive data set available on an individual, and this kind of information is very attractive to hackers who wish to steal a child’s identity and use it to fraudulently allow others to access services in the child’s name[6].

How is health data regulated at an international, regional, and national level?

Children have rights to health and to privacy under international and often national laws. Under Article 24 of the UN Convention on the Rights of the Child (CRC), children have the right to the highest attainable standard of health, and in implementing this right States have a duty to “combat disease and malnutrition, including within the framework of primary health care, through, inter alia, the application of readily available technology…” (Article 24(2)(c)). At the same time, States have a duty to protect children’s privacy under the law (Article 16). Children also enjoy all the same rights that adults do, including the right to health under Article 12 of the International Covenant on Economic, Social and Cultural Rights (ICESCR), and rights to privacy under the International Covenant on Civil and Political Rights (ICCPR, Article 17).

From a human rights perspective, both the government and the private sector are duty bearers, and children are rights holders whose rights must be protected and promoted, including in the context of their health, data protection, and privacy. Efforts to address public health concerns that include data collection from individuals including children should only be carried out for a legitimate purpose, and any collection, use, or sharing of data from children should be necessary and proportionate to that purpose (Article 4, ICCPR)[7].

It is important to recognise that, as well as States, companies are also duty bearers that should comply with international human rights law including the CRC and the ICCPR, and with the UN Guiding Principles on Business and Human Rights[8] and the Children’s Rights and Business Principles[9]. Although the United States remains a major market player for children’s health tech, biomedical research in relation to children, and wearables, the four countries investing the most in health tech are currently the UK, Canada, China, and India[10]. At a national level, data protection laws for children governing the private sector generally have a very low cut-off age: 13 in the United States under the Children’s Online Privacy Protection Rule (COPPA), and up to 16 in California under the California Consumer Privacy Act (CCPA); between 13 and 16 in Europe under the GDPR; and 14 in China[11]. Canada does not have any special data protection laws for children, who are treated the same as adults under the Personal Information Protection and Electronic Documents Act (PIPEDA)[12], whereas in India the proposed Personal Data Protection Bill would protect children up to the age of 18[13]. Currently, adolescents aged over 13 in most countries are entitled to very little special protection of their personal data, even when it is highly sensitive health data. This is particularly concerning given that adolescents aged over 13 are more likely to seek out the kinds of health tech tools discussed below, such as menstrual cycle tracking apps, mental health apps, and wearable gamified health trackers.
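To make this patchwork concrete, the sketch below shows how an app might encode the differing consent-age thresholds described above. It is a minimal illustration only, not legal advice: the mapping, names, and simplifications (for example, treating the CCPA’s opt-in rule as a single threshold) are illustrative assumptions.

```python
# Illustrative sketch only: consent-age thresholds as described in this piece.
# Real compliance logic is far more nuanced (e.g. the CCPA threshold concerns
# opt-in before the sale of minors' data, and GDPR member states may set any
# age between 13 and 16).
PARENTAL_CONSENT_AGE = {
    "US": 13,     # COPPA
    "US-CA": 16,  # CCPA: opt-in required for minors under 16
    "EU": 16,     # GDPR ceiling; member states may lower this to 13
    "CN": 14,     # rules on online protection of children's personal information
    "IN": 18,     # proposed Personal Data Protection Bill
    "CA": None,   # Canada: PIPEDA has no child-specific threshold
}

def parental_consent_required(jurisdiction: str, age: int) -> bool:
    """Return True if verifiable parental consent would be needed."""
    threshold = PARENTAL_CONSENT_AGE.get(jurisdiction)
    return threshold is not None and age < threshold

print(parental_consent_required("EU", 14))  # True under the default ceiling of 16
print(parental_consent_required("US", 14))  # False: COPPA stops at 13
```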

In the United States, health data does have special status under the Health Insurance Portability and Accountability Act (HIPAA), but HIPAA only applies to ‘covered entities’, which are healthcare providers and health insurance companies, and ‘business associates’, which are entities or vendors that interact with covered entities. Wearables[14] fall outside the scope of HIPAA, even though they may collect sensitive health data, because the data is consumer generated[15]. Health tech such as wearables could come under the Federal Trade Commission Act (FTC Act), which governs unfair or misleading trade practices and could be used to ensure that companies comply with their own privacy policies, but not to dictate what privacy protections go into those policies. COPPA in the United States still applies to the health data of under-13-year-olds, and requires websites and mobile apps to obtain verifiable consent from parents before collecting children’s data. The company requesting the child’s data may only retain and use that data for the specific purposes agreed to by the parents when it was obtained. In California, both the CCPA and the Security of Connected Devices law came into force in 2020, broadening the scope of the kinds of data that are classified as ‘personal information’ and therefore subject to heightened privacy protections, and requiring ‘reasonable security’ to prevent wearables from being hacked[16]. The CCPA also provides greater data protection to children aged 13–16, requiring them to opt in to data collection rather than opt out.

In Europe, the GDPR applies to all kinds of data including health data, and categorises all kinds of health information as sensitive, no matter whether it is collected offline by a medical professional or online by a platform, app, or wearable device. Both HIPAA and the GDPR allow health data to be processed for medical treatment or health system management without prior consent[17]. The application of disparate data protection laws to children’s health tech apps and wearables becomes even more complex when we consider that children may download and use an app outside of the jurisdiction that governs the app’s privacy policy and terms of use. Further, some health tech is the result of collaborations between developers in different jurisdictions such as the US and China, which can lead to the children’s data collected being governed by laws outside the country where the health tech is being used, which may not be immediately obvious to a consumer who does not seek out the data privacy policy[18].

To complicate matters further, health tech, like other areas of technology, is developing at a rapid pace, which means it is not just a matter of filling regulatory gaps in children’s data privacy protections: there is also a need to anticipate the future technology landscape and regulate forwards. The European Commission’s ‘Strategy for Data’[19] notes that although currently around 80% of data is processed and analysed remotely in data centres and computing facilities, and the remaining 20% is processed in situ in smart connected objects such as toys, cars, and home appliances, it is predicted that by 2025 these proportions will be inverted. Further, while today the majority of the world’s data is processed by a small number of big tech firms, the landscape is changing rapidly and it is likely that in the future a large percentage of data will be generated by industrial and professional applications, and internet of things[20] applications, including health tech applications, which may bring new and more diverse actors to the table[21]. Data analysed and stored in situ will normally be subject to national law and regulation, which could bring a shift away from the domination of American-regulated technology companies to a more diverse and localised regulatory landscape. This may be good news for children’s data privacy rights, as governments will have more potential to introduce national laws that are rights-based and protect the privacy of their nation’s children. This also means that there will be an even greater need for international standards in relation to privacy regulation for children, to ensure consistency so that health data can be collected and shared purposefully and responsibly across borders.

Collection and use of children’s data for biomedical research

Technology plays a key role in enabling the collection and processing of data that is necessary for advancements in biomedical research related to children. There is no doubt that sharing data accelerates biomedical research. It reduces the cost of science and the infrastructure and resources that are needed to conduct replicable and reliable science. However, biomedical research has traditionally been heavily regulated due to the sensitivity of the data collected and the privacy concerns involved, because once lost, privacy can never be regained.

Whereas health data used to be collected primarily through clinical research regulated by ethics review boards, increasingly a combination of clinical data and consumer data is being used for research purposes. For example, Apple launched ResearchKit in 2015, an open-source framework for building apps for use in medical research[22], and a similar framework for Android, ResearchStack, was launched in 2016[23]. Apple notes that one of the biggest challenges medical researchers face is recruiting participants for their studies, but it is now easier to recruit large numbers of participants from around the world through iPhone users, and to gather data from them very frequently throughout the day as they go about their lives, which is likely to lead to advancements in research. An app has been developed using Apple’s platform to see whether it is possible to screen for autism in young children by recording the child’s reactions to videos played via the phone, and using algorithms to read the child’s emotions and reactions[24]. The premise is that it is not possible to train enough people to meet the need for early diagnosis of autism in children, and this tech solution could contribute to at least the first stages of diagnosis, followed by referral to a doctor[25]. The app was developed by Duke University, and with consent from the parents the data is uploaded to Duke’s secure server for analysis by their team. The app includes solid and express privacy protections, and parents can also opt not to have their child’s actual video stored on the server, but only the computer-generated facial landmarks.

However, in general, research carried out through apps poses greater risks to privacy than traditional clinical research methodologies. Researchers have the potential to collect far more kinds of data, including GPS data and genomic data, as well as a much higher volume of data, which means it may become impossible to deidentify the child’s mobile phone data in the way data would ordinarily be deidentified for research purposes to protect the participant’s privacy[26]. A study of apps produced using Apple’s ResearchKit found that many companies used third-party cloud storage providers to store their data because this is much cheaper, which exposes the data to security risks. The study found that many of the apps ask their users to consent to reuse of their data in aggregate for future independent research, and eight out of the 26 apps reviewed did not guarantee their users that they would not sell their data for commercial purposes, even though Apple requires such a guarantee. The study also found that although an adult over the age of 18 must consent to the data collection from a child, the wording of the terms being consented to was not changed for apps involving data collection from children, meaning that the explanation of data usage was aimed at the parent, without regard to explaining the same to the child[27].
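One common, partial mitigation for the re-identification risks described above is to pseudonymise or drop direct identifiers before app data leaves the device or research environment. The sketch below is a minimal illustration of that idea; the field names and the choice of what to hash or drop are hypothetical, and salted hashing alone does not make rich behavioural data anonymous.

```python
# Minimal sketch: pseudonymising a child's app-generated record before it is
# shared for research. Field names are hypothetical; salted hashing of direct
# identifiers reduces, but does not eliminate, re-identification risk when
# rich data such as GPS traces are also collected.
import hashlib
import os

SALT = os.urandom(16)  # kept by the device or data steward, never shared

def pseudonymise(record: dict) -> dict:
    out = dict(record)
    for field in ("child_name", "device_id", "parent_email"):
        if field in out:
            digest = hashlib.sha256(SALT + str(out[field]).encode()).hexdigest()
            out[field] = digest[:16]
    out.pop("gps_trace", None)  # drop quasi-identifiers rather than hash them
    return out

raw = {"child_name": "A. N. Other", "device_id": "abc-123",
       "heart_rate": 118, "gps_trace": [(51.5, -0.12)]}
print(pseudonymise(raw))
```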

As well as concerns about over-collection and sharing of data from children for research purposes, there are also concerns that certain groups of children may be excluded from health research datasets entirely. Not all children have access to smartphones to participate in app-based studies, and where children from certain ethnic groups, geographic areas, or income groups are not represented in health datasets used to advance medical science, this may result in worse health outcomes for those children.

Children’s digital health in an international development context

Technology and data are increasingly being used by international development organisations and governments to tackle global health concerns. These initiatives, too, invariably involve private sector actors. Digital health interventions for children in a development context have been used to scale up and improve the quality of care and information, to promote access to affordable and effective medicines through online pharmacies[28], and to immunise children at scale.[29] Digital health is presented by the Broadband Commission as an instrumental tool in advancing towards universal healthcare.[30] UNICEF partners with the Health Data Collaborative to harmonize health-related data collection across donors and partners[31]. Several UN agencies and global institutions including UNICEF, WHO, Gavi, CDC, and WBG have come together to strengthen data collection across the continuum of care, allowing for linkages between reproductive, maternal, newborn, child and adolescent health systems and civil registration and vital statistics (CRVS) systems[32]. One of the building blocks that allows for this kind of data sharing is ‘interoperability’ between different information systems, devices, and applications, meaning that they can access, exchange, integrate, and share data with each other.[33] Interoperability and legal frameworks are two of the six building blocks for sustainable digital health set out by the Broadband Commission.[34]
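In practice, interoperability depends on systems agreeing on a common, structured representation of records. The sketch below loosely follows the widely used HL7 FHIR resource model (Patient and Immunization resources exchanged as JSON) to show the kind of payload that lets an immunisation registry, a clinic system, and a CRVS system talk to each other; the identifiers and values are hypothetical.

```python
# Minimal sketch of a standards-based payload that makes interoperability
# possible. Loosely modelled on HL7 FHIR resources; values are illustrative.
import json

patient = {
    "resourceType": "Patient",
    "id": "child-0001",
    "birthDate": "2024-06-01",
}

immunization = {
    "resourceType": "Immunization",
    "status": "completed",
    "vaccineCode": {"text": "Measles vaccine"},
    "patient": {"reference": "Patient/child-0001"},
    "occurrenceDateTime": "2025-02-15",
}

# Any system that understands the same resource model can parse this payload,
# which is what lets immunisation, clinic, and CRVS systems exchange records.
payload = json.dumps({"entry": [patient, immunization]}, indent=2)
print(payload)
```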

Although interoperability allows for smooth data flows between different actors, it can also make it difficult to track the flow of sensitive health data between development agencies, governments, and private companies. A growing body of critics has started to highlight what has been termed ‘dataveillance’ as a central practice of the aid industry[35]. When data is collected from beneficiaries of development aid, the relationship between the beneficiary child, the development organisation, the donor, and any additional actors such as private sector innovation companies is inherently unequal and leaves little room for choice or understanding in terms of consent by the child. For example, ‘Khushi Baby’ is a wearable digital necklace that tracks infants’ immunisation records in India, as part of what has been described as an ‘accelerating digitisation of beneficiary bodies’[36]. Khushi Baby’s success has been quantified in terms of the number of vaccinations the wearable has tracked and the number of villages and mothers monitored. The data is collected by and shared between caregivers, insurance companies, pharmacies, data aggregators, and analytics companies. The objectives of Khushi Baby are also noted to have changed over time, involving a significant expansion of purpose from an initial aim to track maternal and child health records with a focus on women frontline health workers, to an expanded scope of also tracking ‘chronic disease, TB and HIV medication adherence, conditional cash transfers, ration cards, emergency medical response and hospital readmissions’[37]. This level of data collection from children in the context of the aid sector is troubling, particularly where data from people in the Global South is collected, shared, and processed by powerful actors, including private sector companies from the Global North. The wearable ultimately went on to collect biometric information from babies, mothers, and health workers, and as it developed, new private sector actors were brought in across different countries, with whom data would be shared, used, and sold[38].

Recommendations

Governance

· There is a need for international standards in relation to health privacy regulation for children, recognising the especially sensitive nature of health data coupled with the particular vulnerability of children. International standards will help to ensure consistency across the world, so that children’s health data can be collected and shared purposefully and responsibly across borders.

· As the world moves towards a greater IoT market and more data is processed in situ, there is a need to regulate forwards in a way that anticipates rapid technology changes and shifts in data collection, sharing, and storage across jurisdictions.

· States should formulate and regularly update national digital health strategies, which should include specific reference to children’s rights. National digital health strategies should balance the need to promote the collection, sharing, and use of data for biomedical research and innovative advancements in children’s health, within a strong human rights-based framework that protects children’s data privacy.[39]

· States should ensure that data privacy regulations apply to the entire health sector including the public sector, academic institutions, pharmaceutical companies, and wearables and app developers. Such regulations should also control data collection, use, sharing, and storage by foreign companies within the State, as well as ownership of children’s data following mergers and acquisitions.

· In recognition of the sensitivity of children’s health data, children should have the right to erase their health data from any public or private health provider retaining their data before they reach 18.

· Institutions should follow the Responsible Data for Children (RD4C) principles, which include requirements that: the need for the data and its intended benefit to children’s lives should be specified; the effects on children should be prioritised over potential efficiency gains or other process-oriented objectives; responsible data practices must be rights-based and recognise the distinct needs and requirements of children; any data collected from children should be the least amount necessary and proportionate to achieve the intended purpose; clearly identified data stewards should be designated within any entity collecting data for children, to promote accountability; and data should be handled responsibly across the lifecycle, including collection, storage, preparation, sharing, analysis, and usage[40].

Consent

· Consent should be given by children or their parents at a granular level related to specific kinds of data for specific purposes, rather than a broad consent to share all health records with any subsequent medical provider.

· Health information can be categorised by data type (blood tests, diagnosis, procedures, etc.); provider (paediatrician, sexual and reproductive healthcare provider, mental health care provider, substance abuse counsellor, etc.); time range (and there may be arguments not to have a child’s historic records follow them into adulthood automatically); and purpose (health insurance, care delivery, diagnosis, clinical research, or technology development)[41]. A sketch of what consent recorded along these dimensions might look like follows below.
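The sketch below is a minimal illustration of granular consent along these dimensions, expressed as a data structure; the class and field names are hypothetical and the example is illustrative only.

```python
# Minimal sketch: recording consent at a granular level along the dimensions
# listed above (data type, provider, purpose, time range), rather than as a
# single blanket permission. Names and values are illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class ConsentGrant:
    data_type: str      # e.g. "blood_tests", "diagnosis"
    provider: str       # e.g. "paediatrician", "mental_health_provider"
    purpose: str        # e.g. "care_delivery", "clinical_research"
    valid_from: date
    valid_until: date   # consent expires; it does not follow the child indefinitely

def is_permitted(grants: list[ConsentGrant], data_type: str,
                 provider: str, purpose: str, on: date) -> bool:
    return any(
        g.data_type == data_type and g.provider == provider
        and g.purpose == purpose and g.valid_from <= on <= g.valid_until
        for g in grants
    )

grants = [ConsentGrant("diagnosis", "paediatrician", "care_delivery",
                       date(2025, 1, 1), date(2026, 1, 1))]
print(is_permitted(grants, "diagnosis", "paediatrician", "care_delivery", date(2025, 6, 1)))      # True
print(is_permitted(grants, "diagnosis", "paediatrician", "clinical_research", date(2025, 6, 1)))  # False
```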

Biomedical research

· Biomedical research studies should collect representative and inclusive health data from a broad range of children, and should supplement data acquired through the use of technology with traditional in-person data collection methods where access to devices is a barrier to participation in research.

· Researchers should employ data minimisation principles, ensuring that unnecessary personally identifying data is not collected or stored in relation to children, and specifying time limitations for data retention[42]. A sketch of what this can look like in practice follows below.
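As a minimal illustration of data minimisation and retention limits in a research pipeline, the sketch below keeps only an explicit allow-list of fields and discards records older than a retention period. The field names, allow-list, and one-year retention period are hypothetical.

```python
# Minimal sketch: data minimisation and retention limits applied to study
# records. Only fields on an explicit allow-list are kept, and records older
# than the retention period are dropped. All names are illustrative.
from datetime import datetime, timedelta, timezone

ALLOWED_FIELDS = {"participant_pseudonym", "age_band", "heart_rate", "collected_at"}
RETENTION = timedelta(days=365)

def minimise(record: dict) -> dict:
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    return [r for r in records if now - r["collected_at"] <= RETENTION]

now = datetime.now(timezone.utc)
records = [
    {"participant_pseudonym": "p-01", "age_band": "5-9", "heart_rate": 96,
     "home_address": "not needed for the study", "collected_at": now - timedelta(days=30)},
    {"participant_pseudonym": "p-02", "age_band": "5-9", "heart_rate": 101,
     "collected_at": now - timedelta(days=500)},
]
kept = purge_expired([minimise(r) for r in records], now)
print(kept)  # only the recent record remains, without the home address
```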

Privacy and child rights by design

· Companies collecting health data from children should carry out a child rights impact assessment prior to embarking on data collection or processing, to ascertain the impacts on the full range of children’s rights, not just limited to their rights to data protection[43].

· Companies producing health tech for children should be transparent in relation to the ways in which they use data, including in relation to algorithms that may be used to make health predictions or diagnoses.

· Accreditation schemes for health apps should not rely solely on disclosures from developers, and should require publishers to rectify security vulnerabilities that may place users’ data at risk before apps are released[44].

· Data should be encrypted both at device level and in transit, and should be stored securely (a minimal sketch follows this list).

· Health tech products should be co-designed with the end users — children and adolescents — wherever possible. Such products should also be designed for privacy and security, and for context, audience, and specific user behaviours.
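As a minimal illustration of the encryption recommendation above, the sketch below encrypts a health reading at device level before it is stored or transmitted, using the third-party Python ‘cryptography’ package (Fernet authenticated encryption). It assumes transport additionally takes place over TLS, key management is deliberately simplified, and all names and values are hypothetical.

```python
# Minimal sketch: encrypting a health reading on the device before it is
# stored or sent, using the "cryptography" package (Fernet, AES-based
# authenticated encryption). Transport should additionally use TLS (HTTPS).
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, held in the device keystore
fernet = Fernet(key)

reading = {"child_id": "pseudonym-01", "temperature_c": 37.2,
           "timestamp": "2025-02-15T08:30:00Z"}

token = fernet.encrypt(json.dumps(reading).encode())   # encrypted at rest
# ... token is written to local storage and later sent over an HTTPS connection ...

restored = json.loads(fernet.decrypt(token))            # decrypted by an authorised service
print(restored["temperature_c"])
```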

[1] Broadband Commission, The Promise of Digital Health: Addressing Non-communicable Diseases to Accelerate Universal Health Coverage in LMICs, Broadband Commission for Sustainable Development, September 2018

[2] Letter to the Secretary of State of the UK from Professor Sir Jonathan Montgomery, Chair — Ethics Advisory Board (CV19 App), 24 April 2020

[3] Article 4(15) GDPR provides that “‘data concerning health’ means personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about his or her health status.”

[4] Article 9 GDPR

[5] In the US, the HIPAA Privacy Rule is a federal regulation which gives individuals rights over their health information and sets rules and limits on who can look at and receive their health information. <https://www.hhs.gov/hipaa/for-individuals/guidance-materials-for-consumers/index.html>

[6] Kent, C., Why are healthcare cyberattacks surging amid Covid-19, Verdict Medical Devices. 3 June 2020.

[7] Bingham Centre for the Rule of Law, British Institute of International & Comparative Law (BIICL), Advocates for International Development, The Rule of Law in Times of Health Crises, BIICL, 2020.

[8] OHCHR, Guiding Principles on Business and Human Rights: Implementing the United Nations “Protect, Respect and Remedy” Framework, United Nations New York and Geneva, 2011

[9] UNICEF, Children’s Rights & Business Principles.

[10] Kumar, V., Top 4 Countries Accelerating the Health Technology Development, Industry Wired, October 10, 2019.

[11] Zhang, L., China: Regulation on Online Protection of Children’s Personal Information Issued, Library of Congress Global Legal Monitor, October 28, 2019.

[12] Office of the Privacy Commissioner of Canada: Privacy Rights of Children and Teens

[13] Guest Author, Personal Data Protection Bill, 2019: Protecting children’s data online, Medianama, January 16, 2020.

[14] Wearable technology consists of things that can be worn, such as clothing or glasses, that contain computer technology or can connect to the internet: Cambridge Dictionary.

[15] Tomimbang, G., Wearables: Where do they fall within the regulatory landscape?, IAPP The Privacy Advisor, January 22, 2018.

[16] Healthverity, CCPA: Connected Devices and new privacy regulations, 2020.

[17] Bari, L. & O’Neill, D., Rethinking Patient Data Privacy In The Era Of Digital Health, HealthAffairs, December 12, 2019

[18] See, for example, the iThermometer, made by Beijing company Raiing Medical in collaboration with Boston Children’s Hospital: a wearable thermometer that syncs with a smartphone app. The data is stored in China and Chinese law applies to the data.

[19] European Commission, A European Strategy for Data, Brussels, February 19, 2020. COM(2020) 66 final.

[20] Objects with computing devices in them that are able to connect to each other and exchange data using the internet: Cambridge Dictionary.

[21] Id.

[23] Research Stack website: http://researchstack.org/

[24] Duke Today, Duke Launches Autism Research App, Duke University, October 14, 2015.

[25] Id.

[26] Moore, S., Tassé, A. M., Thorogood, A., Winship, I., Zawati, M., & Doerr, M. (2017). Consent Processes for Mobile App Mediated Research: Systematic Review. JMIR mHealth and uHealth, 5(8), e126. https://doi.org/10.2196/mhealth.7014

[27] Id.

[28] Comments from Aria Ilyad Ahmad, Jillian Clare Kohler, Bertrand de la Chapelle, and Oki Olufuye, Promoting human rights and access to safe medicines during pandemics: the critical role of internet pharmacies, RightsCon 2020, July 29, 2020

[29] UNICEF, UNICEF’s Approach to Digital Health, undated.

[30] Broadband Commission, The Promise of Digital Health: Addressing Non-communicable Diseases to Accelerate Universal Health Coverage in LMICs, Broadband Commission for Sustainable Development, September 2018

[31] Health Data Collaborative, Health Data Collaborative Progress Report 2016–18, < https://www.healthdatacollaborative.org/fileadmin/uploads/hdc/Documents/HDC_Progress_Report_Update_AW_071218_Web.pdf>

[32] Id.

[33] HIMSS, What is Interoperability in Healthcare?.

[34] Broadband Commission, The Promise of Digital Health Addressing Non-communicable Diseases to Accelerate Universal Health Coverage in LMICs.

[35] Kristin Bergtora Sandvik (2020) Wearables for something good: aid, dataveillance and the production of children’s digital bodies, Information, Communication & Society, DOI: 10.1080/1369118X.2020.1753797

[36] Id.

[37] Id.

[38] Id.

[39] Cory, N. & Stevens, P., Building a Global Framework for Digital Health Services in the Era of Covid-19, Information Technology & Innovation Foundation. May 26, 2020.

[40] UNICEF & GovLab, Responsible Data For Children Synthesis Report, UNICEF. November 2019.

[41] Carey, C., Are Patients About To Lose Control Over Their Medical Information?, ACLU. June 6, 2012.

[42] See further Sage Bionetworks, Elements of Informed Consent Toolkit, and Privacy Toolkit for Mobile Health Research Studies.

[43] Digital Futures Commission & 5Rights Foundation, Child Rights Impact Assessment: A tool to realise children’s rights in the digital environment. March 2021.

[44] Huckvale, K., Prieto, J. T., Tilney, M., Benghozi, P. J., & Car, J. (2015). Unaddressed privacy risks in accredited health and wellness apps: a cross-sectional systematic assessment. BMC medicine, 13, 214.
