Background
It appears that the Ministry of Electronics and Information Technology (“MeitY”) will soon circulate a draft bill for the ‘Digital India Act’ (the “Proposed DI Act”) for public/industry review, with a parliamentary introduction targeted for next month – when a revised draft of the Digital Personal Data Protection Bill, 2022 (“DPDP”) may be tabled as well.
According to media reports, the Proposed DI Act will aim to facilitate an open internet, strengthen online safety, revise the intermediary framework, and narrow the scope of safe harbor. It may also introduce updated norms for the sharing of non-personal data – including in respect of new technologies such as the Internet-of-Things (“IoT”) and data captured by invasive gadgets like spy camera glasses. Accordingly, new rules to regulate such devices/technologies may find a place in the Proposed DI Act, including via provisions related to ‘Know-Your-Customer’ (KYC) requirements.
IoT
IoT refers to interconnected devices and objects that collect and transmit data via the internet. While it remains an evolving technology, it has already prompted intense policy debates around data security and privacy.
Children’s platforms, toys, gadgets, and applications that are connected to the internet constitute a subset of IoT devices/technologies which merits special attention. For instance, high-profile hacks – such as the ‘Hello Barbie’ incident in 2015, where it was possible to remotely access the doll’s microphone and record children’s conversations – have led to regulatory concern. Further, many IoT devices collect extremely sensitive data, including health information. Moreover, with the rise of the education technology industry and the increased datafication of students, potential breaches of academic and/or school records may compromise children’s safety, ranging from immediate threats of cyberbullying to long-term risks of identity theft.
Companies which develop products and services for children, including portable devices such as electronic learning products, may be required to follow specific steps to ensure that children’s information remains protected. Such measures could include disclosing to parents how the underlying data is used. For instance, a mobile application that allows children to communicate with others may be required to develop a comprehensive information security program, along with implementing adequate safeguards and privacy measures. Even when such companies store passwords and personal information in an encrypted format, hackers may nonetheless gain access to a database that contains the decryption keys, allowing them to recover the underlying details.
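The design flaw described above – reversible encryption with the decryption keys reachable from the same breach – is commonly mitigated by storing passwords as salted one-way hashes instead, so there is no decryption key to steal. The following is a minimal Python sketch of that approach, using only the standard library; it is purely illustrative and not drawn from any law or guideline discussed here.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    # Derive a salted, one-way hash; nothing reversible is stored,
    # so a database breach exposes no decryption key
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Recompute the hash from the supplied password and compare in constant time
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, digest)
```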
Concerns Related to Children
In this regard, the Proposed DI Act may introduce separate provisions for protecting users against harmful content, online harm, and various cybercrimes – including by ‘age gating’ children from: (i) addictive technologies, and (ii) online/digital platforms that collect/process their data.
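Neither the Proposed DI Act nor the media reports specify how such age gating would operate; one common (if imperfect) building block is computing a user’s age from a declared date of birth and gating under-18 users accordingly. The Python sketch below is hypothetical – all names are illustrative, and only the 18-year threshold is taken from DPDP’s current definition of a child.

```python
from datetime import date

CHILD_AGE_LIMIT = 18  # a 'child' under Clause 2(3) of DPDP is anyone under 18

def completed_age(dob: date, today: date) -> int:
    # Completed years of age as of 'today'
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def requires_age_gate(dob: date, today: date | None = None) -> bool:
    # True if the user must be gated from addictive or data-collecting features
    today = today or date.today()
    return completed_age(dob, today) < CHILD_AGE_LIMIT
```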
With respect to the collection and processing of children’s personal information under Indian law, we need to refer to the current draft of DPDP – which deals with: (i) the processing of digital personal data within Indian territory (collected online or offline, as long as such personal data has been digitized), as well as (ii) digital personal data outside Indian territory when such processing relates to the ‘profiling’ of, or an activity offering goods or services to, a ‘data principal’ (see Clause 4).
Clause 4(2) of DPDP defines ‘profiling’ as any form of processing with respect to personal data that analyzes or predicts an aspect related to the behavior, attributes, or interests of a data principal. According to Clause 2(6) of DPDP, a ‘data principal’ is the individual with respect to whom certain personal data relates, similar to ‘data subjects’ under the EU’s General Data Protection Regulation (“GDPR”). However, where such individual is a child, the term ‘data principal’ under India’s draft law includes the parents or lawful guardian of the child.
Further, Clause 10 of India’s DPDP imposes additional obligations on ‘data fiduciaries’ with regard to the processing of children’s personal data, including in terms of parental consent, ‘harm’ (as defined in Clause 2(10) of DPDP), tracking or behavioral monitoring, as well as with respect to advertising that is targeted and/or directed at children (Clause 10(3) of DPDP). While a ‘data fiduciary’ is defined broadly under Clause 2(5) of DPDP as any person who alone or in conjunction with others determines the purpose and means of processing personal data, Clause 10(1) of DPDP specifically requires such data fiduciaries to obtain verifiable ‘parental consent’ (which includes the consent of a lawful guardian, if applicable) before processing the personal data of a child, and in a manner which may be later prescribed by bespoke regulation.
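Since Clause 10(1) leaves the manner of obtaining verifiable parental consent to later prescription, the following Python sketch shows only how a data fiduciary’s systems might encode such a gate in principle – every name below is hypothetical, and the check is not a prescribed compliance mechanism.

```python
from dataclasses import dataclass

@dataclass
class DataPrincipal:
    user_id: str
    is_child: bool                   # under 18, per Clause 2(3) of DPDP
    parental_consent_verified: bool  # verifiable consent, per Clause 10(1)

def may_process(principal: DataPrincipal) -> bool:
    # A child's personal data must not be processed without verified parental
    # (or lawful guardian) consent; lawful-basis checks for adults are out of
    # scope for this sketch
    if principal.is_child and not principal.parental_consent_verified:
        return False
    return True
```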
Furthermore, while Clause 2(10) of DPDP defines ‘harm’ to include distortion/theft of identity or harassment (among other things) and Clause 10(2) prohibits data fiduciaries from processing any personal data that is likely to cause harm to a child, additional and/or specific restrictions may be prescribed, enlarged upon, or clarified through subsequent regulation. At any rate, Clauses 11(1)(a) and (b) of DPDP permit the central government to notify any one or class of data fiduciaries as a ‘Significant Data Fiduciary’ (“SDF”) based on certain factors, including on the basis of the volume and sensitivity of personal data processed, as well as the risk of harm to a data principal. In general, SDFs have added obligations, including those prescribed under Clause 11(2) of DPDP.
India-Specific Concerns
It is important to note that Clause 2(3) of DPDP defines a ‘child’ to mean “an individual who has not completed eighteen years of age.” Accordingly, a significant number of young individuals may be covered under DPDP’s special requirements related to children. Given that the Proposed DI Act may become India’s default law for technology-related legislation, if it defines ‘children’ the same way as the current draft of DPDP does, there may be implications for a multitude of online/digital/social media platforms that are, and will likely continue to be, regularly accessed by under-18 persons – as well as for devices and internet-based applications that rely on new technologies such as IoT, artificial intelligence (“AI”), machine learning, wearable internet-based devices, and augmented and/or virtual reality platforms. In turn, several such platforms and technologies may seek to collect and/or process children’s personal data – even if such data is subsequently anonymized. These new technologies may pose particular issues with respect to teenagers in India, especially since they are likely to spend large amounts of time on such platforms.
When it comes to the online tracking of children, the motivation behind commercial advertising becomes especially salient. After all, children represent a critically important demographic to marketers and advertisers because: (i) they have considerable influence on their parents’ purchasing decisions, (ii) they themselves may wield significant purchasing power, and (iii) they will be the (adult) consumers of the future.
Accordingly, the world has already witnessed, and India will likely continue to witness, a major boom in market offerings within the ‘digital kids’ segment, fueled by children’s ongoing engagement with digital and online media, as well as their continued importance as a lucrative demographic. After all, children routinely engage with interactive content across multiple screens and platforms, access the internet nearly anywhere and anytime, and remain adept at managing multiple applications, including social media and gaming platforms. Besides, children are often first-movers in exploring, and experimenting with, new digital devices, services, and content. In response, the media and advertising industries have shifted their strategies to address these changing cross-platform engagement patterns. Thus, content designed for young people is now distributed across a growing array of mobile phones, tablets, gaming consoles, and other devices.
Who is a ‘Child’?
Determinations of an appropriate age with respect to defining a ‘child,’ including for the purpose of protecting them and their data, vary across jurisdictions. Unlike India’s DPDP, data protection laws in other countries may not apply the same way for older children. In the US, for instance, the Children’s Online Privacy Protection Act (“COPPA”) imposes certain requirements on operators of websites or online services directed at children, including social media platforms – but only in respect of those children who are under 13 years of age.
Similar to Clause 10(1) of DPDP, COPPA specifies that websites must obtain verifiable parental consent for the collection or use of children’s personally identifiable information (“PII”). Unlike DPDP, however, COPPA’s protections extend only to children under 13 years of age.
When first enacted, COPPA focused on the PII used to contact a child, such as her name, address, and phone number. In 2012, the Federal Trade Commission (“FTC”) amended the COPPA Rule to cover persistent identifiers, such as cookies and device fingerprints; internet protocol (IP) addresses and geolocation data; as well as a range of media files such as photos and audio/video recordings. Online services that target children are required to comply with COPPA irrespective of the device used to access the service. Online operators who fail to comply with COPPA may face civil penalties, and the FTC can also require a company to change its business practices as a remedy for violations.
Like the present avatar of India’s DPDP, COPPA does not explicitly define how parental consent ought to be obtained. However, over time, the FTC has established guidelines to help website operators achieve compliance, including in respect of: (i) making reasonable efforts to provide a privacy notice to parents about the collection of their children’s data; and (ii) requiring a parent to use a credit card or a similar financial and/or identifying document to authenticate herself as well as the child’s age and identity. Further, COPPA requires website and platform operators to permit parental review of their children’s data. In practice, this means that any relevant website is obliged to provide access to all user records, profiles, and login information if and when a parent requests it.
Importantly, part of the reason why COPPA protects only children under 13 is that the law was modeled on television advertising guidelines issued by the US Federal Communications Commission for commercials targeting children aged 12 and under. Notably, while the original legislative language covered children aged 16 or below, both the online industry and civil-liberty groups opposed that provision, arguing that a mandatory requirement for parental permission prior to processing would restrict teenagers’ access to online information, thus limiting their First Amendment rights related to free speech.
Many social media companies and other services have opted out of having to deal with COPPA altogether by declaring that children under 13 are not permitted on their sites. Nevertheless, many such children remain active users on social media platforms. Meanwhile, technology companies and social media platforms have opposed proposals to strengthen COPPA further, especially in respect of behavioral tracking. They have argued, for instance, that data collected for advertising purposes does not inherently identify an individual – but instead, such data may be used to enhance the online experience of consumers.
However, revised provisions under COPPA seek to protect children from new-age data collection and/or marketing technologies. The updated rules also restrict marketing messages tailored to an individual child, including behavioral advertising and retargeting techniques.
Nevertheless, COPPA still does not cover the growing range of marketing practices that media and advertising companies use to target young people. For instance, in the children’s IoT marketplace, a new generation of interactive digital toys connects to the internet through mobile apps and other devices. Many of these connected devices are designed to react to children’s behavior in real time, using software to adapt the device’s functionality to the child’s developmental stage.
Since older teenagers are not explicitly covered by current US data protection laws, such teenagers are positioned at the center of a rapidly expanding marketplace whose business model combines data collection, profiling, tracking, and the monitoring of social interactions. It is possible that such concerns led MeitY to adopt a high age threshold for defining children under DPDP – although, as discussed below, this too may produce problems, including in respect of viability, access, compliance burdens, and monitoring concerns.
GDPR
Unlike COPPA or DPDP, the GDPR sets 16 as the default age – pursuant to Article 8(1) – at which the EU no longer considers a person to be a child, such that they can express valid consent under Article 6(1)(a). Where the child is aged below 16, data processing will be lawful only if, and to the extent that, consent is provided or authorized by a holder of parental responsibility.
Children’s Consent: GDPR vs. DPDP
In the Indian context, under Clause 7(1) of DPDP, a data principal’s consent involves a voluntary, specific, informed, and unambiguous indication of intent, where she provides a clear affirmative action to signify agreement with data processing for a specified purpose. In light of how a ‘child’ and a ‘data principal’ are defined under Clauses 2(3) and 2(6) of DPDP, respectively, it appears that in the case of a person under 18 years of age, a valid consent under Clause 7 will always require the intervention of parents or a lawful guardian. On the other hand, Clause 8(1) of DPDP introduces the concept of ‘deemed consent’ – pursuant to which a data principal may be deemed to have consented to data processing under certain circumstances. It is unclear whether deemed consents can be invoked with respect to parental authorization requirements under Clause 10(1) of DPDP.
Meanwhile, the language of Clause 2(6) of DPDP appears to indicate that parental authorization is mandatory with respect to all consents related to data processing for persons under 18 years of age. However, unlike Clause 2(6), Recital 38 of GDPR suggests that parental consent should not be necessary in the context of “preventive or counselling services” which are offered directly to a child. Such considerations of autonomy and agency with respect to children are manifest further in Recital 58 and Article 12(1) of GDPR (dealing with transparency of communication) – which require information to be presented in clear and plain language for the purpose of ensuring that even a child is able to understand it easily, given that children require special protections.
Similar to the requirement of valid consent under Clause 7(1) of DPDP, consent under GDPR must also be unambiguous – i.e., it requires either a statement or a clear affirmative action. However, unlike deemed consent under Clause 8(1) of DPDP, consent under GDPR cannot be implied, and hence must always be given through an opt-in, a declaration, or another active step, such that there is no misunderstanding about whether the data subject has consented to processing. Nevertheless, no precise form is required for providing consent. In this regard, the consent of older children and adolescents under Article 8 of GDPR presents a special case. While those below the age of 16 require additional parental authorization, a service offering that is explicitly not addressed to children may be exempt from this rule. However, the exemption does not apply to offers addressed to children and adults alike (i.e., to persons both below and above the age of 16).
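To make the contrast concrete: under GDPR, a consent flow cannot infer agreement from silence, inactivity, or a pre-ticked box – the server should accept only an explicit affirmative action. The Python sketch below is hypothetical (GDPR prescribes no particular form, and all field names are invented for illustration).

```python
from datetime import datetime, timezone

def record_consent(form_data: dict) -> dict:
    # Only an actively ticked (never pre-ticked) checkbox counts as an
    # affirmative action; a missing or falsy value is treated as no consent
    if form_data.get("consent_checkbox") is not True:
        raise ValueError("no valid consent: an explicit affirmative action is required")
    return {
        "purpose": form_data["purpose"],  # the specified purpose consented to
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "method": "explicit_opt_in",
    }
```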
Further, GDPR indicates that a data subject should retain the right to have her personal data rectified, as well as have a right to be forgotten where retention infringes GDPR. While some such GDPR rights are mirrored in DPDP under Clauses 9(6) and 13, respectively – Recital 65 of GDPR expressly states that these rights are relevant in particular where the data subject has given her consent as a child, i.e., at a time when she was not fully aware of the risks involved. Accordingly, the data subject should be able to exercise such rights even if she is no longer a child.
Despite DPDP’s current silence about the implications of Clauses 9 and 13 upon children and their personal data (although Clause 10 deals with additional obligations in this regard), it is possible that such provisions will be interpreted in consonance with GDPR in the future. However, Clause 10(4) of DPDP states that such additional obligations will not be applicable for purposes which may be later prescribed by law or regulation – thus leaving the likely trajectory of such rights to future legislative and/or regulatory discretion.
In addition, Clause 9(6)(b) of DPDP allows data fiduciaries to hold on to personal information for business purposes (in addition to legal reasons); at most, such reference may be interpreted as being limited to legitimate business purposes – an interpretation that may nonetheless allow data fiduciaries substantial wiggle room to justify retention. GDPR appears to be stricter in this regard, particularly for children’s data. For instance, its Article 6(1)(f) stipulates that even when processing is necessary for the purpose of a legitimate interest, such interest may be overridden by the interests, fundamental rights, and freedoms of the data subject – especially in the case of a child.
Further in this regard, Recital 75 of GDPR (dealing with risks to the rights and freedoms of natural persons) specifically envisages a situation of heightened risk when the personal data of vulnerable individuals – and children in particular – is processed, or where processing involves a large amount of personal data and/or affects a large number of data subjects. Given that Clause 10(2) of DPDP already envisages additional rules based on the likelihood of harm caused to a child further to data processing, the outcome of governmental action under Clause 11(1) may also include considerations relating to children, whereby notified SDFs – such as major technology companies and data aggregators – may need to comply with added obligations, including those prescribed under Clause 11(2).
Nevertheless, Article 8 of GDPR – which stipulates additional requirements for children – applies only if: (i) the processing of data relies on consent as a legal basis; and (ii) an ‘information society service’ (“ISS”) is directly offered to a child. According to Article 4(25) of GDPR – which refers back to Article 1(1) of Directive (EU) 2015/1535 (the “ISS Directive”) – an ISS is any service normally provided for remuneration, at a distance, by electronic means, and at the individual request of a recipient of services. However, the 16-year age-limit under GDPR is not absolute. Under Article 8(1), the EU’s member states may adjust this requirement to any age between 13 and 16 years. Thus, individual European countries may provide for a lower threshold by national law, provided that such age is not below 13.
Key Takeaways
An important aspect of determining an appropriate age to define a child for the purpose of data protection law is feasibility. Beyond protecting the privacy of children and safeguarding them from harm, laws ought to be practically designed to ensure proper implementation by data fiduciaries. Research studies appear to indicate instances of non-compliance and/or undesirable outcomes not just under COPPA but under GDPR as well (at least in respect of children’s data). Against that backdrop, stipulating an upper age-limit as high as 18 while defining a child – as DPDP currently does – might lead to rampant disregard of the law, including by children and parents themselves, especially since parental consent may be required each time, for example, a 16- or 17-year-old wishes to access a website or digital service that seeks to process their personal information.
In addition, it may be particularly difficult to devise a fail-safe framework for all fiduciaries to verify the parental identity of an under-18 person on each occasion of use. Not only might this become a major compliance burden in the future; the likelihood of designing a rigorous framework that secures the objective of the law also appears slim at present – given the mixed track record of other data protection regimes (including in the US and the EU, even with reduced age thresholds), as well as children’s burgeoning use of the internet, digital/social media, and IoT devices. Over the next few years, with more regulatory experience and institutional learning, a multi-tiered system may prove more feasible – for instance, treating those aged 13 to 15, and those aged 16 to 18, as separate legal categories, as sketched below.
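For illustration, a tiered scheme of the kind suggested above might be encoded as follows. The cut-offs and tier names in this Python sketch are hypothetical – they are not drawn from DPDP or any pending bill.

```python
def age_tier(age: int) -> str:
    # Hypothetical regulatory tiers; DPDP currently recognizes only a
    # single under-18 category of 'child'
    if age < 13:
        return "young_child"   # strictest rules; parental consent always required
    if age < 16:
        return "older_child"   # parental consent with some self-service allowances
    if age < 18:
        return "adolescent"    # lighter-touch verification
    return "adult"
```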
Importantly, the prescribed penalty for non-fulfilment of additional obligations in respect of under-18 persons (and their data) under Clause 10 of DPDP is up to INR 200 crores (see Schedule 1). Further, the penalty for non-fulfilment of additional obligations with respect to SDFs under Clause 11 is up to INR 150 crores. Given the stakes involved, as well as the importance of digital inclusion and internet access for young people in particular (including for educational and other essential purposes), the corresponding data protection regime – while essential to secure privacy and prevent harm – might perform better, in terms of both safety and efficiency, with a more balanced approach.