Clarify and, where necessary, extend existing frameworks to protect children’s data
Robustly apply the Age Appropriate Design Code
The ICO’s Age Appropriate Design Code (AADC) is mandated by the Data Protection Act 2018 and is recognized as the benchmark for children's data protection. It sets a global precedent by clearly defining data protection obligations for Information Society Services (ISS) concerning children's data.
To ensure that children benefit from a comprehensive and rights-respecting approach to the processing of their education data, a range of measures is essential, grounded in jurisprudence focused on children's 'best interests' and their evolving capacities. Key actions include child consultation, child rights due diligence, conducting child rights impact assessments, and providing child-friendly materials.
The AADC has led to notable improvements by some of the biggest companies in the world, and is being mirrored in other jurisdictions.
Many EdTech products and services that align with the ISS criteria do not comply with the AADC. This non-compliance arises because schools act as intermediaries, utilizing EdTech solutions to fulfill their educational responsibilities. Consequently, EdTech providers can avoid their obligations under the AADC, which diminishes the effectiveness of the AADC and ultimately fails to protect children, who are the primary data subjects in this context.
Box 2: Enforcement of the Age Appropriate Design Code
TikTok could face a £27 million fine following an ICO investigation that revealed potential breaches of UK data protection law, particularly concerning the privacy of children on its platform. The ICO has issued a 'notice of intent' to TikTok Inc and TikTok Information Technologies UK Limited, indicating its preliminary findings that, between May 2018 and July 2020, the company may have:
• processed the data of children under the age of 13 without appropriate parental consent;
• failed to provide proper information to its users in a concise, transparent and easily understood way; and
• processed special category data without the legal grounds to do so.
Information Commissioner John Edwards emphasized the importance of ensuring that children can explore the digital world safely, highlighting the necessity of robust data privacy protections. He stated that companies offering digital services are legally obligated to implement these safeguards. However, he expressed concern that TikTok has not adequately fulfilled this responsibility.
The ICO emphasizes the importance of fostering relationships with companies to shape their data protection practices. However, when companies fail to adhere to the regulator's guidance, enforcement of the AADC becomes necessary.
The AADC should be robustly applied across all digital products and services that process personal data about children. This includes all uses of EdTech, irrespective of the type of use.
EdTech products and services that require direct interaction from students, such as account creation and login, qualify as Information Society Services (ISS) under Directive (EU) 2015/1535, making them subject to the application of the AADC. Examples of these services in UK schools include Google Classroom, ClassDojo, and MyMaths. In contrast, Management Information Systems (MIS) used in schools, including safeguarding software, do not meet the ISS criteria since children do not individually request these services. While the AADC does not apply to MIS, the Information Commissioner's Office (ICO) emphasizes that the principles outlined by the AADC should still be followed by these systems.
The AADC must encompass both 'core' and 'additional' services, including 'off the shelf' options and those specifically tailored for the school. It applies regardless of whether the child directly uses the service or whether the school utilizes it to record data about the child. High standards of data protection should also be mandated for Management Information Systems (MIS) and any other school systems that manage the child's data. Additionally, the AADC should include a 'best interests' exemption, permitting businesses, regulators, and schools to depart from certain standards to foster innovation and make exceptions that benefit the child.
Action 2: The government should use the Data Protection and Digital Information Bill 2022 to clarify that all EdTech products and services that process data about children must meet the data protection and privacy baseline provided by the AADC.
1.3 Comply with the UK GDPR
A data controller is accountable for the purpose and methods of data processing, adhering to the principles outlined in Article 5 of the UK General Data Protection Regulation (GDPR), which include lawfulness, fairness, transparency, purpose limitation, data minimization, accuracy, storage limitation, and integrity and confidentiality. In educational settings, schools frequently share data control responsibilities with EdTech providers. In practice, however, problems arise when:
• schools sign contracts that fail to limit the purpose of processing;
• the contract says that the school is the data controller, but the EdTech impedes the school’s capacity to exercise control (see Box 3);
• providers process data that schools did not foresee, for example, keystroke dynamics or inferred data; 16
• providers process data in ways that exceed the schools’ purposes (such as for marketing or Research & Development [R&D]).
Times Tables Rock Stars claims to be merely a processor of data; however, it also indicates that it utilizes this data for various purposes, including sharing it with the government and conducting research. This raises concerns about the power dynamics in education, as schools seem to be under pressure to meet educational demands, especially during the pandemic.
15 Data about children are also recorded in school MIS. There are grounds for believing that MIS do not comply with these principles (e.g., the news story in Box 8).
When a contract allows the use of a child's data for Research and Development (R&D), advertising, or marketing, it raises concerns. This data is not processed under the lawful basis of public task, and the school lacks control over its further use. Additionally, without other lawful bases, the data subject may not have the chance to consent to the contract or the processing of their data.
Box 3: Control of data and purpose limitation
A comprehensive Data Protection Impact Assessment (DPIA) of G Suite for Enterprise, conducted by Privacy Company 18 in the Netherlands, revealed that due to the interplay of various Google products and the collection of service and telemetry data, customers are often unaware of the purposes for which their data is processed. Consequently, Google acts as a data controller or joint controller alongside the customer, and the data processed may include sensitive personal data or data from 'special categories' that disclose protected characteristics. In a school setting, the institution serves as the 'customer' under a contract with Google.
Google initially rejected the classification of controller or joint controller; however, after negotiations, the company agreed to restrict data processing for Dutch schools and universities to three specific purposes, as opposed to the broader purposes outlined in the Google Cloud Privacy Notice. This adjustment addressed significant data protection risks identified in the DPIA, under which Google and the universities were effectively acting as joint controllers regardless of the contract's terms. Google has indicated that implementing these changes necessitates a technical redesign.
The ICO offers a self-assessment checklist for organizations to determine their roles as data controllers, processors, or joint controllers. However, this checklist is not effective in educational settings where the controller serves as an intermediary for a child.
Ensure transparency
EdTech providers must ensure transparency and adhere to high compliance standards, especially when various data protection and privacy policies are applicable to different products within a single learner's journey. A notable example of this issue is Google Workspace for Education, which highlights the challenges related to transparency and compliance.
Box 4: Google Classroom governance structure
Google Workspace for Education is a comprehensive hybrid teaching and school management system that includes 'Core Services' 23 such as Google Classroom, Docs, Sheets, Drive, Meet, and Hangouts. Schools can also enable 'Additional Services' 24 such as YouTube, Maps, and Search, making them accessible to students throughout their learning journey.
Core and Additional Services within Google Workspace are subject to distinct privacy policies and legal terms, providing varying levels of privacy protection for children during their online learning. Importantly, while using Core Services, Google refrains from utilizing data from children to create profiles for targeted advertisements, and children are not exposed to ads. However, these privacy protections do not extend to Additional Services.
22 Unless the clip hosted by YouTube or Vimeo is embedded in the Google Classroom environment
23 ‘Core Services’ are Google’s main applications within the Google Workspace for Education platform
24 ‘Additional Services’ are Google’s consumer applications accessible through the Google Workspace for Education platform if the school’s platform administrator allows pupils to access them
Interface design can inadvertently lead children to access additional services from both the provider and third parties that may lack adequate privacy protections. This can compromise the transparency of how a child's data is utilized, potentially exposing their school data to commercial use and increasing the risk of exploitation and future discrimination.
Box 5: Lightbeam’s identification of third party tracking via Google Classroom
An experiment was conducted to analyze the use of Google Classroom by a nine-year-old child in a London primary school and a twelve-year-old child from another school during the COVID-19 lockdowns. Both children accessed Google Classroom's 'Core' and 'Additional Services', with the former being privacy-preserving and the latter not. Notably, there was no notification to inform the child users when they transitioned to a different privacy regime, nor was there any request for additional consent. Furthermore, it appeared that Google did not inform schools about the distinction between these services or the varying privacy policies that apply.
A child's interaction with learning material on Vimeo was monitored by 42 third parties, including 'adservice.google.co.uk', 'analytics.tiktok.com', 'amazon-adsystems.com' and others, as revealed by Lightbeam's data capture 25. When the child later clicked on YouTube, they were subject to cookie surveillance by a further 50 third party sites 26.
25 Lightbeam is an internet browser add-on, offered by Mozilla, that visualises the first and third party tracking cookies companies deploy to monitor users’ browsing habits (Fowler, 2013)
26 Adapted from Hooper et al (2022)
[Figure: A nine-year-old’s user journey through Google Classroom, mapping the interrelated governance documents a pupil encounters: the Google Workspace for Education Terms of Service; the Google Workspace for Education privacy policy; the Google Cloud privacy notice (covering data processed in Additional Services); the Data processing and security terms (for customers where services are provided in Workspace); the Services summary (which defines Core Services and the data processed in providing them); the Service Specific Additional Terms and policies (applying only to personal information that constitutes 'service data'); Google’s privacy and security principles; and third party apps, each with its own governance structure and interrelated conditions.]
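Lightbeam-style tracker identification (Box 5) can be approximated in principle from any record of the network requests a page makes, such as a browser's HAR export: any host that does not belong to the first-party site is a third-party contact. The sketch below is illustrative only — the request URLs are hypothetical, and a production tool would resolve registrable domains against the Public Suffix List rather than use the naive suffix check here.

```python
from urllib.parse import urlparse

def third_party_hosts(first_party_site: str, request_urls: list[str]) -> set[str]:
    """Return hosts contacted during a page load that do not belong to
    the first-party site. `first_party_site` is the registrable domain,
    e.g. 'google.com'. A real tool would consult the Public Suffix List."""
    hosts = set()
    for url in request_urls:
        host = urlparse(url).hostname or ""
        # First-party: the site itself or any of its subdomains.
        if host != first_party_site and not host.endswith("." + first_party_site):
            hosts.add(host)
    return hosts

# Hypothetical requests captured during a single Google Classroom visit
requests = [
    "https://classroom.google.com/u/0/h",
    "https://adservice.google.co.uk/adsid/integrator.js",
    "https://analytics.tiktok.com/i18n/pixel/sdk.js",
    "https://amazon-adsystems.com/aax2/apstag.js",
]
print(sorted(third_party_hosts("google.com", requests)))
# ['adservice.google.co.uk', 'amazon-adsystems.com', 'analytics.tiktok.com']
```

Note that 'adservice.google.co.uk' is counted as a third party here, mirroring Box 5: it sits on a different registrable domain from the site the child visited, even though both belong to Google.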
EdTech providers must ensure the highest level of privacy protection across all services utilized by schools, especially when learners transition between different products. This approach aligns with Action 2, emphasizing the ICO's commitment to rigorously enforce data protection regulations, including the AADC, for all EdTech solutions.
When an EdTech provider handles a child's data for non-educational purposes, it must be based on an individual contract with the child or a legitimate interest. In such cases, the child must be presented with distinct options to activate each component of the service, ensuring that these elements are not bundled together.
It is unrealistic for each of England's 20,637 state-funded schools to negotiate separately with EdTech companies, so DfE guidance is essential for school procurement processes. The negotiation between the Dutch Data Protection Authority and Google serves as a model for centralized agreements or guidelines that could establish minimum standards of practice, alleviating the power imbalance and reducing the burden of contract negotiations for schools.
Action 4: The DfE should develop, with the support of the ICO and/or Crown Commercial Service, standard contractual clauses for schools to insert into their contracts with EdTech providers. These should restrict EdTech’s data processing to the processing purposes that schools choose and can reasonably audit.
Introduce certification for EdTech used in school settings
The need for an approved framework and standard EdTech assessment criteria
The expectation for individual schools to identify educational products and services while ensuring children's safety and privacy is unrealistic. EdTech offerings are complex, provided by various vendors, and encompass a wide range of tools, including management information systems, teaching aids, and administrative technologies. Additionally, schools must assess the compatibility and interoperability of different hardware and software solutions, further complicating the process.
Assessing the educational value, data protection, and safety of EdTech products demands significant time and expertise from schools, creating a barrier to informed decision-making Introducing certification for these products would enhance market visibility for those that have undergone peer review, ultimately providing tangible benefits for both children and educational institutions.
Current DfE-approved frameworks for ICT procurement in schools focus mainly on legal requirements, including competitive tender processes, cost-effectiveness, quality indicators, and pre-agreed Terms and Conditions. However, they place insufficient emphasis on demonstrable educational benefits and do not adequately address the latest children's data protection requirements. While some frameworks include products like Microsoft 365, Google Workspace for Education, and safeguarding software such as CPOMS, they overlook other widely-used tools like ClassDojo, MyMaths, and Times Tables Rock Stars, as well as various free EdTech options.
While various assessment frameworks exist, they often rely on inconsistent criteria and lack comprehensiveness. Notable examples include the Department for Education's (DfE) effectiveness assessment of EdTech in schools, which evaluates the perceived value of technology by teachers; the Education Endowment Foundation (EEF), which focuses on educational benefits and technical security; and sporadic academic research.
The EdTech market presents significant challenges for both vendors and educators, primarily due to the absence of centralized and standardized procurement processes, as highlighted by the Centre for Data Ethics and Innovation in 2021. Additionally, concerns regarding fairness in this complex market have been raised, as noted by Winchester in 2023.
31 Recognising this, the Welsh government has designed its own standardised EdTech product used in schools across the country. See Welsh Government (2023a, 2023b).
32 These DfE frameworks are voluntary and intended to relieve schools of individually vetting the provisions of each EdTech product. See DfE (2022a).
In 2021, ClassDojo achieved 849,000 downloads, and a summer 2022 survey of 1,014 children aged 6 to 17 revealed that 18% use the app in school settings. However, the lack of reliable evidence regarding its risks and benefits complicates schools' ability to make informed decisions about its use. For further insights, refer to Revolution Professional (2019).
37 Admiraal et al (2020), Darvishi et al (2022), McKnight et al (2016).
The Department for Education (DfE) has established digital and technology standards; however, these standards currently do not encompass software specifically designed for teaching and learning. Similar to the National Health Service (NHS), the DfE is obligated to adhere to the gov.uk Service Standards, which may serve as a foundation for future guidance and certification schemes aimed at ensuring the quality of EdTech products.
What is required is a standardised, comprehensive framework for assessing and approving EdTech regarding its:
• opportunity costs or risks (including in relation to future education or employment)
The lack of a standard framework in education contrasts with current good practice in health and social care (see Box 6).
Box 6: Standard assessment criteria for digital technologies in health and social care 41
In health and social care, digital technologies prescribed to patients or users are evaluated against standardized criteria. These criteria ensure clinical safety, data protection, security, interoperability, usability, and accessibility. The Digital Technology Assessment Criteria (DTAC) play a crucial role in this evaluation process.
The DTAC framework 42 is essential for healthcare organizations to evaluate suppliers during procurement and due diligence, ensuring that digital technologies comply with health and social care baseline standards. It integrates relevant legislation, regulations, and best practices, providing a comprehensive guide for the diverse digital technologies utilized in the sector.
39 DfE (2019) offered some considerations for schools thinking of using cloud-based services, and referred to the National Cyber Security Centre’s guidance on basic cybersecurity considerations
The assessment criteria from NHS England provide a valuable framework for evaluating digital technologies in educational settings, where evaluation is currently hindered by a lack of unified standards. The Department for Education (DfE) evaluates a wide array of educational technologies, such as 3D printers and collaborative online platforms, but primarily focuses on the processes of selection and implementation rather than their actual impact. The effectiveness of EdTech in schools is largely judged by the perceived value reported by educators, highlighting the need for a more cohesive approach to technology assessment in education.
The Department for Education (DfE) guidelines for schools and colleges do not encompass EdTech platforms, applications, products, or services utilized in teaching and learning, nor do they include Management Information Systems (MIS) (DfE, 2022f). Most assessments of EdTech conducted by the Education Endowment Foundation (EEF) and various academics primarily concentrate on learning-related EdTech (Stringer et al, 2019; Admiraal et al, 2020; Darvishi et al, 2022; McKnight et al, 2016).
Developing trusted data infrastructure(s) for research, business and government
Determine which data should be publicly accessible
Commercial companies have already gained access to education records held by the DfE for commercial purposes (see Box 8).
Private EdTech companies often restrict access to the extensive educational data collected from children, citing Intellectual Property (IP) and commercial reasons. This lack of reciprocity hinders researchers and civil society from utilizing this valuable information for accredited research purposes aimed at benefiting the public good.
Access to private education data must be improved to generate public interest insights while ensuring compliance with data protection laws and children's rights. Research funding bodies should provide competitive funding for strategies that leverage pupil data to enhance educational outcomes, inform public policy, and ensure independent oversight of EdTech products. Establishing mechanisms for data access, interoperability and standardization is essential for fostering understanding and trust in the education sector.
Schools are required by law to collect basic analogue data, such as age and educational records, as outlined in Section 537A of the Education Act 1996. However, EdTech has the potential to generate more comprehensive and real-time data, revealing insights like optimal learning times and the impact of physical activity on memory. While this valuable data is accessible to commercial EdTech companies, it fails to enhance public knowledge on improving educational outcomes and child wellbeing.
58 See DfE (2022c) and Day et al (2022)
Schools and education authorities are now weighing the effects of large language models, particularly those trained on data sourced from EdTech in educational settings. Consequently, it is probable that schools will need to invest in commercial software from providers who have previously gathered training data, as was the case with ChatGPT.
Effective data stewardship systems can transform children's educational outcomes. To harness the value of publicly funded education data for the public good, there must be a commitment to sharing anonymised data for research and public services. Promoting open source and open data principles in EdTech can create larger data pools for UK businesses and researchers, fostering innovation and improving results for children.
Action 9: The DfE should consult on an operational model of education data sharing, to include data processed by EdTech, in the public interest 61
Develop a clear framework for data access
Publicly held education data are regulated by established frameworks, yet the sharing of these databases often lacks consistent access controls and compliance with data protection laws. It is essential to implement agreed governance criteria that promote understanding among parents, caregivers, teachers, and children regarding the implications of data sharing and research. Additionally, there should be robust oversight and clear mechanisms for redress in case of issues.
The Department for Education (DfE) is responsible for creating and maintaining publicly held education datasets, including the National Pupil Database (NPD), which contains extensive personal information about schoolchildren. Access to these datasets is primarily managed by the Office for National Statistics (ONS), which ensures secure data-sharing in compliance with the Five Safes Framework.
Box 7: The Five Safes Framework
The Five Safes Framework, utilized by UK secure labs such as the Office for National Statistics (ONS), enables approved researchers to access personal and sensitive data while safeguarding the privacy of data subjects. This framework is built on five key principles that ensure responsible data usage:
1. ‘Safe data’ guards against confidentiality concerns.
2. ‘Safe projects’ require data controllers to ensure and approve appropriate, lawful and ethical uses of data for the public good.
3. ‘Safe people’ refers to trusted data users – accredited and authorised researchers.
4. ‘Safe settings’ means that the facilities that provide research access to data have capabilities to limit unauthorised use of data.
5. ‘Safe outputs’ ensures that the research outputs maintain the confidentiality of the data subjects.
Individually, these principles reduce the risks of misuse, unauthorised access or unauthorised use of sensitive data. Applied in combination, they provide assurance of safe data sharing and use.
60 This would be consistent with the approach taken by UNICEF (2019)
61 This recommendation is modelled on the French legal provision in Article 53-1 of the amended Ordinance No 2018-65 of 29 January 2016, relating to concession contracts (République Française, 2019).
The Five Safes Framework is applied inconsistently by the DfE, leading to criticism for granting access to children's data to media and gambling companies. A 2020 audit by the ICO revealed that alternative methods for accessing this data directly from the DfE breached the Five Safes Framework and posed significant data protection concerns.
Box 8: The DfE is reprimanded for misuse of education data
The ICO's investigation in 2022 revealed that gambling companies were profiting from the Learning Records Service, a database managed by the DfE. Initially, a training company was granted access to children's personal data for training purposes but subsequently repurposed this data to offer commercial age verification services to gambling firms. As a result, the ICO determined that the DfE violated UK GDPR Article 5(1)(a) and Article 5(1)(f), which mandate data controllers to prevent unauthorized processing by third parties and ensure the confidentiality of data subjects.
To meet the Five Safes Framework, it is essential to implement data access control practices that include auditing actual data usage to ensure alignment with the stated purpose in the data access application. Additionally, a transparent redress procedure must be established for children whose educational records have been misused.
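The auditing step described above can be made concrete: compare each logged access event against the dataset and purposes approved in the project's data access application, and flag anything outside the grant. Everything below — project IDs, dataset names, purpose labels — is a hypothetical sketch of the audit logic, not the DfE's or ONS's actual systems.

```python
from dataclasses import dataclass

@dataclass
class AccessEvent:
    project_id: str
    dataset: str
    purpose: str  # purpose recorded at the point of access

# Approved purposes per project, as stated in the data access application
# (all identifiers are hypothetical)
approved = {
    "PRJ-001": {"dataset": "NPD-extract", "purposes": {"education-research"}},
}

def flag_misuse(events: list[AccessEvent]) -> list[AccessEvent]:
    """Return access events whose dataset or purpose falls outside the
    approved application - candidates for a 'safe projects' audit."""
    flagged = []
    for e in events:
        grant = approved.get(e.project_id)
        if (grant is None or e.dataset != grant["dataset"]
                or e.purpose not in grant["purposes"]):
            flagged.append(e)
    return flagged

events = [
    AccessEvent("PRJ-001", "NPD-extract", "education-research"),
    AccessEvent("PRJ-001", "NPD-extract", "age-verification-service"),
]
print(len(flag_misuse(events)))  # the second event is flagged
```

A purpose drift like the one in Box 8 — data granted for training purposes, reused for age verification — is exactly the pattern such an audit would surface.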
Action 10: The ICO should reinforce the DfE’s application of the Five Safes Framework with a robust audit system to ensure that the DfE’s data sharing of children’s education records adheres to data protection laws and the Five Safes Framework
Action 11: The DfE (as data controller) should set up an easily accessible system of redress for children whose data have been exploited
Action 12: The DfE should fund research based on its databases of children’s educational records and other publicly held datasets to create educational benefits either directly or through research funding bodies.
The future of access to education data
Evolving governance methods will enhance the secure and efficient sharing of educational data, empowering schools, parents, caregivers, and children to have greater control over access to their data, especially concerning private organizations.
The sandbox principle allows tech products and processes to be tested in a transparent and safe environment, benefiting businesses, regulators, and consumers alike. We propose initiating a pilot project involving multiple sandbox schools to evaluate its effectiveness.
The pilot would explore 20 innovative EdTech products designed to test the potential of sharing business-held data for ethical educational purposes. These trials aim to uncover valuable insights for the responsible use of education data in the future.
Box 9: Born in Bradford project
The Born in Bradford (BiB) project highlights the advantages of confidential data linkages among administrative records for over 30,000 Bradford children and their families. By engaging with participants, researchers secured informed consent for ethical data linkage across health, social care, and education records. To protect confidentiality, a ‘non-unique personal identifier’ is employed to match individuals' education and health data, minimizing reidentification risks. This analysis uncovered critical insights, such as the impact of ophthalmic deficits on children's reading skills, leading to targeted interventions like proper eyeglass prescriptions to enhance reading abilities for those lagging behind.
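The report does not detail BiB's matching scheme, but the 'non-unique personal identifier' approach can be sketched in principle: derive a salted hash from a few stable attributes so that records in separate datasets can be matched without exchanging names, while the deliberately coarse key and secret salt limit reidentification. Everything below — field choices, salt handling, record shapes — is a hypothetical illustration, not the project's actual method.

```python
import hashlib

def pseudonym(dob: str, sex: str, postcode_prefix: str, salt: str) -> str:
    """Derive a deliberately coarse identifier from stable attributes.
    The fields and truncation here are illustrative, not BiB's scheme."""
    material = f"{dob}|{sex}|{postcode_prefix}|{salt}".encode()
    return hashlib.sha256(material).hexdigest()[:12]

SALT = "per-project-secret"  # held only by the trusted intermediary

# Hypothetical education and health records for the same child
education = [{"dob": "2014-03-07", "sex": "F", "pc": "BD5", "reading_score": 82}]
health = [{"dob": "2014-03-07", "sex": "F", "pc": "BD5", "vision_deficit": True}]

edu_by_key = {pseudonym(r["dob"], r["sex"], r["pc"], SALT): r for r in education}
linked = []
for h in health:
    key = pseudonym(h["dob"], h["sex"], h["pc"], SALT)
    if key in edu_by_key:
        linked.append({**edu_by_key[key], **h})
```

Because the identifier is intentionally non-unique, two different children can collide on the same key; real linkage pipelines therefore add validation steps, trading a small match-error rate for a much lower reidentification risk.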
Emerging data stewardship models are exploring decentralized governance structures, trusted intermediaries, and data trusts. A data trust enables individuals to share their data with a reliable intermediary that incorporates privacy controls, granting access to a collective data pool for third parties to utilize in ways that benefit the data subjects. Notable examples of data trust models include the Databox project and Mydex CIC.
Insights drawn from education data and digital technologies used to deliver education (EdTech) promise great benefits to children and their learning experience, enhancing children’s best interests. These include: 67
• tracking aggregated student progress across settings to target interventions
• helping teachers to evaluate students’ progress against national standards
• early identification of special educational needs and disabilities to guide support
• personalised learning to support educational outcomes
• improving the discovery of educational content by analysing user engagement
• helping schools improve their services and processes and guide resource allocation
• identifying safeguarding needs to support child protection
• promoting public health benefits by analysing the needs of vulnerable children
• researching and documenting the benefits of educational interventions
• defining and optimising algorithms that can improve children’s outcomes
• combining education data with other datasets to produce new insights
63 This example is adapted from Mon-Williams et al (2022)
A successful model necessitates substantial progress in several key areas: enhancing technical interoperability and data readability across systems, establishing effective methods to ensure data controllers consistently provide data extracts, clarifying parental roles in managing their child's data, creating a network of trusted intermediaries to serve as data trustees, and improving data literacy among schools, parents, caregivers, and children to help them understand the value of the data trust.
Action 13: The DfE should fund and work with the ICO to set up sandboxes for privacy and child rights-respecting data trust models for education data, and experiment with how these models can be effectively integrated into the UK education ecosystem to facilitate access to and usage of privately held education data in the public interest 68
The inaction regarding education data stems from a limited understanding of the issue and its importance, compounded by the fact that school leaders, parents, and caregivers—those most affected—often lack access to essential resources and information.
The proposed blueprint aims to establish a new framework of protections, but certain contingencies must be addressed to prepare the environment for implementing and enhancing these changes.
Governments and regulators worldwide face challenges in defining effective EdTech while ensuring privacy. Establishing fair terms and measurement metrics can provide a significant first mover advantage, potentially influencing a market valued at US$4.68 billion in the UK.
Political narratives highlight the advantages of these changes, necessitating the development of expertise within the Department for Education (DfE), the Department for Science, Innovation and Technology, and the Information Commissioner's Office (ICO). Additionally, it is essential to integrate EdTech into broader digital regulations, including online safety, trade, data flow, and other pertinent policy areas.
Ofsted must develop a nuanced understanding of EdTech's role in schools, recognizing that its use is not inherently beneficial. It is essential to closely examine evidence that the selected EdTech is appropriate, effectively utilized, and respects privacy. Additionally, when implementing EdTech products, it is crucial to ensure they have received the necessary certifications.
Finally, and perhaps most importantly, the government must find the resources to support the proposed changes
The blueprint highlights two key themes: the necessity for improved data management prioritizing children's privacy and the challenge of accessing children's data for societal benefit. In the rapidly changing digital landscape, this blueprint is not the final solution; however, its 13 actions mark a significant advancement in addressing critical issues, streamlining data management in schools, and ensuring that children's data is handled in their best interests.
68 This project was pioneered in France in 2022 (CNIL, 2022).
Recent instances of data protection risks
The UK faces significant data and governance risks in EdTech, which threaten children's rights and future opportunities. Learning from other jurisdictions can help mitigate these risks. While AI-driven systems offer potential benefits, they also pose dangers related to personal data misuse and unexpected outcomes.
Table 1 Data protection and other risks concerning EdTech identified by jurisdiction
Date | Jurisdiction | Finding and decision on risk in EdTech use
May 2022 | Global | Human Rights Watch found that 49 governments endorsed unsafe educational products during the COVID-19 pandemic; 145 of these tools had surveillance capabilities to monitor children while they learn
July 2022 | Denmark | Denmark's Data Protection Authority prohibited the use of Google Workspace and Chromebooks in Helsingør, citing multiple high data protection risks (previously identified in the Netherlands) concerning lack of transparency in data processing and use, and missing or problematic privacy controls 73
August 2022 | UK | The Digital Futures Commission's review of Google Classroom and ClassDojo identified data protection risks
August 2022 | US | Remote proctoring, used to monitor students and their homes during exams, was found to have violated privacy laws 74
November 2022 | England | The ICO reprimanded the DfE for misuse of education data 75
November 2022 | France | The French Ministry of Education urged schools to stop using the free versions of Google Workspace for Education and Microsoft Office 365 76
January 2023 | Australia | Redesign of testing deemed critical and a likely return to pen-and-paper exams owing to plagiarism fears around ChatGPT 77
January 2023 | US | ChatGPT banned from all public school devices 78
ChatGPT has the potential to improve education significantly; however, it also poses risks such as delivering inaccurate, biased, or discriminatory responses, and can facilitate cheating and plagiarism. Addressing these challenges may necessitate legislative changes.
73 The European Data Protection Board (EDPB, 2022) noted that this was likely to apply to other municipalities, and is finalising several relevant cases.
75 See ICO (2022b) and Box 8 in Section 3.2 of this blueprint.
Complete list of actions
Action | Relevant legislation | Acting authority
Best interests of the child
Action 1: The UNCRC and General Comment No. 25 should be explicitly referenced in all existing and future law, policy and practice relating to children's education data.
Relevant legislation: Data Protection and Digital Information Bill 2022 (add a clause referencing the UNCRC and General Comment No. 25, so that the best interests of the child are a primary consideration in all processing of children's data).
Action 2: Age Appropriate Design Code (AADC) application. The government should use the Data Protection and Digital Information Bill 2022 to clarify that all EdTech that processes data about children must meet the data protection and privacy baseline provided by the AADC.
Relevant legislation: Data Protection and Digital Information Bill 2022.
Action 3: Data protection compliance. The ICO should develop a checklist tailored to the education sector to identify data controllers effectively. Where EdTech providers act as joint controllers, each party should be able to manage its data responsibilities in proportion to the volume, variety and use of the data, and responsibilities should not be allocated to parties that cannot fulfil them in practice.
Relevant legislation: Data Protection and Digital Information Bill; UK General Data Protection Regulation (GDPR).
Acting authority: Government/ICO to ensure better resourcing and effective enforcement.
Action 4: The DfE should work with the ICO and/or the Crown Commercial Service to create standard contractual clauses for schools, limiting EdTech providers' data processing to the specific purposes selected by schools and making those purposes subject to reasonable auditing.
Relevant legislation: UK Government Service Standard 79 (Principles 10 and 11).
Acting authority: DfE (or Crown Commercial Service).
Action 5: The DfE should work with experts, the ICO, academia and children to establish an evidence-based certification scheme for EdTech used in schools, meeting the ten specified criteria, and should promote the adoption of EdTech certification among educational institutions.
Action 6: UKAS should assess and accredit certification bodies to audit and certify EdTech products or services for compliance with the certification criteria.
Relevant legislation: Conformity assessment and accreditation policy 80.
Action 7: The DfE should set out the educational purposes that EdTech may serve, and maintain an independent evidence base to support this.
Action 8: The DfE should require schools to conduct a child rights impact assessment (CRIA) when selecting EdTech products or services that do not meet the 10 certification criteria set out in Action 5, where these could affect fundamental rights, and should update the 2022 Keeping Children Safe in Education guidance accordingly.
Relevant legislation: Keeping Children Safe in Education: Statutory Guidance for Schools and Colleges.
Action 9: The DfE should consult on an operational model of education data sharing, to include data processed by EdTech, in the public interest.
Relevant legislation: Data Sharing Governance Framework (Central Digital & Data Office).
Action 10: The ICO should strengthen the DfE's implementation of the Five Safes Framework by establishing a comprehensive audit system, ensuring that the DfE's sharing of children's education records complies with data protection law and adheres to the Five Safes principles.
Relevant legislation: Data Sharing Governance Framework (Central Digital & Data Office).
Acting authority: The ICO to audit the DfE's data practices.
Action 11: The DfE (as data controller) should set up an easily accessible system of redress for children whose data have been exploited.
Action 12: The DfE should fund research using its databases of children's education records and other publicly available datasets, to generate educational benefits either directly or through research funding organizations.
Action 13: The DfE should fund and work with the ICO to set up sandboxes for privacy and child rights-respecting data trust models for education data.
Relevant legislation: UK GDPR; Data Protection and Digital Information Bill.
80 See Office for Product Safety and Standards & Department for Business, Energy & Industrial Strategy (2012).
These data trust models would be integrated into the UK education ecosystem to improve access to privately held education data for public benefit. Relevant provisions include the Secretary of State's role in regulating and safeguarding children's interests, and the use and reuse of data for scientific research (Article 4 of the UK GDPR, as amended by the new Chapter 8A).
5Rights Foundation (2022) 5Rights celebrates the first anniversary of the Age Appropriate Design Code. https://5rightsfoundation.com/in-action/5rights-celebrates-the-first-anniversary-of-the-age-appropriate-design-code.html
GOV.UK (no date) Making your service accessible: An introduction.
Admiraal, Vermeulen, & Bulterman-Bos (2020) Teaching with learning analytics: How to connect computer-based assessment data with classroom instruction? Technology, Pedagogy and Education, 29(5), 577–91. https://doi.org/10.1080/1475939X.2020.1825992
Aston, J., Davies, E., Guijon, M., Lauderdale, K., & Popov, D. (2022) The Education Technology
Market in England: Research Report. Department for Education, November. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1117067/Edtech_market_in_England_Nov_2022.pdf
BESA (British Educational Suppliers Association) (no date) Key UK education statistics.
The Guardian (2023) Australian universities to return to 'pen and paper' exams after students caught using AI to write essays. 10 January.
Central Digital & Data Office (2022) Data Sharing Governance Framework. www.gov.uk/government/publications/data-sharing-governance-framework/data-sharing-governance-framework
Centre for Data Ethics and Innovation (2021) AI Barometer Part 5 – Education. www.gov.uk/government/publications/ai-barometer-2021/ai-barometer-part-5-education
CNIL (2022) EdTech sandbox: The CNIL supports 10 innovative projects. 25 May. www.cnil.fr/en/edtech-sandbox-cnil-supports-10-innovative-projects
Darvishi, A., Khosravi, H., Sadiq, S., & Gašević, D. (2022) Incorporating AI and learning analytics to build trustworthy peer assessment systems. British Journal of Educational Technology, 53(4), 844–75. https://doi.org/10.1111/bjet.13233
Data Protection Act (2018) www.legislation.gov.uk/ukpga/2018/12/contents/enacted
Day, E. (2021) Governance of Data for Children's Learning in UK State Schools. Digital Futures Commission, 5Rights Foundation.
Day, E., Pothong, K., Atabey, A., & Livingstone, S. (2022) Who controls children's education data? A socio-legal analysis of the UK governance regimes for schools and EdTech. Learning, Media and Technology, 1–15. https://doi.org/10.1080/17439884.2022.2152838
DfE (Department for Education) (2019) Moving your school to the cloud. 3 April. www.gov.uk/government/publications/moving-your-school-to-the-cloud/moving-your-school-to-the-cloud
DfE (2021) Independent Report – AI Barometer Part 5: Education. https://www.gov.uk/government/publications/ai-barometer-2021/ai-barometer-part-5-education
DfE (2022a) Find a DfE-approved framework for your school or college.
DfE (2022b) Apply the Service Standard in DfE.
DfE (2022c) How DfE shares personal data. www.gov.uk/guidance/data-protection-how-we-collect-and-share-research-data
DfE (2022d) Implementation of Education Technology in Schools and Colleges, Research Report. CooperGibson Research, October. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1107808/Implementation_of_education_technology_in_schools_and_colleges.pdf
DfE (2022e) Keeping Children Safe in Education 2022: Statutory Guidance for Schools and
Colleges. 1 September. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1101454/Keeping_children_safe_in_education_2022.pdf
DfE (2022f) Meeting Digital and Technology Standards in Schools and Colleges. 23 March. www.gov.uk/guidance/meeting-digital-and-technology-standards-in-schools-and-colleges
EDPB (European Data Protection Board) (2022) Danish DPA imposes ban on the use of Google Workspace in Elsinore municipality. 19 July. https://edpb.europa.eu/news/national-news/2022/danish-dpa-imposes-ban-use-google-workspace-elsinore-municipality_en
Everything ICT (2022) Over 180 suppliers covering all aspects of ICT.
Fowler (2013) Lightbeam for Firefox: Privacy education for users and open data for publishers. Dist://ed Blog.
GlobalData (2022) United Kingdom (UK) EdTech Market Summary, Competitive Analysis and Forecast, 2021–2026. 30 September. www.globaldata.com/store/report/uk-edtech-market-analysis
GOV.UK (no date) Service Standard. www.gov.uk/service-manual/service-standard
Hawkins, S. (2022) Remote testing 'room scans' violate Fourth Amendment, judge says. Bloomberg Law, 23 August. https://news.bloomberglaw.com/privacy-and-data-security/remote-testing-room-scans-violate-fourth-amendment-judge-says
Hooper, L., Livingstone, S., & Pothong, K. (2022) Problems with Data Governance in UK Schools: The Cases of Google Classroom and ClassDojo. Digital Futures Commission