
How does the new EU AI Act affect the adult education sector?

05 May, 2024

Thank you to ELM – European Lifelong Learning Magazine for the opportunity given to ALL DIGITAL and Norman Röhner to present our views on the implications of the EU AI Act for digital education and educators.

Author: Katriina Palo-Närhinen


Please consult the source article.

The new EU AI Act aims at protecting fundamental rights and democracy in digital education, that’s for sure. But what duties and obligations does it bring? ALL DIGITAL’s Policy Officer Norman Röhner answers three quick questions about the Act, which is expected to come into force by June 2024.

What are the top three implications of the EU AI Act for the adult education sector?

Firstly, the EU Artificial Intelligence Act explicitly recognises the positive impact that AI systems can have on digital education and training. They can facilitate individualised, adaptive and tailored teaching and learning offers, boosting their accessibility and effectiveness. Especially relevant for adult learners, AI can support autonomous, independent learning at the learner’s own pace.

Secondly, adult education will have a significant role in the widespread use of AI systems, due to the provisions on AI literacy, which oblige developers and those who integrate AI systems into their activities (“deployers” in the language of the Act) to ensure adequate training in the use and operation of AI systems. This includes awareness of the impact of AI, as well as of both the opportunities and risks stemming from its use.

The third implication is that the AI Act lists education and training among the fundamental rights, and places special emphasis on the rules applicable when AI is used in this area. The Act classifies as “high risk” those AI systems used to determine a learner’s access to education (decisions on admission), the level of education they are eligible for (placement tests), to evaluate learning outcomes (decisions on grades) or to monitor learners’ behaviour during tests (proctoring, detection of cheating). These use cases are still allowed, but under the AI Act they require stringent risk assessment and curation procedures.

While recognising the positive impact that AI systems can have on digital education and training, the new EU AI Act also places special emphasis on the rules applicable when AI is used in this field. Photo: Antoine Schibler on Unsplash.


What must teachers take into account?

It is important for teachers to understand that they and their teaching institutions will carry considerable responsibility for the adequate and proper use of AI tools in their work. First and foremost, this means they should make themselves aware of which types of uses are allowed, which uses are banned and which uses require additional assessment steps.

As a rule of thumb, the EU AI Act in the education context places additional rules only on those AI systems which suggest or make decisions, meaning that tools which automate or ease specific tasks, such as a speech-to-text tool recording notes, do not require significant administrative steps.

In addition to a range of unethical practices, such as using AI for subliminal influencing or the exploitation of vulnerabilities, the Act places a hard restriction on AI systems that are meant to interpret or infer a person’s emotional state, specifically in the context of work and education.

However, trainers and educators should feel encouraged rather than intimidated by the AI Act’s regulation of artificial intelligence systems in education. The one crucial condition is that educators and trainers seek and receive guidance and support from their institutions and professional networks.

What must learners take into account?

Learners should be aware that the EU AI Act first and foremost seeks to protect their rights as individuals. Learning institutions face strict obligations if they want to base admission or the evaluation of learning outcomes on AI-supported systems: they must offer justifications and assurances that such a system is not discriminatory.

The provisions of the Act should boost learners’ confidence in engaging with AI tools in their education processes. The Act aims to strike a balance between facilitating AI tools that increase accessibility and simplify repetitive tasks and restricting those that risk infringing on learners’ rights or causing harm.

On the other hand, the use of AI systems to detect teachers’ conformity to previous grading patterns, for example, is an exception explicitly mentioned in the Act. This further protects learners’ rights to access education and to have their learning outcomes assessed fairly.

In addition, the Act’s provisions on AI literacy give teaching institutions (as “deployers” of AI systems) the responsibility to adequately train students (as “users”) in the use of the AI tools they wish to implement in their training offers.

Norman Röhner

“Teachers and their teaching institutions should make themselves aware of which types of uses are allowed, which uses are banned and which uses require additional assessment steps”, stresses Norman Röhner.

Norman Röhner works as Policy Officer at ALL DIGITAL and is a passionate digital inclusion and digital education advocate with 5+ years of expertise in European digital policy. He was part of the expert group authoring the Guidelines for teachers and educators on tackling disinformation and promoting digital literacy through education and training. He is a trained teacher of English, Maths and Physics.

Photo: Oleksandr Krushlynskyi.