Amai! Inclusive AI Education for the Digitally Vulnerable

by Isaak Vandermaesen, Scivil, the Flemish knowledge centre for citizen science, Belgium.

Digital vulnerability affects four out of ten Belgians[1], spanning diverse ages, socio-cultural backgrounds, and educational levels. Despite this diversity, these groups are often overlooked in AI-related educational initiatives. The amai! initiative in Flanders tackled this challenge by developing a customised educational package and workshop on the subject of deepfakes, tailored to teachers of the digitally vulnerable.

Project amai!

Amai! was a mission-oriented initiative in Flanders (2019–2026) designed to inform citizens about AI and involve them in its local development. Its approach revolved around three pillars: Information, Ideas, and Impact. First, amai! raised awareness through workshops and lectures in which science communicators explained AI applications and encouraged public reflection. Second, it gathered over 1,000 citizen ideas during these sessions and at science festivals. Finally, these ideas formed the starting point of funded AI projects addressing societal challenges. After expert review, citizens helped select the projects and stayed involved throughout development, ensuring co-creation and democratic decision-making.

The amai! project, led by the Flemish Knowledge Centre Data & Society and Scivil, the citizen science knowledge and research centre, prioritised safe, responsible, and societally relevant AI applications while making its co-creative processes as inclusive and democratic as possible. Yet the team repeatedly encountered a gap between its ambition to inform the most diverse public and the large, often invisible group of digitally vulnerable people in Flanders. In 2024, amai! responded by developing AI educational materials specifically for a segment of this target group: low-literate adults in Flanders.

Co-defining needs

By its fourth cycle, amai! had the network to convene stakeholders and co-design materials for this new target group. The team partnered with civil society actors working with low-literate adults and organised focus groups with teachers in basic adult education (LIGO) to identify learners’ most pressing AI-related needs.

The discussions uncovered an unexpected insight: teachers themselves also needed foundational and practical understanding of AI in education. Within staff rooms, AI often felt distant and opaque, and these uncertainties were amplified in communities facing literacy and technology barriers. Amai! therefore expanded the target group to include teachers, aiming to make AI both explainable and engaging for low-literate adults while empowering educators with concrete methods and materials.

In consultation with teachers in basic adult education, we defined the following needs for learners and teachers in adult education:

Raise Awareness

  • Increase teachers’ knowledge and confidence about AI
  • Teach learners about privacy, online safety, and AI-related risks (deepfakes, phishing)

Apply/Integrate in Learning Process

  • Provide a database of concise guides on what tools can do and how to use them (teachers)
  • Use AI as a language, writing, or speech assistant (teachers and learners)

These AI needs of low-literate adults and their teachers can still serve as general recommendations for future programmes. Amai! had already developed materials addressing several of these needs in earlier work with secondary schools (available on the amai! website). Consequently, the team focused its new efforts on one specific risk area – deepfakes – creating a focused set of activities around this topic.

Iterative process

During the development process, amai! confronted many of its own assumptions about education and science communication. Initially, we overestimated the digital competencies of adult learners. However, through sustained participant observation in LIGO classes in Antwerp and multiple try-outs across LIGO departments in Flanders, we gained the insights needed to adapt our material to this incredibly diverse target group, an outcome that would not have been possible with a top-down educational package.

This iteration also reframed our purpose. Supported by accessible and experienced partners such as LIGO, and later Avansa and Mediawijs, we shifted from a stance of “this is what they should know” to a need-driven question: “What do they actually want to know, and how do we ensure they truly get it?” Concretely, our deepfake lessons began not with definitions but with an experience. Using the HeyGen application, educators created a safe, teacher-consented deepfake of their own face. Learners interacted with it and could even put slightly controversial words in the teacher’s mouth. This “show, don’t tell” approach, within a controlled environment, proved both safe and engaging, sparking curiosity and conversation, arguably the most valuable outcomes for low-literate adults.

To conclude

Through partnerships with educators, local organisations, and AI experts, amai! leveraged interactive workshops and focus groups to co-create accessible materials tailored to the needs of low-literate adults. These efforts aimed not only to demystify AI but also to equip educators and learners with adaptable, practical tools. To date, amai! has engaged over 100 educators and volunteers working with low-literate adults, providing resources to integrate explainable AI concepts into their teaching.

Yet challenges remain. Ensuring equitable accessibility across communities requires ongoing adaptation to diverse local contexts and sustained relationships with educators and learners. Building a shared understanding of needs and goals among network actors is complex, as roles, expertise, and perspectives differ and can hinder collaboration. Persistent digital divides and unequal access to resources in marginalised communities continue to pose significant hurdles.

By fostering co-creation across its network, amai! demonstrates a viable model for making AI concepts and research accessible, relevant, and inclusive — one that may inspire other regions and sectors to engage with technology in ways that prioritise societal impact and inclusion.

Author

Isaak Vandermaesen is an anthropologist and educator with a strong interest in artificial intelligence (AI) and civic engagement. He is actively involved with Scivil, the Flemish Knowledge Centre for Citizen Science, where he contributes to a range of projects at the intersection of AI and education.

As part of the amai! project, he engages citizens in the co-creation of AI applications with clear societal relevance. Within Scivil, he also contributes to the pilot project maarallee.be, which encourages citizens to help expand Flemish speech datasets for scientific research in automatic speech recognition.


[1] Mediawijs. (2024, February 21). Who is vulnerable to digital exclusion? https://www.mediawijs.be/en/article-overview/who-vulnerable-digital-exclusion