by Jordan Hill, Organisation for Economic Co-operation and Development (OECD).
While conducting research for an OECD Education working paper on children’s media literacy, I increasingly came into contact with the term “algorithm literacy.” I have to admit I initially thought to myself: “oh no, not ANOTHER ‘literacy’ to add to an already extensive list.”
Are algorithms really important for digital media literacy?
Algorithms are found throughout the digital media environment. They are finite sequences of well-defined instructions that transform an input into an output. Most commonly, algorithms recommend (e.g. YouTube suggestions) or filter (e.g. Twitter feed) content. They use individual and aggregated behavioural data to personalise a wide variety of content, such as news, information searches, advertising and videos, to maximise engagement (and revenue) for the provider and/or platform.
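To make this concrete, here is a deliberately simplified sketch of the input-to-output logic described above: behavioural data goes in, a personalised ranking comes out. The item names, topics and scoring rule are entirely my own illustration, not any platform's actual system.

```python
# Hypothetical sketch of an engagement-based recommender.
# All data and the scoring rule are illustrative, not any real platform's algorithm.

def recommend(watch_history, catalogue, top_n=2):
    """Rank catalogue items by overlap with topics the user engaged with."""
    # Input: behavioural data (the topics of items the user consumed).
    topic_counts = {}
    for topics in watch_history:
        for t in topics:
            topic_counts[t] = topic_counts.get(t, 0) + 1
    # Score each candidate by how much it matches the user's past engagement.
    scored = [(sum(topic_counts.get(t, 0) for t in topics), item)
              for item, topics in catalogue.items()]
    # Output: the items predicted to maximise engagement, best match first.
    return [item for score, item in sorted(scored, reverse=True)[:top_n]]

history = [["gaming", "music"], ["gaming", "memes"]]
catalogue = {
    "speedrun video": ["gaming"],
    "news report": ["politics"],
    "game soundtrack": ["gaming", "music"],
}
print(recommend(history, catalogue))  # gaming-related items rank first
```

Even a toy like this shows why feeds feel like a "highly curated version of reality": content that does not resemble what you already engaged with simply never surfaces.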
Studies of children, young people and adults in various contexts (e.g. Canada, Germany, the Netherlands and the United States) have revealed that, while some know and understand that what they see in the digital environment is often a highly curated version of “reality”, many are still unaware. Where knowledge does exist, whether enough individuals feel empowered by this knowledge to shape their media experiences is another question altogether.
Algorithms have been found to passively spread misinformation and other forms of false or misleading content. They are also proactively manipulated by highly media-literate people to amplify this content through coordinated engagement (e.g. commenting or sharing). The nature of algorithms in digital media is thought to enhance cognitive biases, which can generate new prejudices, reinforce existing beliefs and make critical thinking more difficult. Uneven distribution of algorithmic awareness must be tackled head on by media literacy initiatives, not seen as an “optional extra.” However, seeing algorithm education as a distinct “literacy” may not be the most appropriate framing when looking at their pervasiveness in the digital media environment.
Is “algorithm literacy” distinct?
During my research I came across two main definitions of “algorithm literacy”. The first is from Shin, Rasul and Fotiadis (2021), who define it as “a set of capabilities used to organize and apply algorithmic curation, control and active practices relevant when managing one’s AI environment.” The second is from Dogruel et al. (2021), who say that algorithmically literate individuals “are able to apply strategies that allow them to modify predefined settings in algorithmically curated environments, such as in their social media newsfeeds or search engines, to change algorithms’ outputs, compare the results of different algorithmic decisions, and protect their privacy.”
These definitions are complementary and focus on the digital media environment. They rely on individuals being aware of algorithms, understanding how they work and being able to critically evaluate algorithmic decision-making. This also means having the skills to cope with, and potentially influence, what algorithms show them. This might include both explicit and implicit actions to curate algorithms, such as the manual personalisation of the tools a platform offers, or adjustment of browsing behaviour. Conceptually, there is nothing to prevent algorithm education being integrated as an essential part of digital media literacy, rather than seen as a separate literacy.
What do we know about how algorithm education can be taught to children?
Speaking about algorithms immediately brings up associations with computer science classes, programming and coding. However, OECD work has shown that teaching coding as a vocational skillset may not be the best way to prepare children for an automated future. When it comes to algorithm education, an explicit focus on coding skills may also miss the point.
Instead, integrating algorithm education into digital media literacy might be better achieved by focusing on the “essence” of the topic. This can foster deep engagement with the underlying concepts of digitalisation without being distracted by the digital tools of the day. Although OECD data suggest that higher-order digital skills often receive less attention than basic operational skills, especially for younger children, media literacy practitioners are exploring innovative, offline approaches to algorithm education that could shift this focus.
Efforts can be enhanced by tightly intertwining algorithmic concepts, capabilities and strategies with computational thinking. Computational thinking involves solving problems, designing systems, and understanding human behaviour by drawing on concepts fundamental to computer science, such as problem decomposition (breaking complex problems down into simpler ones), developing step-by-step solutions and abstract thinking. These concepts are highly adaptable, and computational thinking is already taught both as an independent class and integrated into other subjects. For example, life sciences like biology rely on computational concepts such as abstraction, modularity and algorithmic logic to understand and model how structures (e.g. of organs and cells) operate within hierarchies to function as a system. In some OECD countries, computational thinking is also taught in the arts, in national language classes and even in early childhood education. Meaningfully tying this ongoing work in more closely with media literacy has potential.
When students understand how computer scientists think, they can better understand algorithm concepts such as tracking, recommendations, search optimisation, reinforcement learning, attention engineering and content filtering.
Three things that need to be done
Firstly, evidence has shown that pre-service teachers often express low levels of confidence in their understanding of social media as a tool to engage in debate, as well as knowledge of the role of algorithms and data. Systematic attention to the content of teacher training is required.
Secondly, research still lacks valid skills scales to design and evaluate robust algorithm education interventions. Many media literacy resources and competency frameworks now refer to algorithms, and some are specific to algorithm education. Defining valid ways of measuring algorithmic awareness, understanding and capabilities can enhance impact.
Thirdly, one of the unique challenges with teaching algorithm education is the opacity of algorithms themselves. Regulations targeting greater algorithmic transparency are part of ongoing work by policy makers in many OECD countries but must be stepped up. By increasing transparency of algorithms in digital media, children and youth can be truly empowered to critically analyse them.
Naturally, all of the above requires enhancing collaboration between media literacy stakeholders, teachers, librarians, policy makers, researchers and others, to ensure algorithm education is meaningfully integrated into practice and that “algorithm literacy” is not simply added to a literacy landscape a mile wide and an inch deep.
Author
Jordan Hill, Analyst, Organisation for Economic Co-operation and Development (OECD)