by Bart Rienties, The Open University, UK.
The Open University UK has been designing, implementing, and evaluating how to effectively provide meaningful learning opportunities for hundreds of thousands of learners across the globe. Since 2015, a range of learning analytics studies and practical intervention studies have shown that the learning design decisions made by educators substantially influence what, how, and when learners learn.
However, applying and translating learning design and learning analytics to other institutions, countries, and contexts is not a mere copy-paste job. In this contribution to the Media & Learning newsletter, I will reflect on some lessons learned about how you might start to think about implementing learning analytics and learning design in your own context, if you have not done so already.
Open University Learning Design Initiative
Of course, the COVID-19 pandemic has accelerated the use of digital technologies in education, including how educators design, instruct, assess, and use data. One prominent learning design approach, originating from work dating back to 2004, is the Open University Learning Design Initiative (OULDI). OULDI focuses on ‘what learners do’ as part of their learning, rather than on ‘what educators do’ or on what will be taught. As explained at the Media & Learning Conference 2023, a wealth of studies (for example, Nguyen et al., 2017; Rizvi et al., 2022) have shown that these learning design decisions fundamentally drive and predict how learners learn. To put a number on it: several studies show that around two-thirds of all learning behaviour is directly influenced by how we as educators design our daily and weekly activities.
This is perhaps a striking number, as I often speak to educators who primarily seem to “blame” learners for not being more successful in the “marvellous” learning designs they have created. However, if most learners follow more or less exactly the pathways that educators have developed, surely some of the responsibility for learners dropping out, or being less happy with a particular learning activity, should also be linked back to the educator?
How to help educators make better learning design decisions?
One obvious way to help educators make better learning design decisions is to improve the way we illustrate their decisions back to them. In part, this could be done by providing learning analytics data on actual learner behaviour back to educators, helping them to reflect on whether their learning design activities match what learners have actually been doing. However, this is often only feasible after a particular learning activity has been designed, implemented, and evaluated by the most important critics of our learning activities: our learners.
Therefore, another option is to feed educators’ initial learning design decisions back to them in a meaningful manner. This is where the power of visualisations comes into play, in particular learning analytics dashboards. Our free-to-use Balanced Design Planning tool might be useful for educators who want immediate feedback on their initial learning design decisions. The tool provides automatic insight into how well educators’ learning outcomes and learning design decisions are aligned, and how to improve that alignment across learning activities. Have a look for yourself to see how it works!
Author
Bart Rienties, The Open University, UK
Acknowledgement
We are grateful to all the educators, learning designers, and others who have provided input and suggestions for the BDP tool and concept. This study was conducted within the project “Innovating Learning Design in Higher Education (iLED)”, financed by the Erasmus+ Programme of the European Union and approved under the Erasmus+ programme – KA2 – Cooperation partnerships in higher education.
References can be downloaded here.