The world is in the grip of the coronavirus. Schools have been closed and people are urged to work from home if possible. Educational institutions and organisations alike are trying to figure out how to help their students or workers learn while at home. At this point, they’re forced to redesign their current offerings from face-to-face to digital at a distance. Although there are many pitfalls (redesigning from in-person to virtual requires a careful design process), we thought it might be helpful to give some tips & tricks. Wilfred Rubens, a friend and colleague, has recently written some useful blogs in Dutch and has kindly agreed to allow us to translate them into English. They’re all based on the book ‘Wijze lessen. Twaalf bouwstenen voor effectieve didactiek’ (‘Lessons for Learning: 12 Building Blocks for Effective Teaching’, which is at this very moment being translated into English), written by Tim Surma, Kristel Vanhoyweghen, Dominique Sluijsmans, Gino Camp, Daniel Muijs and Paul A. Kirschner.
In the book, Surma and his co-authors discuss how to teach effectively using twelve evidence-informed instruction principles. Wilfred saw an opportunity to elaborate on the building blocks by teasing out the relationship between each building block and learning technologies. He published 12 blogs – one for each building block – in which he explained how learning technologies can be used to facilitate and strengthen the relevant building block (you can find the original blogs here).
This is the tenth one. Stay tuned for more! We (Mirjam and Paul), together with Tim Surma and Kristel Vanhoyweghen (researchers at ExCEL – the Expertise Centre for Effective Learning, of which Paul A. Kirschner is guest professor – and also authors of the book) and Tine Hoof (also a researcher at ExCEL), are working hard to translate the remaining building blocks, and how learning technology can strengthen them, into English.
The authors of Lessons for Learning summarise this building block as follows:
We often think about how to get things into learners’ heads [storage], while it’s equally important to think about how to get them out of their long-term memories [retrieval]. When learners practise recalling information that they’ve processed and learnt, it strengthens their memory as compared to more passive strategies such as rereading. As a result, learners remember the subject matter longer and better.
This building block is actually a part of building block 5: use learning technology to make learners process the subject matter actively, but according to Tim Surma and his co-authors it deserves more attention. By asking learners questions about content that has already been dealt with, you strengthen the act of remembering information (retrieval practice). In this chapter the authors discuss several strategies to do that, for some of which learning technology can be used (e.g., an online quiz via Socrative). However, learning technology can help you in more ways:
- A quiz at the beginning of a class or series of classes (for example via an online quiz programme or a test that you make yourself).
- An exit ticket at the end of a class (for example via an online form).
- Using flash cards to practise.
- Having learners take digital Cornell notes so that they have more options to use multimedia as illustrations.
- Free recall (‘What do you remember about …?’) through tools like Padlet.
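To make the flash-card idea concrete, here is a minimal Python sketch of a retrieval-practice drill. It is purely an illustration of the principle (answer from memory first, then check); it does not reflect the API of any real tool such as Socrative or Padlet, and all names in it are made up for this example.

```python
# Minimal retrieval-practice drill: the learner answers from memory,
# then the answers are checked and missed cards go back into the deck.
# Illustrative sketch only; real flash-card tools do far more.

def run_drill(cards, responses):
    """Score one round of flash-card practice.

    cards: dict mapping prompt -> expected answer
    responses: dict mapping prompt -> the learner's answer
    Returns (score, missed) where missed lists prompts to re-practise.
    """
    missed = [prompt for prompt, answer in cards.items()
              if responses.get(prompt, "").strip().lower() != answer.lower()]
    return len(cards) - len(missed), missed

cards = {"Capital of France?": "Paris", "7 x 8?": "56"}
responses = {"Capital of France?": " paris ", "7 x 8?": "54"}
score, missed = run_drill(cards, responses)
# score == 1; the missed prompt ("7 x 8?") would be re-practised later
```

The point of the sketch is the loop, not the code: prompts that were not recalled correctly are fed back into practice, which is exactly what dedicated flash-card applications automate.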
The advantages of using learning technology are:
- You have more options to alternate between different types of learning activities.
- You, as instructor/facilitator, don’t have to carry around a lot of material.
- You can have students do these activities at their own pace, whenever and wherever they want to or can. During an online session you can talk about misconceptions, commonly made mistakes, points that are debatable etc.
- You can easily monitor whether learners are actively involved. You can potentially involve more learners using learning technology.
- You can quickly process and analyse the results of tests, especially for closed-answer questions.
- You can save the results more easily.
Another strategy is having learners test themselves with open, more complex questions about a certain topic. Learners answer these questions and receive an ‘expert answer’ prepared by you, the instructor/facilitator, against which they can compare their own answers.
Also, when using online tests you can give automatic feedback. A meta-analysis of research on this shows that when learners study a text, computer-generated feedback supports them more than personal feedback (I [Wilfred Rubens] suppose that is because computer-generated feedback is given immediately after finishing the test). In that respect it’s important that the reading isn’t constantly interrupted by feedback. In other words, learners receive feedback as soon as they finish the test instead of question by question.
Many test applications / functionalities allow you to choose when the learners receive feedback (after answering the question, on request, after finishing the test). When dealing with texts it is recommended to give feedback immediately after the test.
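To make the recommended feedback timing concrete, here is a minimal Python sketch of the deferred-feedback pattern: all answers are collected first, and feedback is only shown once the whole test is finished, so the learner’s work with the text isn’t interrupted question by question. This is an illustration of the principle, not the API of any real testing environment; all names are made up.

```python
# Deferred feedback: grade everything after the test is complete,
# instead of interrupting the learner after each question.

def grade_after_test(questions, answers_given):
    """questions: list of (prompt, correct_answer, feedback) tuples.
    answers_given: the learner's answers, in the same order.
    Returns a report: (prompt, correct?, feedback-or-None) per question,
    to be shown only once the whole test has been submitted."""
    report = []
    for (prompt, correct, feedback), given in zip(questions, answers_given):
        ok = given.strip().lower() == correct.strip().lower()
        report.append((prompt, ok, None if ok else feedback))
    return report

questions = [
    ("Does retrieval practice strengthen memory more than rereading? (yes/no)",
     "yes", "Actively recalling information strengthens memory."),
    ("Name a passive study strategy mentioned in the text.",
     "rereading", "Rereading is the passive strategy contrasted with recall."),
]
report = grade_after_test(questions, ["yes", "highlighting"])
# only now does the learner see which answers were wrong, with feedback
```

The same structure, with per-question feedback instead of a single end-of-test report, would correspond to the ‘after answering the question’ setting that many test applications also offer.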
Making practice tests takes time and energy
Making practice tests does take effort, but again technology can help. I [Wilfred] am not only referring to the possibility of creating a database of questions together with your colleagues (even though that is important), but also to relatively new applications like Quillionz. This application uses natural language processing to compose different types of questions and answers based on texts. Instructors/facilitators can indicate keywords in texts, adapt generated questions and, for example, import them into their online testing environment. Quillionz’ paid version supports not only knowledge questions but also application/WH-questions (who, what, where, when, why, and how).
At the moment, the application can only process English texts; other languages are not yet supported. Also, the application isn’t able to create questions and answers for every text type.
Content that is structured, descriptive, and factual gives the best results with Quillionz. However, content that is very subjective, expressive, highly specialised to a domain, or full of specific jargon might not generate the best questions.
I [Wilfred] haven’t tried the application yet and undoubtedly Quillionz doesn’t run perfectly, but I do think it’s a promising development.