Designing Learning Experiences in an Evidence-Informed Way

Mirjam Neelen & Paul A. Kirschner

As you might know, we’ve been writing this research-based, evidence-informed blog for slightly over three years and we both still very much enjoy it. Our goal is to present learning professionals (teachers, learning designers, trainers, etc.) with evidence-informed ideas on how to make both the instructional and the learning experience more effective, efficient, and enjoyable (hence ‘3-star learning experiences’). We’re both passionate about this topic and we try to ‘spread the word’ as much as we can. For example, in the third week of June, Paul delivered a keynote about Urban Legends in Education at the Festival of Education in Wellington, UK and Mirjam presented on Evidence-Informed Learning Design at Learning Tech Day in Ghent, Belgium.

Although for us it’s clear what it means to work as Learning Designers in an evidence-informed manner, Mirjam noticed during her talk that it wasn’t necessarily clear to everyone in the audience what this meant. She realised afterwards that she should have checked with the audience beforehand how they would define evidence-informed learning design. For example, one of the people in the audience shared that they were using an evidence-informed approach because they collect data using xAPI and use that evidence to make learning design decisions and/or predictions. Now, though this is a good example of ‘evidence-informed practice’, it’s definitely not informed by scientific evidence and it’s not what we mean when we say ‘evidence-informed’. So, we need to be aware that there are different types of evidence.

The image below shows how evidence-based practice in clinical settings approaches these different types of evidence.


We thought it might be a good idea to explicitly explain what we mean by ‘evidence-informed’ AND to explain HOW you can start doing it (perhaps we should have written this blog 3 years ago!!! 😊).

What Is Evidence-Informed Learning Design?

Our definition of evidence-informed learning design is: learning design that uses evidence coming from scientific research (so it doesn’t have anything to do with using ‘learner data’ to make design decisions, although this is good practice as well, IF done right! 😊). Now, you might have noticed that we use the term evidence-informed and not evidence-based. This is for a reason.

Evidence-based practice is an interdisciplinary approach to clinical practice and is grounded in medicine. It’s “traditionally defined in terms of a “three legged stool” integrating three basic principles: (1) the best available research evidence bearing on whether and why a treatment works, (2) clinical expertise (clinical judgment and experience) to rapidly identify each patient’s unique health state and diagnosis, their individual risks and benefits of potential interventions, and (3) client preferences and values” (Wikipedia). Broadly speaking, if a decision is made about the intake and working of a medicine, it means that the medicine was tested and approved for a specifically defined population (e.g., an adult with a certain BMI and with specific symptoms and pathologies), and the instruction to take a pill in the morning on an empty stomach but followed by food intake allows for a wide range of specific circumstances (at home, in the car, on the beach, when and wherever, as long as it’s on an empty stomach in the morning).

Evidence-informed still means ‘based on scientific research’ but we’re in the field of learning sciences, which is an interdisciplinary science (see image). And here, lots of muddy and mucky real-life things influence an intervention’s effects.


We need to acknowledge that our field doesn’t usually deliver the quality of evidence that clinical practice does. This is simply because we’re dealing with so many variables that are extremely hard to (all) control. Literally, what worked with a class today at 9 AM won’t necessarily have the same effects on a different class at 3 PM, and ‘disruptive’ Johnny’s absence will lead to a completely different situation than when Johnny is present. Hence, when we use evidence, we need to acknowledge that what works in one context doesn’t necessarily work in another. We usually rely more on qualitative data and so, the evidence is weaker.

It’s useful to know what the levels of ‘scientific evidence’ are. We wrote a blog on this topic last year and Gorard’s table below gives an excellent overview of the levels of ‘design quality’ in research:


The reason why the lowest rating determines the overall trustworthiness of a study is this: even when a study is honest and large-scale, with hardly any dropout and with standardised results, if the intervention is described in a wishy-washy manner (i.e., you really don’t know or understand what the intervention exactly was) or if the conditions are not equivalent (e.g., the intervention group spent twice as much time working on the learning experience as the control group), then that study, overall, has a low trustworthiness and still only gets 2 stars.

So just keep an eye out for which category the provided resources fall into.

Now, here’s the good news. At Mirjam’s conference, everyone seemed to be open to and keen on working in an evidence-informed manner; however, they’re not necessarily confident that you can ‘just’ start doing it. For example, people in the audience said they never read scientific articles and asked, “How could I start?” That’s of course an excellent question, so let’s dive a bit deeper into that one.

How to Start Working in an Evidence-Informed Manner

Later that day in Belgium, luckily Dr. Pedro de Bruyckere came to the rescue by explaining how not to get fooled, by following a set of steps to unravel bullshit (or to find out what might actually be true). De Bruyckere uses Dan Willingham’s steps to enable you to make more informed decisions about what/who to believe when (Pedro has actually translated the originally English-language book “When Can You Trust the Experts?” into Dutch). Here are the steps:

Step 1: Strip it and Flip it

The first part, ‘strip it’, means that you take a critical look at the language used in, for example, a statement like the one below:


Is the language vague?

Yes. What does ‘changing our brains’ refer to? What kind of change? And what does ‘living online’ mean? We don’t know anybody who ‘lives’ online!

Is the language emotional?

Well, not necessarily the language, but people might have strong emotions when it comes to ‘autistic spectrum disorders’.

Is it ‘hyped-up’?

The topic is ‘popular’ because the prevalence of autistic spectrum disorders is rising, although we must also take into account that the expansion of diagnostic criteria plays a role here as well (we did some tracing here; see step 2). In addition, there’s a lot of ‘buzz’ around what ‘digital’ does for/to us as humans, so in that way it ‘responds’ to a hype.

The second part, ‘flip it’, means that you try to turn the argument upside down. For example, IF it’s true that ‘living online’ changes our brain in a concerning manner (no one likes the idea of an increase in autistic spectrum disorders), then wouldn’t it be just as likely that these changes could be utterly positive, such as an increase in brilliantly innovative people?

Pedro gave another example in the context of learning styles. The idea of learning styles intuitively and emotionally makes sense to people, but if you flip it and ask, “How do you feel about pigeonholing people?” (which is what you do when you think that people fall into a certain learning style category), then suddenly the idea sounds way less appealing.


Myers-Briggs is another way to pigeonhole people

Willingham recommends writing down the following statement:

“If I do X, there is a Y percent chance that Z will happen.”

So, in the above example from Susan Greenfield: “If I ‘live online’ there is Y percent (???) chance that I will get an autistic spectrum disorder.” Hm… how does the statement sound now?

Step 2: Trace it

This comes down to: Don’t just trust what people say because they’re an authority or a (self-claimed?) expert (we have blogged on this type of logical fallacy here and also on Eminence-Based Education, in which we discuss prominent ‘eduquack gurus’ in our field and why they cause damage to learners). This doesn’t mean you have to extensively research everything, but you do need to dig a bit deeper and ask yourself what kind of evidence there actually is for the claim. What kind of resources has someone used? Just take a critical look.

A British psychologist, Dorothy Bishop, asked a simple question in response to Greenfield’s claim: “Where’s the evidence?” So far, the silence is deafening.

This brings us to step 3…

Step 3: Analyse it

This step requires some basic statistical knowledge, but a critical eye can bring you quite a long way as well. Willingham suggests that if something sounds too good to be true, then it probably is. We would like to ‘flip’ that as well and say, overall (which goes back to the ‘strip it’): If a claim sounds very strong, too generic, or too dramatic, then it probably needs more nuance!

We also recommend finding people who do high-quality research-to-practice work (in our field, people like Pedro de Bruyckere, the Learning Scientists, Daniel Willingham, Carl Hendrick, Tom Bennett, Blake Harvard, Dylan Wiliam, Will Thalheimer, and Patti Shank do a really good job, and we’re doing our best ourselves as well 😊). This doesn’t mean you should believe these people blindly, BUT it will make it easier for you to trace and get a feel for the research that’s out there.

Step 4. Should I do it?

In our profession, most of the time this would be about questions such as: Should I apply this method, implement this strategy, buy this tool? And so forth…

We’d like to add a step 5, which is actually step 1…


Use the evidence to increase your knowledge and expertise so that you can have conversations with clients or partners, parents and colleague teachers, directors or school principals/headmasters, and so forth on WHY you recommend certain design decisions. It will improve your expertise, your value in organisations, and, most importantly, your designs, so that your learners can learn more effectively, efficiently, and enjoyably!