How to support learning for a brain that’s becoming more stupid over time

Mirjam Neelen

Let me start off with a disclaimer: This blog is based on Dr Itiel Dror’s session at this year’s Learning Technologies conference in London. I not only attended the session but also chaired it. So, I might be a bit biased when I say that his session, titled What Technology is Doing to Our Brains and What This Means for All Our Futures?, was one of the few sessions at the conference that really made me think.

The reason I wanted to write this blog isn’t just to summarise the intriguing session. It’s mostly because I believe that his research might help shift our thinking in L&D towards the things that really matter for teaching and learning.

Dr Itiel Dror studies the human brain, mind, and cognition. In addition to his academic research, he translates theoretical scientific understanding of how people learn and what drives behaviour into practical and tangible ways to improve human performance in real-world domains. His applied research primarily focuses on enhancing cognition through training, decision-making, and the use of technology.

Dr Dror had a key message that might sound a bit depressing.

Technology makes us more stupid.

He calls it the Paradox of Technology. Of course, technology has wonderful new capabilities, some that we might never have been able to imagine. However, I strongly agree with Dror that the fact that we have technology doesn’t mean that we should always use it. We need to consider how technology impacts us. Let me make clear upfront that Itiel Dror isn’t against technology (and neither am I!). But his message that technology makes us stupid, and what that means for us in practice as L&D practitioners, is worth pondering.

Dror asked and (partly) answered two key questions. Given that technology makes us more stupid (and I’ll explain in a bit what he meant by that – also see Dror & Mnookin, 2010 and Dror et al., 2012), how does that:

  1. Impact how we learn and teach?
  2. Change what we should be teaching and learning?

OK, what does all this ‘stupid this, stupid that’ refer to?

How does technology make us more stupid and why?

Dror and Harnad (2008) explain that ‘cognising’ – for example thinking, understanding, and knowing – is a mental state. Technology can contribute to human cognition, but that doesn’t make technologies ‘cognisers’ (after all, they don’t think, understand, or know). As human ‘cognisers’, we can offload some of our cognitive functions onto technology in various ways (e.g., by using a calculator). We can also partner with technology (e.g., when we use GPS), or we can even use technology to replace some ‘cognising’ (e.g., a self-driving car). This way, we basically extend our performance capacity beyond the limits of our own ‘brain power’ (see details in Dror & Mnookin, 2010).

However, although technology can help us extend our performance capacity, it also diminishes our individual human performance capacity. This is what Dror means when he says that we become more ‘stupid’, or, as he likes to call it, the Paradox of Technology.

Dr Dror explained why this is the case (and these examples aren’t exhaustive; there are many others). One of the reasons is what he calls ‘use it or lose it’. After all, if we don’t use our knowledge or skills, they degrade and are eventually forgotten and lost. Dror told us that he used to be great at parallel parking, but he lost the skill because of the sensors in modern cars. He recently found out when driving an older car without sensors: he got it wrong, again and again.

Another reason is that technology makes us lazier. For example, SPSS (a statistical analysis program) might help researchers to run their statistical analyses, but it also means that they often don’t spend sufficient time considering which statistical analysis to run, because it’s so simple and easy to do it with SPSS. We also take tons of photos without thinking, because it’s easy and free with our mobile phones these days. Technology makes us lazier and makes us think less.

Moreover, technology also makes us more ignorant, because when we use SPSS (rather than compute the statistics by hand), or use our smartphones to take photos (which automatically adjust the shutter size, speed, and many other parameters), we know and understand much less than we used to.

Then there’s confirmation bias. We do a Google search and we almost always find confirmation for what we believe (we may have to go down the list for a bit, but we will eventually find someone who agrees). I put this to the test: I found evidence that angels exist and that cats can actually talk with humans.

Last example for now (again, there are many more): we create echo chambers when using technology. There are many examples of how YouTube’s, Facebook’s, Twitter’s – and so the list goes on – algorithms give us more of what we look for, without us realising that we’re creating our own biased bubble. Hence, rather than exposing us to different viewpoints and challenging our thinking, technology is misused as a confirmation-bias apparatus.

In short, because of how we ‘collaborate’ with technology, our brains do less, aren’t challenged, and thus become more stupid.

The fact that our individual brains become more stupid isn’t necessarily worrisome because at the same time, we’re becoming smarter collectively. But that’s not the point.

Instead, the point is that we have to ask ourselves:

What does the fact that our individual brains do less because of technology mean for what we need to learn and how we should learn it?

Again, because we use so much technology, our brains get a bit lazy. Dror gave several examples of situations where workers’ ability to stay vigilant decreases because of technology. One example is airport security. Security staff scan suitcases and people all day long for dangerous goods and weapons. But how many do they find? I don’t have the stats, but it’s probably fair to say: very few. This causes the brain to fall into a base-rate bias (see Dror, 2020).
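To make that base-rate problem concrete, here is a minimal sketch in Python with entirely hypothetical numbers (remember, I don’t have the stats): because genuine threats are so rare, a screener who simply waves everything through looks almost perfectly accurate – and that is exactly the habit that erodes vigilance.

```python
# Hypothetical numbers only – this illustrates base-rate bias, not real security data.
bags_per_day = 10_000            # assumed throughput for one checkpoint
threat_rate = 1 / 1_000_000      # assumed: one genuine threat per million bags

# Strategy A: never flag anything – what the lazy, base-rate-biased brain drifts towards.
accuracy_never_flag = 1 - threat_rate

# Strategy B: stay vigilant and catch, say, 90% of the genuine threats.
hit_rate_vigilant = 0.90

expected_threats_per_year = bags_per_day * 365 * threat_rate

print(f"'Never flag' is correct on {accuracy_never_flag:.4%} of all bags")      # ~99.9999%
print(f"Expected genuine threats per year: {expected_threats_per_year:.2f}")    # ~3.65
print(f"Vigilance only shows its value on those few events (~{hit_rate_vigilant:.0%} caught)")
```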

Now, let’s go back to the questions.

  1. What should security staff learn? You might think: They need to learn how to recognise dangerous artifacts and weapons. Sure. But that isn’t enough. Because they see hundreds of people and bags a day on their scanner, they will lose focus. So, how do we help them to stay vigilant so that they don’t overlook that ONE weapon once every six months (just making up the number!)? This brings us to the second question.
  2. How should we teach them to stay vigilant? According to Dror, we can do this through the ‘terror of error’. This means that you provide an emotional, memorable learning experience. It starts with creating realistic scenarios that test common, daily, and even unlikely ‘dangerous artifacts or weapons’ situations. This could be someone trying to get a gun through security. But that alone isn’t enough! Staff also need to feel the emotional response that they would have when experiencing the consequences of NOT staying vigilant. For example, they could receive a fake message saying that an airplane has been hijacked because they overlooked the gun. This type of experience has a lasting and profound impact on their future performance. They should become terrified of making an error. That’s what will help them to stay vigilant!

Here’s the question that we, as L&D practitioners should keep front and center:

What does the fact that our individual brains do less because of technology mean for what we need to do to support people’s learning and how we should support them?  

Instead of diving into the most effective ways to support people’s learning here, I’m afraid we have to take a couple of steps back. Everyone who knows me a little bit knows that I think L&D’s foundation is cracked. After Dror’s talk, I wonder whether that’s because we got lazy ourselves because of technology. And even worse, it’s like we’re under its spell; we’re making ourselves completely dependent on it. We love our ‘learning’ technology. We’re fans. We even believe that technology drives the learning process! We make big claims like “<fill in tech> will disrupt L&D”. You can see my opinion on that kind of thinking in my recent LinkedIn post responding to Josh Bersin’s article on TikTok. I think it’s silly.

Technology makes us focus on the wrong questions, such as ‘How can we automate content curation?’, ‘How can we design learning paths using AI?’, or ‘How can we use technology to personalise learning?’. Do I think these are bad questions? No, not per se. But I do think they’re too focused on efficiency rather than effectiveness, and they distract us from the real questions and from fixing our foundation. Or perhaps the real questions take more effort and we’re too lazy to even ask them?

Let me give some examples.

Problem 1: Skill development needs change quickly in organisations.

Our usual solution: Let’s train people on generic/transferable skills as these will always be critical. Let’s offer content on our LXP on collaboration, communication, critical thinking, creativity, and so forth.

The problem with this usual solution: Generic/transferable skills are only generic/transferable at a very shallow ‘rules’ level. In order to develop them, you still need to practice in a specific domain, in the specific context of a task (see our blog ‘Domain-specific knowledge: 1, domain-independent skills: 0’). So, first, our usual solution doesn’t help people actually get better at these skills in the first place, and second, content alone isn’t going to build any skills.

A better solution: Spend more time on analysis. This way, we can better decide where we actually need to design experiences and/or support and guidance. That’s where we should focus our attention, NOT on curating content for so-called ‘transferable skills’. And if we find evidence that people do need to improve these skills through learning (and we really have to wonder, as these are all ancient human skills!), then we need to integrate the learning into the context of their job.


Problem 2: People don’t have time to learn.

Our usual solution: Small nuggets of content (also known as microlearning or nano learning – shoot me!), delivered through a technology platform (LXP, Teams, etc.).

The problem with this usual solution: First, it sends the wrong message, namely that people can learn effectively, efficiently, and enjoyably by going through a small piece of content and then getting back to work. Which we hopefully all know isn’t the case. Second, we seem to accept that people don’t have time to learn. We never ask: what should we do to ensure that people have more time (headspace, rather!) to learn? Or maybe they do have time to learn but find most of the learning useless… if they really benefited from it, perhaps they would make and find time for it.

A better solution: Spend more time on analysis. Perhaps we should challenge our rat-race organisations and ways of working. Perhaps we should design organisations, work, and learning so that we can help people stay focused for longer periods of time when needed. My hypothesis is that we’re making things worse by giving in to the lack of time and/or headspace (and/or priorities). People need to learn to stay focused in order to learn and develop new skills to do new things in their job or to do old things better.

After all, we know that learning is a process, and we know that real learning is hard and requires focus, especially when it comes to complex skills (see our blog here)!

One more, just for the heck of it…


Problem 3: People don’t complete the learning paths on our LMS and/or LXP, they seem to find them boring.

Our usual solution: Gamification!

The problem with this usual solution: Even if people enjoy a gamified solution, that doesn’t mean they actually learn something.

A better solution: Spend more time on analysis. What do people actually need? What is relevant for them? What is their context? And how can we design an experience and/or guidance and support to actually help people do new things in their job or do old things better?


Again, I’m not sure if it’s the only reason, but after Dror’s talk I really can’t shake off the idea that technology has not only made us L&D practitioners lazier (and more stupid?), it’s even worse than that! We seem to be tech-intoxicated! The good news is that, lazy as we may be, we can still sober up and do better! All we need to focus on are the two questions that Itiel Dror asked.

What does the fact that our individual brains do less because of technology mean for:

  1. What we need to do to support people’s learning and
  2. How we should support them? 

I offered an initial answer: simply spend more time on analysis! And only when we’re really dealing with a need that requires learning should we move on to designing learning experiences (including guidance and support), using an evidence-informed approach!

Let’s continue finding the answers to these two questions. I know it’s a lot to ask given that we’re all becoming increasingly lazy/stupid, but let’s at least try!

References

Dror, I. E. (2020). Cognitive and Human Factors in Expert Decision Making: Six Fallacies and the Eight Sources of Bias. Analytical Chemistry, 92(12), 7998–8004. https://doi.org/10.1021/acs.analchem.0c00704

Dror, I. E., Wertheim, K., Fraser-Mackenzie, P., & Walajtys, J. (2012). The impact of human–technology cooperation and distributed cognition in forensic science: Biasing effects of AFIS contextual information on human experts. Journal of Forensic Sciences, 57(2), 343–352.

Dror, I. E., & Mnookin, J. (2010). The use of technology in human expert domains: Challenges and risks arising from the use of automated fingerprint identification systems in forensics. Law, Probability and Risk, 9(1), 47–67. Retrieved from: https://nebula.wsimg.com/e9dcccb677f3606566852d3ca83e83cf?AccessKeyId=09634646A61C4487DFA0&disposition=0&alloworigin=1

Dror, I. E., & Harnad, S. (2008). Offloading cognition onto cognitive technology. John Benjamins Publishing.


8 thoughts on “How to support learning for a brain that’s becoming more stupid over time”

  1. Rob Foshay says:

    Bravo! This is a great statement of the need to understand how learning really works, and not to be seduced by the technology-of-the-week. Simple, superficial answers don’t work.
    I would expand your example about learners not having enough time to learn (or failing to persist in learning) to a more general argument based on the Human Performance Improvement (HPI) framework: training, at best, can only improve knowledge. Most failures to perform are due to the context and environmental factors that drive and limit performance. To succeed, training has to be designed so that (1) it works in this environment, if possible; (2) it is only expected to improve knowledge, not the other environmental factors; and (3) it is part of a coordinated change management initiative that also aligns the environmental factors that are often the barriers to performance.
    The problems come when we try to fix the non-knowledge factors with training, and when we are seduced into believing a simple answer (such as your examples of far transfer, or gamification) to the complex question of understanding the whole performance environment.
    I am reminded of one of David Berliner’s points (paraphrased): in education, the interaction effects are stronger than the main effects. If all you do is expect a big main effect (e.g., gamification or any other technology-of-the-week), you will always be disappointed.


    • 3starlearningexperiences says:

      Hi Rob, thank you for your reaction. Very much agree with your comment about other factors that might prevent people from performing. Don’t agree with your comment that “training, at best, can only improve knowledge”. Well-designed training will take the environment etc. into consideration and focus on knowledge, skills, and attitudes. But I agree that that’s USUALLY not what training looks like in an organisation.


  2. Roger Brownlie says:

    I agree with Dr Dror completely. My point is more that ‘resistance is futile’ and maybe roles that are being deskilled are close to being redundant anyway. If I’m head of security, do I invest my budget in L&D programs to improve vigilance, or do I upgrade my software to auto-flag suspicious items? Calculator vs paper and pencil? That’s the point of tech. And while it makes us stupid at some things, it frees us up to do other things. Like read blogs instead of handwashing the dishes.


    • 3starlearningexperiences says:

      I’m not sure what your point is, sorry. You seem to think that Dror somewhere implies that we should resist tech? Dror describes three roles of tech: offloading, partnering, and replacing. He explains the impact of how we use tech. It’s up to us to identify the trade-offs and wins and make a decision on what’s best and why (don’t just use tech because it’s available, but use it when it makes sense). The last point is that IF we are using technology, we have to realise the impact it has on what we need to learn (differently than before we were using the tech) and how we could best learn it.

