
First, a confession about the term ‘experiential learning’

Ask ten L&D professionals what experiential learning means and you will get ten different answers. Some will say it is learning by doing. Some will say it is Kolb’s cycle. Some will say it is anything that is not a slide deck. The term has been stretched so far that it now covers everything from a ten-minute role-play to a six-month action learning set, and calling both of those things the same thing is not particularly helpful if you are trying to decide which one to use.

Kolb’s model (1984) gave us the best foundation we have: learning as a continuous cycle of concrete experience, reflection, conceptualisation and active experimentation. What it did not give us was a clear distinction between the types of experience that produce different outcomes. That distinction matters enormously in practice.

The experiential family broadly contains six recognisable approaches: case-based learning, problem-based learning, project-based learning, simulation-based learning, service learning and challenge-driven learning. They share the same basic architecture. They differ in ways that fundamentally change what they can actually do.

“Learning is the process whereby knowledge is created through the transformation of experience.”

David Kolb, 1984

The failure to distinguish between these approaches is not just an academic problem. It produces learning programmes that look experiential on paper but are not doing the neurological and behavioural work that experiential learning is supposed to do. A case study is not a simulation. A simulation is not a challenge. These are genuinely different things and conflating them costs organisations real capability development.

A spectrum, not a category

The most useful way to think about experiential learning is as a spectrum rather than a single thing. At one end, the structure, the problem and the boundaries of acceptable answers are largely pre-determined. At the other end, the learner is navigating genuine uncertainty with real consequences attached to what they decide.

The experiential learning spectrum

More structured → More ambiguous

Approach         | Nature of problem                            | Learning environment
Case study       | Structured problem, known answer framework   | Clear boundaries, guided thinking
Problem-based    | Defined problem, open solution path          | Encourages analysis and multiple solutions
Simulation       | Scenario-based, modelled consequences        | Decision-making with controlled outcomes
Challenge-driven | Ill-defined, real stakes, real consequences  | High ambiguity, real-world complexity

Moving along this spectrum changes the learner’s relationship with the material in a fundamental way. At the structured end, the learner is solving a puzzle with known parameters. At the challenge-driven end, the learner must first work out what the puzzle actually is, and that work of problem definition is itself where much of the learning happens.

This maps directly onto professional reality. Most meaningful challenges at work do not arrive with clear boundaries and defined success criteria. They arrive partial, urgent and ambiguous. The person who has been trained to navigate structure will struggle with this. The person who has been trained to operate without it will not.

Most corporate learning programmes are clustered at the structured end of this spectrum. There is nothing wrong with that in isolation. The question is whether they are only operating there.

Worth Sitting With

Experiential learning asks: what did you learn from that experience? Challenge-driven learning asks: what will you do about a problem with no clean answer and real consequences attached? Those are different questions. They produce different people.

What challenge-driven learning actually does differently

Not a longer list. A different kind of learning.

01. The problem is ill-defined — on purpose

Problem definition is a learning outcome in itself

In case-based and problem-based learning, someone has already decided what the problem is. The learner’s job is to solve it. In challenge-driven learning, the challenge is deliberately underspecified. Participants must first ask: what is actually the problem here? That act of definition, the ability to identify what really needs solving and resist the pull towards premature action, is one of the most valuable professional capabilities there is. It is also almost completely absent from structured learning design.

We ran a commercial capability programme recently where teams were given a client brief with enough information to act on but not enough to be certain. Two teams spent forty minutes on detailed financial modelling before realising they had misread the brief entirely. The debrief on that moment alone was worth more than any lecture on commercial awareness we could have delivered.

02. Consequences that are felt — not just noted

The difference between knowing and caring

A case study has no consequences. Neither does a role-play. Simulations model consequences, which is better, but modelled outcomes and real ones are not neurologically equivalent. When a decision produces a tangible result that the learner genuinely cares about, something different happens in terms of how that experience is encoded. It gets tagged as important. The brain does not treat all experiences equally and the emotional weight attached to real consequence is part of what determines whether an experience becomes a durable memory or just another session they vaguely recall attending.

The consequence does not have to be severe. It just has to be real. A team whose bid wins or loses in a simulation feels something that a team who analyses the same scenario in a case study does not.

03. Engagement as structure — not aspiration

You should not have to beg people to take it seriously

One thing that experienced facilitators know, though it is rarely said openly, is that many experiential learning methods require significant facilitation energy just to sustain engagement. The room needs managing. Energy needs generating. Participants need to be invited, nudged and sometimes cajoled into treating the exercise as real.

Challenge-driven learning, designed properly, largely removes that problem. When the stakes are genuine, people do not need to be told to care. The facilitator’s job shifts from generating engagement to directing it, which is a more interesting problem to solve and a more sustainable design principle to build on.

04. Developing judgement — not just competence

The thing most training programmes cannot teach

Competence is knowing what to do in a known situation. Judgement is knowing what to do when the situation is unclear, the information is incomplete and there are legitimate arguments for more than one course of action. Most training programmes develop the former. Almost none develop the latter, in part because it is genuinely difficult to create conditions under which judgement is required rather than just competence.

Challenge-driven learning creates those conditions structurally. Because the problem is underspecified and the consequences are real, participants cannot rely on applying a known framework to a known situation. They have to make actual judgement calls. That is an uncomfortable experience for many people. It is also where the development happens.

05. Something is made

Artefacts travel; reflections rarely do

Most experiential learning produces an outcome that lives inside the learner: insight, changed behaviour, new awareness. Challenge-driven learning produces that, and it also produces something you can hold, show, share or screen. A proposal. A film. A solution presented to a real audience. This matters because producing something, rather than just discussing it, requires a higher order of commitment. You cannot half-make a thing. And the artefact travels back into the business in ways that personal reflection cannot. It gets shared with teams, used in onboarding, shown at away-days. The learning event becomes a source of ongoing advocacy rather than a memory that fades by the following Monday.

Where other approaches are genuinely better

An argument worth making honestly

There is a version of this article that would argue challenge-driven learning is always superior. That version would be wrong, and it would be the kind of wrong that damages trust in learning design more broadly.

Case-based learning: when structure matters first

Case-based learning is better when you need to build a strong knowledge base before you can deploy it. If someone does not yet understand the basic mechanics of financial modelling, throwing them into an ill-defined commercial challenge will not help them. It will just confuse them. Structure is not the enemy of learning. It is the foundation on which more demanding approaches are built.

Simulation-based learning: when risk is too high

Simulation-based learning is the right choice when the consequences of real failure are too severe to accept. Surgical training cannot use the challenge-driven approach. Neither can nuclear safety or crisis management at the highest level. In those contexts, fidelity and psychological safety within a contained environment matter more than authenticity of consequence.

The real design question

The design question is not which approach is best in the abstract. It is which approach is appropriate for this specific capability gap, in this organisation, at this point in people’s development. The answer will rarely be the same twice.

What is worth questioning is the instinct to stay at the structured end of the spectrum because it is easier to design, easier to facilitate and easier to defend when something goes wrong in the room.

Experiential learning vs challenge-driven learning

Broader experiential learning                  | Challenge-driven learning
Problem is defined by the designer             | Problem definition is part of the challenge
Engagement depends on facilitation quality     | Urgency and ownership are structurally built in
Consequences are modelled or discussed         | Consequences are tangible and felt
Produces insight and behaviour change          | Produces insight, behaviour change and an artefact
Develops knowledge, skill and attitude         | Develops judgement under genuine uncertainty
Outcome stays within the learning event        | Artefact travels back into the business
Better suited to building a knowledge base     | Better suited to developing applied performance

The question worth asking

The best diagnostic question for any learning programme is not whether it is experiential. Almost all programmes claim to be. The question is where it sits on the spectrum and whether that position was a deliberate design choice or simply the default that felt safest.

Challenge-driven learning is harder to design well. The ambiguity that makes it effective makes it uncomfortable to build and to run. Facilitators need to sit with moments of genuine uncertainty in the room, which is not something all facilitators are trained for. Organisations need to accept that the discomfort participants feel is not a failure of design but a feature of it.

What it produces in return is worth the investment. Not learning that participants remember fondly. Learning that changes what they do when a real problem lands on their desk at seven in the morning and there is no framework to reach for.

“The goal is not comfortable learning. The goal is learning that transfers. Challenge-driven design is the shortest route between the training room and the demands of the real world.”

MDA Training