Testing Fatigue: When Tabletop Exercises Stop Teaching Us Anything
Have our tabletop resilience tests become too comfortable?
When tabletop exercises start validating existing controls instead of exposing new risks, resilience testing loses its edge. Here’s how to bring challenge back into the process.
Testing is one of the cornerstones of resilience, but what happens when testing itself becomes routine?
Tabletop exercises are often well intentioned, but over time they can drift into predictable territory: familiar scenarios, scripted outcomes, and the same lessons learned each time. The result is stale repetition.
This is what can be called "testing fatigue": the process still runs, but the learning stops.
Some practitioners and commentary on resilience suggest that many continuity exercises end up validating existing controls rather than exposing new risks. Reports from firms such as MHA Consulting and Riskconnect highlight that exercises are frequently treated as compliance events rather than learning opportunities, leading to repetitive outcomes and missed insights.
This pattern isn’t just anecdotal. Academic work on BCM maturity (for example, Riana Steen et al., 2023) shows that maintenance, testing, and continuous improvement are among the weakest areas of organisational resilience, suggesting that lessons from exercises often fail to translate into action.
True resilience testing should challenge assumptions, not confirm what we already know. It should make people think, feel pressure, and reveal gaps between our plans and our behaviours.
Breaking out of the cycle
Testing fatigue usually creeps in quietly: exercises stop evolving, and participation becomes more about attendance than engagement.
So how do we refresh a programme that’s stuck in routine?
Change the scenario lens. Alternate between cyber, facilities, people, and supplier disruptions.
Rotate facilitators. A different voice brings new perspective and challenge.
Link to live risks. Build exercises around real incidents, not theoretical ones.
Track lessons through to closure. Learning only counts when it drives measurable improvement.
Measure engagement, not attendance. Ask participants what they thought, what surprised them, what they took away.
These aren’t radical steps; they simply reconnect the exercise to its purpose: to test our capability, not our comfort.
Resilience isn’t proven by how often we test; it’s proven by how much we learn each time we do.