Even experienced mappers make predictable errors. Recognizing these patterns helps you avoid costly mistakes:
Linear Thinking in Non-Linear Systems
Cascades rarely follow straight lines. Feedback loops, amplification, and dampening create non-linear dynamics.
Martin assumed each restaurant closure would reduce food demand proportionally. He missed how closures concentrated demand in grocery stores, creating shortages even as overall demand fell. Non-linear thinking would have revealed this opportunity.
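The redistribution effect Martin missed can be sketched with a toy model. All numbers here are hypothetical, chosen only to illustrate how demand that shifts between channels can create a local shortage even while total demand falls:

```python
# Toy sketch (hypothetical numbers): why linear thinking misses
# demand concentration when restaurants close.

def grocery_shortage(restaurant_share_closed: float) -> float:
    """Return unmet grocery demand as a fraction of grocery capacity."""
    total_demand = 100.0      # baseline meals per day (arbitrary units)
    restaurant_share = 0.5    # half of meals normally eaten out
    demand_drop = 0.1         # overall demand falls 10% in the downturn
    grocery_capacity = 60.0   # groceries can supply 60 units per day

    remaining = total_demand * (1 - demand_drop)
    # Closed restaurants' demand doesn't vanish -- it shifts to groceries.
    shifted = remaining * restaurant_share * restaurant_share_closed
    grocery_demand = remaining * (1 - restaurant_share) + shifted
    return max(0.0, grocery_demand - grocery_capacity) / grocery_capacity

# Linear view: a 10% drop in total demand means no shortage anywhere.
# Redistribution view: closing 80% of restaurants overloads groceries.
```

With no closures the model shows no shortage; with 80% of restaurants closed, grocery demand exceeds capacity by roughly a third, despite the overall drop.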
Underestimating Human Adaptation
Humans adapt to cascades, changing their trajectories. Static models miss these adaptations.
Lisa mapped cascades assuming fixed behaviors. She missed how quickly people adapted to new conditions, creating different opportunities than her static model predicted. Dynamic models incorporating adaptation proved more accurate.
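The gap between Lisa's static model and a dynamic one can be shown with a one-line adaptation term. The decay rate below is a hypothetical parameter, not an estimate from any real data:

```python
# Toy sketch (hypothetical parameters): static models hold behavior
# fixed; a simple adaptation term lets disruption decay over time.

def disruption_over_time(steps: int, adapt_rate: float = 0.0) -> list[float]:
    """Disruption level per step; adapt_rate=0 reproduces a static model."""
    level = 1.0
    history = []
    for _ in range(steps):
        history.append(round(level, 3))
        level *= (1 - adapt_rate)  # people adjust, absorbing the shock
    return history

static = disruption_over_time(5)         # [1.0, 1.0, 1.0, 1.0, 1.0]
adaptive = disruption_over_time(5, 0.3)  # [1.0, 0.7, 0.49, 0.343, 0.24]
```

The static trajectory stays at full disruption forever; the adaptive one decays, which is why opportunities tied to the initial shock close faster than a fixed-behavior map predicts.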
Overestimating Cascade Completeness
Not every domino falls. Systems have more resilience than catastrophic thinking suggests.
David assumed complete cascade failure in air travel. He missed how essential travel continued, creating premium opportunities for safety and convenience. Partial cascade recognition would have revealed better positioning.
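Partial cascade recognition amounts to segmenting the market instead of treating it as one domino. The segment shares and sensitivities below are hypothetical illustrations of David's air-travel example:

```python
# Toy sketch (hypothetical segments): cascades rarely zero out a market.
# Splitting demand into segments shows how much survives a shock.

def surviving_demand(shock: float) -> float:
    """Fraction of air-travel demand remaining after a shock in [0, 1]."""
    segments = {
        "leisure":   (0.5, 0.9),  # (share of market, sensitivity to shock)
        "business":  (0.3, 0.6),
        "essential": (0.2, 0.1),  # medical, cargo crews, repatriation
    }
    return sum(share * (1 - sensitivity * shock)
               for share, sensitivity in segments.values())

# Even a maximal shock (shock=1.0) leaves a remnant willing to pay
# premiums: 0.5*0.1 + 0.3*0.4 + 0.2*0.9 = 0.35 of baseline demand.
```

The point is structural, not numerical: any segment with low sensitivity keeps the cascade from completing, and that surviving remnant is where premium positioning lives.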
Timing Precision Fallacy
Cascade timing is directional, not precise. Over-specific timing assumptions lead to missed opportunities.
Carol tried timing cascade events to the week. This precision proved impossible, causing her to miss windows that opened earlier or later than predicted. Flexible timing ranges work better than false precision.
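The difference between Carol's week-level precision and a flexible range can be made concrete by modeling the cascade event as a window rather than a date. The dates and slack below are hypothetical:

```python
# Toy sketch (hypothetical dates): treating cascade timing as a range
# instead of a point estimate keeps a positioning window open.

from datetime import date, timedelta

def window_overlaps(ready_from: date, ready_to: date,
                    est: date, slack_weeks: int) -> bool:
    """True if a readiness window overlaps the estimate +/- slack."""
    lo = est - timedelta(weeks=slack_weeks)
    hi = est + timedelta(weeks=slack_weeks)
    return ready_from <= hi and ready_to >= lo

est = date(2021, 6, 1)
ready = (date(2021, 5, 1), date(2021, 5, 15))
window_overlaps(*ready, est, slack_weeks=0)  # point estimate: missed
window_overlaps(*ready, est, slack_weeks=6)  # 6-week range: caught
```

A point estimate marks this opportunity as missed; widening the estimate to a six-week range catches the same window, which is the practical payoff of directional rather than precise timing.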