Mindcraft Impuls

Experience manipulation live:
Why smart people fall for simple tricks

Reading time: approx. 5 minutes

On May 21, 2026, Eric Flury and Tomislav Bodrozic will give a talk at Take Aware in Duesseldorf that shows no slides about attack types and presents no checklists. Instead, participants will experience live how their own brains lead them astray, and why that is the real basis of every successful social engineering attack.

Announcement image for the Take Aware 2026 talk

Anyone reading this article knows the problem. Awareness programs are carried out, phishing simulations run, training is completed, and incidents still happen. Someone still clicks the wrong link. Someone still gives out data they should never have disclosed.

The usual explanation is that people were careless. Or distracted. Or they did not know better.

That explanation is wrong. And that is exactly what this talk is about.

The real question: why can smart people be manipulated?

Social engineering does not primarily work because people know too little. It works because human perception, attention, certainty, and judgment are systematically fallible under completely normal conditions, and can be deliberately influenced.

This is not a philosophical claim. It is an empirically well-supported fact from cognitive psychology, and it has direct consequences for how awareness programs must be built if they are really to work.

Manipulation does not work against the brain, but with its normal mechanisms.

The talk follows a clear line: four deceptions, four psychological stages, one uncomfortable conclusion.

Stage 1: attention can be hijacked

The first deception shows something most participants do not believe before they experience it themselves: even when people are focused and attentive, they still overlook essential things.

In cognitive psychology, the phenomenon is known as inattentional blindness. Daniel Simons and Christopher Chabris showed in their famous 1999 experiment that people who concentrate on a specific task reliably fail to notice salient, unexpected events happening in plain view, even when they believe they are fully attentive; in the original study, roughly half the viewers missed a person in a gorilla suit walking through the scene they were watching.*

The problem is not a lack of concentration. The problem is that attention already has a job, and everything outside that job automatically fades into the background.

For social engineering, that means that if an attacker makes one thing visible, attention is drawn away from the actual attack. The urgency of a fake email, the authoritative role of a caller, the familiar subject line: these are not tricks that work against the brain. They are levers that simply use it.

Stage 2: coherence feels like truth

The second deception makes something even more uncomfortable visible: even what we subjectively experience with certainty is not neutral reality. It is a plausible construction of our brain.

Coherence is quickly confused with truth. That is not a bug, but normal brain function.

The rubber hand illusion, first described by Botvinick and Cohen in 1998, demonstrates this impressively: the brain integrates contradictory sensory stimuli into a plausible whole, and it does so faster and more convincingly than we think possible.** The brain does not build a one-to-one copy of the world. It builds a hypothesis.

When enough signals fit together, something feels real. The role fits. The language fits. The timing fits. The medium fits. Coherence is then easily confused with truth. In companies, this is usually summed up afterward as: "it looked plausible."

Subjective certainty is not a seal of truth. That is one of the central sentences of the talk, and it has direct consequences for anyone who evaluates phishing simulations or social engineering tests.

Stage 3: familiarity makes us lazy about checking

The third deception shows how poorly we detect changes when the context feels familiar.

Change blindness, the inability to perceive changes in a scene when they are introduced at the right moment, was studied extensively by Daniel Simons and Daniel Levin.*** People detect context shifts, role changes, and identity changes much worse than they believe. If an environment appears familiar, the brain automatically reduces checking effort.

For attackers, this is one of the most valuable properties of human perception. A new contact person, a slightly altered email address, a minimally different process step: everything still feels familiar enough that nobody internally hits the brakes.

Almost familiar is often more dangerous than completely foreign.

Stage 4: even when thinking, we seek confirmation, not refutation

The fourth deception hits hardest those who rely on their analytical abilities.

Confirmation bias: we look for confirmation of what we already believe.

Confirmation bias, the tendency to preferentially search for information that confirms one's first hypothesis, is one of the best-documented cognitive biases of all. Peter Wason first described it in 1960; Daniel Kahneman later made it accessible to a broad audience in "Thinking, Fast and Slow".**** Once we have a plausible explanation, we stabilize it instead of questioning it.

That feels like analysis. But often it is only comfortable confirmation. In incident response, fraud assessment, and risk evaluation, this is a real problem. Manipulation does not end with the first impression. It becomes truly powerful when our own reasoning afterward helps to defend it.

The conclusion: what this means for awareness

If these four stages hold, it is not enough to inform employees about attack patterns. Mere recognition is too little.

Effective awareness has to train something else: interrupt the first impression. Do not confuse plausibility with truth. Do not confuse familiarity with legitimacy. Do not confuse the first hypothesis with insight.

That does not happen through slides. It happens through formats in which people do not merely hear about these mechanisms, but experience them firsthand. That is exactly the approach Mindcraft follows in its formats, and exactly what Eric Flury and Tomislav Bodrozic will show live on stage at Take Aware 2026 on May 21.

About the conference: why this talk matters now

Take Aware conference atmosphere and program flyer

Take Aware 2026 is held under the motto #8ORING: mindfulness and new boredom with and through AI. The background is a depth-psychological study by the Brand Science Institute, commissioned by the BSI, showing that AI not only saves time but also creates inner restlessness and idle time. And idle time, like underload in general, is a serious information security risk.

The talk by Eric Flury and Tomislav Bodrozic fits this frame more precisely than it may first appear. If AI changes employees' cognitive load by taking over routine tasks while also demanding concentration and judgment in new ways, then the four stages of the talk become newly urgent: who in an AI-shaped working day is still able to interrupt the first impression? Who checks plausibility when AI-generated content sounds plausible by definition?

For CISOs and awareness managers who want to understand why their programs do not produce the desired behavior change despite high investment, this talk offers a concrete point of entry: not another presentation about attack types, but a direct engagement with what makes awareness difficult at its core.

Take Aware takes place from May 19 to 21, 2026, at the b'mine Hotel in Duesseldorf-Flingern. The talk by Eric Flury and Tomislav Bodrozic is scheduled for May 21 at 1:45 p.m. in the main program.

Conclusion

Manipulability is not a sign of stupidity. It is the price we pay for the fact that our brains work quickly, efficiently, and usually sensibly. Attackers use precisely this strength and turn it into a weakness.

Good awareness does not begin with better explaining. It begins with better experiencing, checking, and interrupting.


* Simons, D. J. & Chabris, C. F. (1999). Gorillas in our midst: Sustained inattentional blindness for dynamic events. Perception, 28(9), 1059–1074.

** Botvinick, M. & Cohen, J. (1998). Rubber hands 'feel' touch that eyes see. Nature, 391, 756.

*** Simons, D. J. & Levin, D. T. (1998). Failure to detect changes to people during a real-world interaction. Psychonomic Bulletin & Review, 5(4), 644–649.

**** Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129–140. Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.