Neurotypica Lab Manual
Lab Manual · Phenomenon ref: the-rules-on-the-wall

The Rules on the Wall

Have you ever wondered why the values on the wall rarely match the behaviour in the corridor?

Every institution has two sets of values: the ones it displays and the ones it practises. When they diverge, the practised values always win, because the brain learns from experience, not from posters.

Most organisations have a mission statement, a set of values, a code of conduct. Most people in those organisations could recite at least some of them. And most of those people have watched the values get ignored, worked around, or actively contradicted by the very leaders who espouse them.

This isn't just cynicism. It's a predictable consequence of how the brain learns. You don't learn values from statements---you learn them from experience. The brain maps statistical regularities: what actually happens, who actually gets rewarded, what actually gets punished. If the stated values say "integrity" but the experienced reality says "results at any cost," the brain maps the experienced reality. The poster on the wall doesn't stand a chance against ten thousand daily observations of what the institution actually does.

Consider a military unit whose published values include integrity and accountability. But the commanding officer is known for protecting his favourites from consequences while punishing identical infractions by outsiders. The unit doesn't learn "integrity and accountability." It learns "protect the in-group, manage appearances, distribute consequences by status." That learning is automatic, continuous, and far stronger than any annual ethics briefing---because the briefing is one data point, and the daily experience is thousands.
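The arithmetic behind "one data point against thousands" can be made concrete. Below is a minimal sketch with hypothetical numbers: the learned value is modelled as a simple average over observations, where 1.0 means an observation consistent with "integrity is enforced" and 0.0 means one consistent with "consequences are distributed by status". The function name and counts are illustrative, not from the source.

```python
# Hypothetical model: a value is "learned" as the average of all
# observations, each scored 1.0 (stated values enacted) or 0.0 (not).

def learned_belief(observations):
    """Average signal across every observation the brain has accumulated."""
    return sum(observations) / len(observations)

# One annual ethics briefing asserting "integrity matters" ...
briefing = [1.0]
# ... against roughly a year of daily observations of in-group protection.
daily_experience = [0.0] * 2000

belief = learned_belief(briefing + daily_experience)
print(f"learned weight on 'integrity': {belief:.4f}")  # ≈ 0.0005
```

Under this (deliberately crude) model, the briefing moves the learned value by one part in two thousand, which is why the annual lecture cannot compete with daily practice.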

The phrase that captures this best is simple: the standard you walk past is the standard you accept. Every time a leader sees a violation and does nothing, the neural pathways of everyone watching update to treat that behaviour as within bounds. The violation doesn't even need to be dramatic. Small things---turning a blind eye to minor corner-cutting, laughing at an inappropriate joke, promoting someone known for bending the rules---each one writes a small update to the institutional map that everyone's brain is building.

How can the brain help us understand this?

Heuristics for understanding
01. The enacted values are the real ideology
The brain maps perceptions to actions through frames that highlight some meanings and sacrifice others. We don't choose between truth and ideology---we choose between ideologies.
How This Explains The Rules on the Wall

Every institution has an ideology---a set of frames that filter perception and guide behaviour. But the ideology isn't what's written in the values statement. It's what's enacted in daily practice. The stated values are aspirational; the enacted values are operational. And the brain, which maps regularities, learns the operational ones.

This is why institutional hypocrisy is so corrosive. When espoused and enacted values diverge, the system doesn't split the difference---it maps the enacted values and discounts the espoused ones. Worse, the divergence itself becomes part of the ideology: people learn that values statements are performative, that the institution doesn't mean what it says, and that the real game is reading the unwritten rules.

So what can you do? Close the gap from the enacted side, not the espoused side. Adding more values training when the daily experience contradicts the values doesn't help---it reinforces the lesson that values are talk, not action. Instead, change what actually gets rewarded and punished. Promote people who embody the values. Correct violations visibly and consistently. The ideology updates through practice, not through policy.

02. The environment teaches the real rules
How This Explains The Rules on the Wall

The input-output machine learns from what the environment provides. If the environment provides rewards for corner-cutting, that's what gets learned. If it provides consequences for violations, that's what gets learned. The rules on the wall are inputs too---but they're weak inputs, easily overridden by the much louder inputs of daily experience.

Think about monitoring density: how often do supervisors actually observe what their people do? In organisations with low monitoring density, the gap between stated and enacted values can grow enormous, because there's no environmental input reinforcing the stated rules. In organisations with high monitoring density, the gap shrinks---not because people become more virtuous, but because the environmental inputs are more consistent with the stated values.

But monitoring density has a dangerous failure mode: it tightens adherence to whatever norms the monitoring network enforces. When institutional-level oversight is weak but unit-level monitoring is strong, the mechanism can tighten adherence to deviant norms rather than institutional standards. The unit's enacted values become tightly enforced by peer pressure, while the organisation's stated values become irrelevant. This is how systematic violations can persist within a tightly disciplined group---the discipline is real, but it's disciplined adherence to the wrong norms.

So what can you do? Design the environment so the stated values have environmental inputs backing them up. This doesn't mean surveillance---it means presence, visibility, and consistency. Leaders being present, seeing what happens, and responding to it is the single most powerful environmental input for aligning enacted and espoused values. The social map updates from what it observes, and it observes constantly. And ensure monitoring operates at the right level. Unit-level monitoring alone is insufficient---if it can tighten around deviant norms, it will. External oversight, cross-unit observers, independent reporting chains, and rotation of personnel all help ensure that what tightens is adherence to institutional norms, not to the local subculture.

03. People predict from practice, not policy
The brain predicts what should happen next---in the world and in the body. When predictions fail, you feel something, attention pivots, and behaviour updates.
How This Explains The Rules on the Wall

The prediction engine builds its model from experience. If experience says "violations get overlooked," the system predicts that violations will continue to get overlooked, and it adjusts behaviour accordingly. If experience says "this leader means what she says," the system predicts consistency and adjusts behaviour to match. The prediction is built from every data point the brain has accumulated, and the stated values are a tiny fraction of that data.

This is why leadership transitions are so powerful---or so dangerous. A new leader changes the prediction. But the old predictions are deeply encoded, and a few instances of new behaviour won't override years of the old pattern. The new behaviour has to be consistent enough, and sustained enough, for the prediction engine to update.

So what can you do? Be relentlessly consistent, especially in the first months of any change. The prediction engine is watching, and it's looking for evidence that the new rules are real. Every inconsistency resets the counter. Every exception teaches the system that the new rules are as performative as the old ones. Consistency over time is the only signal strong enough to rewrite a deeply encoded prediction.
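The dynamics above---slow updating under consistency, sharp resets under exceptions---can be sketched with an exponential moving average. All numbers here are hypothetical assumptions for illustration: a small learning rate for routine observations, and a deliberately larger rate for a visible exception, reflecting the claim that exceptions are unusually informative.

```python
# Hypothetical model: the prediction shifts a small step toward each
# new observation (1.0 = "the new rules are real", 0.0 = "old pattern").

def update(belief, observation, rate=0.05):
    """Move the prediction a fraction `rate` of the way toward the observation."""
    return belief + rate * (observation - belief)

belief = 0.0  # years of "violations get overlooked", deeply encoded
# Twenty consistent corrections by the new leader:
for _ in range(20):
    belief = update(belief, 1.0)
print(f"after 20 consistent signals: {belief:.2f}")  # ≈ 0.64

# One visible exception, weighted heavily (an assumption of this sketch):
belief = update(belief, 0.0, rate=0.5)
print(f"after one exception: {belief:.2f}")  # ≈ 0.32
```

Even in this toy model, twenty consistent signals only get the prediction partway there, and a single heavily weighted exception gives back half the progress---which is the shape of the argument for relentless consistency.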

Sources

analects/everything-is-ideology.md

analects/ideologies-stack.md

analects/social-learning.md
