Thinking like the group
Groups coordinate which patterns are salient through shared practices---your brain maps those regularities automatically, absorbing beliefs and behaviours without deliberate choice.
Have you ever wondered why groups converge on shared beliefs and behaviours? You join a new workplace, friend group, or community, and within a few months you find yourself talking like them, valuing what they value, even holding opinions you didn't have before. Or you notice that everyone in a particular subculture---preppers, yoga enthusiasts, startup founders---seems to share not just one belief but a whole cluster of them, even when the beliefs don't logically connect. Or you try to voice a dissenting opinion in a tight-knit group and feel an almost physical discomfort, a sense that you're violating something unspoken.
This isn't about peer pressure in the obvious sense, where someone explicitly tells you what to think. It's about how brains learn from the social environment. Communities coordinate which patterns are salient through their shared practices, language, and rituals, and your brain maps those regularities automatically---just like it maps any other environmental pattern. You don't reason your way into group beliefs; you absorb them by exposure. Let's see what neurotypica helps us understand about why thinking like the group happens so predictably, and what you can do about it.
How can the brain help us understand this?
01.
Communities coordinate patterns
Brains link features into meaningful chunks; attention binds chunks into goal‑directed episodes---fast to use, hard to see past.
+
Your brain learns chunks by observing what co-occurs: which ideas get mentioned together, which behaviours follow which situations, which beliefs cluster in the people around you. Communities coordinate these co-occurrences through shared practices. When everyone in your group treats "distrust institutions" and "value self-sufficiency" as naturally linked, your brain learns that association automatically---not because you reasoned it through, but because those ideas travel together in your social environment.
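As a toy illustration (not a claim about neural implementation), this kind of chunk learning can be sketched as counting which beliefs co-occur across community members; the belief profiles and threshold below are invented:

```python
from collections import Counter
from itertools import combinations

# Hypothetical belief profiles observed across one community.
members = [
    {"distrust institutions", "value self-sufficiency", "stockpile supplies"},
    {"distrust institutions", "value self-sufficiency"},
    {"distrust institutions", "stockpile supplies"},
    {"value self-sufficiency", "stockpile supplies"},
]

# Count how often each pair of beliefs appears in the same person.
pair_counts = Counter()
for beliefs in members:
    for pair in combinations(sorted(beliefs), 2):
        pair_counts[pair] += 1

# Pairs seen together often enough get "chunked" into one association.
THRESHOLD = 2  # arbitrary illustration value
chunks = {pair for pair, n in pair_counts.items() if n >= THRESHOLD}
```

Nothing here reasons about whether the beliefs logically connect; frequency of co-occurrence alone produces the association, which is the point of the paragraph above.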
The topology of these social links matters. Strong, dense ties within a group---bonding capital---build trust and coordination but also insularity. The group reinforces its own chunks and resists outside information. Weak ties that bridge across groups---bridging capital---expose you to different chunks, different framings, different co-occurrences. Groups with only bonding capital harden their belief stacks. Groups that also maintain bridging capital keep the chunks open to revision.
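The topology point can be made concrete by labelling ties in a small social graph as bonding (within-group) or bridging (cross-group); the people, groups, and ties are invented examples:

```python
# Hypothetical two-group network: dense ties inside group A, one bridge to B.
group_of = {"ana": "A", "ben": "A", "cai": "A", "dee": "B", "eli": "B"}
ties = [
    ("ana", "ben"), ("ben", "cai"), ("ana", "cai"),  # bonding within A
    ("dee", "eli"),                                   # bonding within B
    ("cai", "dee"),                                   # bridge between A and B
]

# Bonding capital: ties whose endpoints share a group.
bonding = [t for t in ties if group_of[t[0]] == group_of[t[1]]]
# Bridging capital: ties that cross group boundaries.
bridging = [t for t in ties if group_of[t[0]] != group_of[t[1]]]
```

In this sketch, removing the single bridge leaves each group exposed only to its own chunks, which is the insularity the paragraph describes.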
So what can you do? Recognise that beliefs stack socially, not logically. If you're trying to change someone's mind about one belief in a cluster, challenging it in isolation often fails because the other beliefs in the cluster activate in defence. Instead, work at the level of practice: change the coordinated activities, shift which community someone is exposed to, or introduce new coordinated pairings that chunk the ideas differently. And build bridging capital deliberately---cross-group ties, rotations, exposure to different communities---so that alternative chunks have a chance to compete with the established ones.
02.
The group model is the safest prediction
The brain predicts what should happen next---in the world and in the body. When predictions fail, you feel something, attention pivots, and behaviour updates.
+
In social contexts, predicting "I'll do what the group does" minimises surprise and social cost. Your brain builds a model of what's normal, expected, and valued in each group, and acting to confirm that prediction is usually the lowest-error path. This is why dissent feels uncomfortable---it generates prediction errors not just in your brain but in the social feedback you get. The group expected you to agree, you didn't, and now there's mismatch.
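A minimal numerical sketch of this idea: the agent's model of the group is just the average of observed peer stances, and the "error signal" of dissent is the distance from that expectation. The stance values are illustrative only:

```python
# How strongly peers endorse a view, on a 0-1 scale (invented numbers).
group_opinions = [0.8, 0.9, 0.85, 0.9]

# The learned group model: what the brain expects "people like us" to say.
group_model = sum(group_opinions) / len(group_opinions)

def prediction_error(own_stance: float) -> float:
    """Mismatch between the group model's expectation and your own action."""
    return abs(own_stance - group_model)

conforming = prediction_error(0.85)  # near the group model: small error
dissenting = prediction_error(0.1)   # open dissent: large error signal
```

Acting near `group_model` is the lowest-error path; the discomfort of dissent corresponds to the larger error term.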
So what can you do? If you want to maintain independent thinking in a group, you have to tolerate the prediction errors that come with dissent. Expect discomfort---that's the system signalling mismatch between your action and the group model. You can reduce the cost by building relationships where dissent is explicitly valued, so the group model includes "sometimes we disagree" as a normal pattern. Or you can separate social belonging from intellectual agreement, so prediction errors in one domain don't threaten the other.
03.
Conformity is bias; diversity is noise
Bias trades flexibility for precision; noise trades precision for flexibility. Brains tune this trade‑off by context, stress, and uncertainty.
+
Groups naturally bias towards conformity because it speeds coordination and reduces conflict. Everyone using the same chunks, the same language, the same framings makes communication efficient and action aligned. But that bias suppresses alternatives---the group stops sampling different approaches, and bad ideas can entrench simply because they're shared. Diversity is noise: it slows things down and creates friction, but it's also what allows the group to explore and correct.
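The bias-noise trade-off maps naturally onto an explore/exploit sketch. Here `epsilon` plays the role of noise: with that probability the group samples an alternative instead of the shared default. The practice names and values are invented:

```python
import random

def choose_practice(shared_default: str, alternatives: list[str],
                    epsilon: float, rng: random.Random) -> str:
    """With probability epsilon, explore an alternative (noise);
    otherwise exploit the shared default (bias)."""
    if rng.random() < epsilon:
        return rng.choice(alternatives)
    return shared_default

rng = random.Random(0)  # seeded for reproducibility
# A tight group: zero noise, always the shared framing.
tight = [choose_practice("consensus", ["dissent", "outside view"], 0.0, rng)
         for _ in range(100)]
# A looser group: some noise, so alternatives get sampled too.
loose = [choose_practice("consensus", ["dissent", "outside view"], 0.3, rng)
         for _ in range(100)]
```

The tight group never samples anything but its default, which is fast but cannot discover that the default is wrong; the loose group pays friction for the chance to correct.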
So what can you do? Design for the right balance. When you need coordination and fast execution, bias is productive---tighten norms, reinforce shared practices, align the team. When you need innovation or course correction, introduce noise deliberately: invite dissent, bring in outside perspectives, create space for alternative framings. Don't mistake conformity for correctness, and don't mistake diversity for chaos. Both are tools; use them strategically depending on whether you need exploitation or exploration.
04.
Tight cultures enforce tighter norms
The brain maps perceptions to actions through frames that highlight some meanings and sacrifice others. We don't choose between truth and ideology---we choose between ideologies.
+
The degree to which groups enforce conformity isn't random---it correlates with perceived threat. Cultures that face persistent external threats tend to develop tighter monitoring and stronger norms. The mechanism is straightforward: threat increases the value of coordinated behaviour, which increases monitoring density (how closely members watch each other), which increases the certainty of sanctions for deviance, which constrains individual behaviour. This is what cross-cultural psychology calls tight-loose variation, and it explains why military units, religious communities, and marginalised groups often develop strongly enforced group norms while affluent, secure communities tend to be more permissive.
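The causal chain described here (threat raises monitoring, monitoring raises sanction certainty, which constrains behaviour) can be caricatured as a toy model. The functional forms and coefficients are invented for illustration, not empirical estimates:

```python
def clamp(x: float, lo: float = 0.0, hi: float = 1.0) -> float:
    """Keep intermediate quantities on a 0-1 scale."""
    return max(lo, min(hi, x))

def norm_tightness(threat: float) -> float:
    """Map perceived threat (0-1) to behavioural constraint (0-1)."""
    monitoring = clamp(0.2 + 0.8 * threat)         # how closely members watch
    sanction_certainty = clamp(monitoring ** 0.5)  # chance deviance is punished
    return clamp(monitoring * sanction_certainty)  # squeeze on tolerated deviance

secure_community = norm_tightness(0.1)  # affluent, low-threat: looser norms
threatened_unit = norm_tightness(0.9)   # persistent threat: tighter norms
```

The only structural claim encoded is monotonicity: more threat never produces looser norms, which is the tight-loose correlation the paragraph states.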
The ideological frames that structure group thinking aren't just about what the group believes---they're about how strongly those beliefs are enforced. In tight groups, the frames are policed: deviation is noticed, sanctioned, and corrected rapidly. In loose groups, the same frames exist but enforcement is lax, leaving more room for individual variation.
So what can you do? Recognise that enforcement density is a design variable, not a fixed feature of the group. If you're leading a tight group, monitor what your enforcement infrastructure is actually enforcing---it may be sustaining norms you'd rather change. If you want to loosen a group's thinking, reduce the monitoring density or introduce diverse information sources that weaken the enforcement loop. But be honest about the trade-off: loosening enforcement gains flexibility at the cost of coordination.
05.
Authority is granted, not imposed
+
The classic reading of Stanley Milgram's obedience experiments is that people blindly obey authority. But a closer reading suggests something more interesting: participants who continued weren't obeying mindlessly---they were actively engaged with the experimenter's project, because the experimenter represented their in-group identity as contributors to science. This is engaged followership: people comply with authority figures whom they perceive as representing "us." When the identification breaks---when a dissenter provides an alternative representative, or the authority figure stops seeming to represent the group's values---compliance collapses.
The input-output machine responds to who is in the room, what they represent, and whether they're seen as "us" or "them." The authority figure's identity is an input, and the behaviour---compliance or resistance---is the output. Change who's in the room or who they're perceived to represent, and the output changes.
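The input-output framing can be reduced to a deliberately crude truth table; the function and its two boolean inputs are a simplification for illustration, not a model of the actual findings:

```python
def complies(authority_represents_us: bool,
             ingroup_dissenter_present: bool) -> bool:
    """Compliance as an output: it persists while identification with the
    authority holds and no competing in-group representative breaks it."""
    return authority_represents_us and not ingroup_dissenter_present
```

Flipping either input flips the output: an authority seen as "them" gets no compliance, and a credible in-group dissenter collapses compliance even when the authority still claims to represent the group.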
So what can you do? If you want people to follow good leadership, ensure the leader visibly represents the group's identity and values. If you want people to resist bad leadership, provide alternative representatives---people who model dissent from within the group, showing that questioning is part of "who we are." The key is that authority flows through identity, not hierarchy. You don't break bad authority by removing the hierarchy; you break it by providing a competing representative who embodies different values. And if you want people to question bad decisions, ensure that the authority figure represents questioning as part of the group identity, not a threat to it.
06.
The group activates different parts of you
The mind as sub‑agents with competing goals; coordination, not unanimity, drives behaviour.
+
Group membership doesn't just add new beliefs---it activates different parts of you. Your society of mind contains a "work self," a "family self," a "mates self," and each of these sub-agents has its own priorities, values, and scripts. When you enter a group context, the sub-agent associated with that group comes to the front, and the others recede. This is why you can hold one set of values with your family and a contradictory set with your mates---different sub-agents, different group contexts, different scripts running.
The discomfort of dissent is partly a coalition problem. When the active sub-agent---the one aligned with the group---is asked to contradict the group, it feels like betrayal, because from that sub-agent's perspective it is betrayal. The part of you that sees the problem may exist, but it's a different sub-agent, and it's not the one currently in charge. Getting it heard requires the coalition to shift, which costs energy and social capital.
So what can you do? Create contexts where the questioning sub-agent has legitimacy. If "the person who asks hard questions" is a recognised role in the group---not a troublemaker, but someone doing a job the group values---then the sub-agent that does the questioning gets activated by the group context rather than in spite of it. Devil's advocate roles, structured red-teaming, and explicit norms around challenge all do this: they make dissent a group-endorsed activity, so the dissenting sub-agent and the group-loyal sub-agent aren't in conflict.
analects/ideologies-stack.md
analects/social-learning.md
analects/making-meaning-in-the-brain.md