Changing your mind

April 1, 2026 · 7 minute read

Recently, my friends have been more outspoken about the frequency at which I change my decisions, which some of them term indecisiveness. When I (infrequently) choose to respond to this personal attack, it is usually with a rationalization of how my thought process is not inherently indecisive, and how most of my decisions are in fact made with intent and confidence behind them, until some new piece of information comes to mind and alters the decision. This is different from actual indecisiveness, which is more generally defined as an inability to make those decisions with certainty in the first place.

As implied by my lack of desire to deal with this revelation my friends have had, I don't think this is that big a deal, and I believe many other people behave in a similar way. But I do think the reason people conflate the two is that we have built a world where consistency is socially positioned as a proxy for intelligence. If someone holds the same position for a long time and defends it well, we tend to assume the position was the product of serious thought, when in many cases it is just the product of never having encountered (or never having been open to) a sufficiently good counterargument. Someone who has changed their mind on a topic multiple times in a short span may have done more genuine thinking than someone who has held the same stance for a decade, but the second person will almost always be taken more seriously, because their track record gives the appearance of having figured something out.

I feel that this issue is larger than it seems because society has structured our idea of credibility around the absence of revision. To say someone has strong conviction is almost always a compliment, and many times it is used simply to applaud their assertiveness. When we talk about a belief system, though, that compliment is rarely followed by the question of whether the ideas being held have ever been tested or revised. The praise is for simply having the idea, not for the thinking that may or may not have gone into it. When we reward the holding of beliefs as a trait in itself, independent of whether those beliefs are any good, we create an environment in which people are structurally discouraged from thinking critically beyond those beliefs. (*This is not a diss on those who hold firm beliefs; firm conviction is actually one of the traits I find most admirable.)

On that note, one perspective I hold that differs from the overconfidence of contemporary society is that changing our minds can sometimes be evidence of good thinking, and that part of the discomfort we feel when doing so is not a signal that something has gone wrong but a reflection of the stigma that comes with being perceived as indecisive.

Do you remember being told by teachers to trust your instincts and always keep the first multiple-choice answer you bubbled in, because the kids who changed theirs always ended up being wrong? That advice is the opposite of what the evidence shows. Studies consistently find that when students change an answer, they are far more likely to switch from wrong to right than from right to wrong, and the belief that our first instinct is usually correct persists mostly because people remember the times they changed to a wrong answer far more vividly than the times they changed to a right one.

Though this example is a bit elementary and not entirely centered on societal expectations, it does demonstrate that we have been conditioned to associate revision with error rather than with improvement, and that this conditioning starts early enough that by the time we are grown, the instinct to hold our original position feels like intellectual discipline.

The penalty we feel when changing our minds also grows worse with publicity. A thought you keep to yourself can evolve over years and nobody will ever know or care, but a position stated in front of other people becomes something you are socially accountable to in a way that has nothing to do with whether you genuinely believe it. People remember what you said, and when you move away from it, they are forced to reconcile the version of you that said it with the version of you that no longer agrees, and sometimes people resolve that tension by assuming that a change in perspective itself is the problem rather than considering what factors may have spurred that change.

The intellectual sunk cost fallacy at work here is completely understandable. Once you have invested in a position, whether emotionally, monetarily, or otherwise, abandoning it is not easy. The position is now something you hold because of what it has cost you to hold it, and at that point you are no longer reasoning about whether it is true, or whether you still agree with or want this particular thing; you are reasoning about whether you can afford for it not to be true. This, I think, explains why many people get stuck in jobs they don't particularly enjoy, and why so many say that once you start working, it's impossible to stop.

One more reason this is so difficult to overcome, even for people who are otherwise very capable of clear thinking, is simply that humans are loss averse. Kahneman and Tversky demonstrated decades ago that we feel losses roughly twice as intensely as equivalent gains, and I speculate that the effect is multiplied when social factors (specifically judgment) are in play. The marginal benefit of reversing a bad decision might be enormous, and the marginal cost of admitting the reversal might be relatively small in any objective sense, but the person standing inside that decision does not experience those magnitudes objectively, because they are weighing a concrete, immediate, and socially painful loss (the admission of error, the reputational cost, the vulnerability of having been publicly incorrect) against a gain that is abstract.

The institutional environment exacerbates this problem further, because organizations are generally built to validate the thinking of the people making the decisions. A CEO surrounded by people whose careers depend on the CEO's strategy being correct is not operating in an environment that will surface disconfirming evidence efficiently, and so the rigidity that might have started as a psychological discomfort with reversal becomes structurally reinforced by the information ecosystem around them. The sunk cost is no longer just emotional; it is organizational, reputational, and in many cases financial, and at that point the idea that someone could simply "look at the data and change their mind" becomes almost naive, because it assumes a kind of objectivity that the situation is specifically designed to prevent, even if that design usually exists for understandable reasons.

Politics is another great example. People will defend preferences they formed years ago not because those preferences still reflect who they are but because reversing them would require a kind of self-confrontation that most people would rather avoid. The position has been load-bearing for long enough that removing it would mean rebuilding whole parts of their identity, and that is not something many people are willing to do.

I could keep going, but hopefully by now you've thought of some examples of what I'm getting at. If you held a belief at some point, and that belief was based on the best version of what you knew at the time, and you then encountered something that genuinely did not fit inside it, letting the belief adjust is not a concession but the intellectually honest response available to you. The fact that doing so feels like loss, that it registers emotionally as weakness or inconsistency rather than as correction, tells us something important about how little our attachment to our own positions has to do with whether those positions are actually correct, and how much it has to do with the identity we have built on top of them. We do not hold onto ideas because we have continuously re-evaluated them and found them to still be right; we hold onto them because they have become structural, because other beliefs and relationships and self-conceptions are resting on them, and because the cost of change feels less like switching a perspective and more like depersonalization. But the willingness to do it anyway, to let a belief go when it no longer earns its place, even knowing that other people saw you hold it and will notice that you stopped, is probably as close to honest thinking as we will get.
