Against Epistemic Humility and for Epistemic Precision
How mindlessly hedging our beliefs weakens us.
Hello, fellow knowledge enthusiast.
Knowing things is hard, sharing our knowledge accurately is even harder, and it is all too easy to claim more knowledge than we have.
Many, when confronted with this, fall back on Epistemic Humility. They err on the side of under-confidence.
This is bad. I’ll explain why here, and then point to an alternative virtue to cultivate instead: Epistemic Precision.
But first, let me give some context.
Social Bids
Before getting to Epistemic Humility, I should introduce the concept of Social Bids.
In the social world, we are all constantly making bids. And the impact of Epistemic Humility is better understood through them than through traditional philosophical epistemology.
—
For instance, we make bids for others to see things our way.
In a meeting, when a colleague says out loud, “Obviously, this is a bad plan”, they are doing more than sharing their point of view with the rest of the team.
They are making a bid for the team to accept that this is a bad plan. They are challenging people to contest their statement. If no one challenges them and people seem at least neutral to the statement, the bid will have been accepted.
Once the bid is accepted, anyone in the team can build upon the assertion that the plan in question is bad. As more people do so, it becomes more and more common knowledge.
—
Bids come in all shapes and sizes.
Parents and bosses make explicit and straightforward bids when they give orders to their children and employees.
We make bids to consider a proposition, to adopt a frame, or to use specific words.
We make bids, we counter people’s bids, we negotiate and compromise, we accept and reject the bids of others.
We bid using power, authority, status, emotional appeals, and to a much lesser extent, appeals to reason.
It’s the full chaos of the social world.
—
Most importantly, we have a Bid Budget.
Regardless of the type of bid, everyone knows and feels that they have a limited budget.
A parent can only order their children to do so many things before they revolt. The same is true for a manager or a captain.
Similarly, in a group, we can only make so many surprising statements before people start doubting us and in the end stop believing us.
Budgets are hard to manage. And bid budgets are no exception.
I have personally found that most people I care about under-use their bid budgets. They do not negotiate at work, they do not state their wants enough, they do not push their ideas, they do not claim the attention of the group they belong to when they should, etc.
Bids and Epistemic Humility
The impact of Epistemic Humility is better understood through Bids.
After all, from a purely epistemic standpoint, generic Epistemic Humility just tells us to be less confident in everything. It is almost void of meaning: lowering confidence in everything uniformly doesn’t change how one’s beliefs compare to one another, so it doesn’t change one’s beliefs in the end.
But Bids bypass traditional epistemic considerations.
Traditional Epistemology is about considering whether a statement is true or false, whether a plan is good or bad, whether an argument is sound or not.
Bids are largely about selecting which statements, plans and arguments are up for consideration in the first place.
This is crucial in the context of common knowledge: Plan A may be better than Plan B, but if Plan B is discussed more often, it is easier for people to coordinate around it.
For instance, I think of my current situation as an intellectual in these terms.
On the ideas front, I believe I am doing well: I consistently get complimented on my ideas.
However, I am failing to acquire enough social capital to bid for them.
This has very little to do with epistemic considerations, and much more with my (lacking) skills at earning social capital.1
—
The effect of generic Epistemic Humility is to weaken its followers.
The generic stance of Epistemic Humility dictates that we should be less confident in our beliefs because of our biases and epistemic failures.
At the very least, it recommends that we should express less confidence and qualify our sentences.
In practice, this is bad.
Sometimes, nerds say things like “I’m not sure but possibly [X]”, “Plausibly, [X]”, or “I think that [X] may be true”.
When nerds say this, they mean to make a regular-sized bid to add “[X] has a 20% chance of being true” to what the group believes.
However, groups (not necessarily individuals, but the groups themselves), will interpret it as them making a much-smaller-than-usual bid for the group to believe [X].2
Put differently, the main effect of generic Epistemic Humility is that its followers behave in a way that lowers their social impact. It doesn’t even make them more truthful, as it doesn’t help the groups they talk to calibrate on better probabilities.
—
The effect of selective Epistemic Humility is pernicious.
Selective Epistemic Humility dictates that we should be less confident in some beliefs.
There are two cases in which this may happen.
First is when it is actually warranted. Someone provided evidence that a specific belief was wrong. In that case, it is not epistemic humility at all to believe less in it. It is nothing more than changing one’s mind as a reaction to new information.
It is when it is not warranted that selective Epistemic Humility truly shines. It is an isolated demand for rigour (Scott Alexander has written on the topic) masquerading as an emotional appeal to humility.
In effect, the beliefs singled out by selective Epistemic Humility will gather less attention than others. It is a sneaky way to reduce their impact, without ever having to make the case that they are actually wrong.
Epistemic Cowardice
I tend to think of generic Epistemic Humility as Epistemic Cowardice.
Epistemic Humility often amounts to making decisions to protect one’s ego from the risks associated with uncertainty: the risk of being wrong, of people showing that we’re wrong, of being ridiculed for our beliefs.
In practice, I see people being ill-at-ease when they are uncertain. So, in a manner not too dissimilar to cognitive dissonance, they try to reduce their uncertainty by staying in their comfort zones.
Instead of Epistemic Humility, I find that I need people to be much more Epistemically Brave. I need them to figure out what’s best, and to be at ease with committing to courses of action even in the presence of uncertainty.
They should not quash their feelings of uncertainty and assume they are right; instead, they should stay the course and not waver in spite of the uncertainty.
—
Institutions and politics require a lot of Epistemic Bravery to go well.
From my point of view, both are sorely lacking in qualified, smart people.
In my experience, Epistemic Humility has been a direct cause of this.
Many smart people told me things like:
I’ll never engage with politics. It’s so corrupt, so hard to figure things out, and so easy to cause things to go wrong.
Only psychopaths or power-hungry people are willing to do this, and I am neither.
And as a direct result of these thoughts and feelings, they have made their decision to not engage with institutions and politics.
At best, this is Epistemic Humility gone awry, wherein people feel that they do not deserve to Take Power, even through the Rightful Means.
At worst, it is cowardice finding a convenient excuse in Epistemic Humility.
—
The same problem exists in situations with smaller stakes.
In many social situations (think of families, friend groups, corporate offices), when things get hairy, there are a lot of people who go “Oh, I don’t want to take sides in this drama.”
This attitude strongly empowers sociopaths. They know that until they screw up badly, they have a clear field to abuse people.
For the same reason, their victims are alone and have to fend for themselves. They have to bring neatly packaged irrefutable evidence to the group before it acts, which is very hard to do without the support of the group.
Victims often even get blamed when they try to prove what is happening! As long as the group doesn’t know who’s right, the victim’s attempts to gather evidence will be perceived as accusatory, mean and paranoid.
Sociopaths are aware of it. And they leverage it.
Victims are aware of it. They often resign themselves as a result of knowing people won’t help them.
This is morally bad.
I don’t mean that one ought to get involved with all the drama that happens around them: we are puny little humans. We have a limited ability and will to do what’s Good, and we thus necessarily do bad things.
But I think one should still be aware when we are doing something bad. We should especially not elevate our weakness to a virtue. We should not act as if we were superior to the people going through drama by virtue of ignoring it.
—
A friend responded to the previous section with:
I agree this is a real problem! Often, though, I think it arises out of avoiding acting out of fear of the consequences rather than any stance of Epistemic Humility.
That may be an excuse; I’ve more often heard people directly state that they didn’t want to make anyone mad in these situations, or that they prefer to reserve their role as a peacekeeper, and so on (and I have done so myself, for better and worse).
I believe my friend is largely right, but under-appreciates the extent to which ensuring that good norms are followed takes both common knowledge and someone willing to get their hands dirty.
These two things trade-off: the clearer a rule violation is, the easier it is to punish the violator. And conversely, as a situation gets less and less clear, the more the rule-enforcer will have to bid in order to investigate and fix it.
When someone violates a good rule, someone else must punish them, and it takes a toll. At the very least, they must go to the rule-breaker and tell them “Hey man, you broke the rule.”
The rule-breaker will always counter-argue. Either because they acted in bad faith, or because they acted in good faith and feel the need to justify themselves.
Thus the rule-enforcer will have to pay a social cost to make the bids needed to make it clear that nope, the rule was broken. They will be the one asking people to pay attention to the arguments of the different parties, and take on the role of both prosecutor and executioner.
Then, the rule-enforcer will additionally need to pay the social costs required to enforce whatever punishment follows: an apology to the group, a promise not to repeat the behaviour, a penalty, an exclusion, etc.
And finally, they will be the one to incur the dislike (or the wrath!) of the punished party.
This can all be mitigated by common knowledge and clarity. The more common knowledge there is about what happened and what the rules are, the smoother all of this goes, and the less it costs the person who enforces the rules.
—
“Oh, but, can we really know what’s good or bad? How can we ever figure out who’s right or wrong? Can we really know if they meant to act badly?”
Such undirected Epistemic Humility only serves to weaken common knowledge. It thus becomes much harder for someone of comparable status to fault someone else.
This is not hypothetical.
Join spaces that take pride in their open-mindedness and humility, and you will see a lot of enabling of sociopaths, born out of answering “Oh, but who is to know what is truly bad or not???” to any borderline bad behaviour.
In practice, enforcement in such places is even more asymmetrical than usual. It only happens when a person of low status hurts a person of high status.
This is in stark contrast with the Rule of Law, where Laws are Respected, and where everyone is equal before the norms.
Epistemic Precision
Now, onto a more positive vision. In fact, I have an alternative to Epistemic Humility.
Instead of Epistemic Humility, I recommend thinking in terms of Epistemic Precision. Epistemic Precision is not about being humble about what we know, but about being precise about it.
We are not random machines, outputting random sounds and writing random symbols.
There’s always a reason for why we think what we think, say what we say, and do what we do. Epistemic Precision is the practice of paying close attention to it, enough to get a reliable understanding of where one’s confidence comes from.
Let’s go through some examples.
Benchmark
I was once asked for feedback on a benchmark suite. It claimed to measure [some property].
But the author did not think that their suite was in fact measuring the property. They didn’t think the benchmarks were close to doing so, and they never used the suite themselves to evaluate [the property].
I thought this was thus blatant academic misrepresentation, and told the author I thought they were lying and should not do so.
—
They thanked me for reminding them to be Epistemically Humble, and said they would instead claim that their work was only a stepping stone towards measuring the property.
I vehemently disagreed!
The assertion “The suite measures [the property]” is wrong. Claiming it at 10% confidence is still wrong, just 10% as much. The direction of the claim was incorrect, not merely its strength.
So they asked me if I thought they should retract. They were sad about it, because they had put in a lot of work, and they thought it could still be useful.
I asked them why they thought it could be useful. They responded that they crawled through hundreds of benchmarks to build the suite, and that even though the benchmarks were bad, these were the closest they could find.
And my conclusion was: “Just say that!”
Indeed, “just saying that”, saying what they believed, would have transformed the project. It would have gone from “one more example of Academic Misrepresentation” to “Providing strong evidence for one of the major problems of the field: that its objective measures are thoroughly inadequate.”3
This is the magic of Epistemic Precision.
Soft Knowledge
The example above was a bit too academic.
Where I have found that Epistemic Precision shines the most is in situations that have to do with less cut-and-dried knowledge.
Here are a few small examples.
—
Quite often, I ask a question that seems deceptively easy. Not on purpose, it’s just how the world is: sometimes, questions seem easy to answer but they are not.
In these situations, I often get an unreflective answer, one that reflects their preconceptions more than their actual thoughts.
I then follow up with “Do you know the answer to be true, or are you inferring that it is?”
It helps them a lot with pausing and realising that they were making too many assumptions. Assumptions that I may not know about, that I may disagree with, that they themselves may disagree with, or which may turn out to be wrong.
—
On another note: as a human being, a lot of my knowledge is intuitive.
I value my intuitive knowledge quite a lot. But it is sadly much more subtle than formal knowledge, and thus hard to capture with words.
I also value other people’s intuitive knowledge. Sometimes, someone will make a claim that doesn’t seem natural to me. When I try to tease out why they make that claim, they become defensive.
Instead of an “I got this intuition from [doing X]”, they often launch into a barrage of rationalisations and fake, post-hoc, pseudo-logical explanations.
When I point out said rationalisations (by engaging with them, by telling them I want to learn about their intuitions, etc.), they fall back on some Epistemic Humility: “Oh well, I guess I don’t really know…”
This is so inefficient! There is a reason why they hold their intuitions and paid attention to them; it’s not just random noise. By paying attention to them, introspecting, and understanding where their intuitions come from, we can then both learn and infer even more than what they immediately intuit.
Instead of them being Humble, I want them to be Precise. I want them to tell me what their intuition feels like, what reinforces it, where they have seen it work well, etc. I don’t want a bland “Welp, I guess it’s not Proper Formal Knowledge and it is thus Worthless, I should be Epistemically Humble and ignore it :(“
Indirect Knowledge
I have found Epistemic Precision to also be very valuable in dealing with “Indirect Knowledge”. Indirect Knowledge is knowledge that I have not gotten by myself, but instead that I have gotten from books, people, social media, and the like.
To me, “Indirect Knowledge” doesn’t feel real, it doesn’t feel concrete.
—
For instance, when I think “The sky is blue”, I have a pretty clear impression of what I feel and expect. I can think of all the times I have seen the sky be blue; I know what I mean. It is real knowledge.
When I think “The sky is blue because of scattering”, it feels super fake.
I can of course try to come up with an explanation. I have learnt some optics and some wave physics, but man.
Let’s consider the first explanation I come up with: “The atmosphere is mostly made of nitrogen. And nitrogen scatters blue more easily than the other colours.”
Even though I said that, I don’t know why fog makes everything grey (“water scatters grey more?? it’s not even in the light spectrum!”); I don’t know why the sky is orange during twilights; I don’t know why it is purple during typhoons. And to be clear, I haven’t played much with gaseous nitrogen either.
This is why “The sky is blue because of scattering” doesn’t mean much to me.
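(As an aside, and as an attempt at Precision rather than Humility: the most I can reconstruct of the textbook story is Rayleigh scattering, where scattered intensity is supposed to grow as one over the wavelength to the fourth power, so bluer light scatters more. The little sketch below is only my back-of-the-envelope reconstruction of that claim, using approximate round-number wavelengths, not something I have ever measured myself.)

```python
# Back-of-the-envelope reconstruction of the textbook claim (Rayleigh scattering:
# scattered intensity roughly proportional to 1 / wavelength**4).
# The wavelengths are approximate round numbers, not measurements of mine.

BLUE_NM = 450.0  # rough wavelength of blue light, in nanometres
RED_NM = 650.0   # rough wavelength of red light, in nanometres

ratio = (RED_NM / BLUE_NM) ** 4
print(f"Blue is scattered roughly {ratio:.1f}x more than red")  # ~4.4x
```

Even with that arithmetic in hand, it remains indirect knowledge to me: I can recite the relation, but I have never measured scattered light myself, which is exactly the feeling I am trying to be precise about.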
—
If I ever read “A paper shows that liberal Christians are kinder than conservative atheists”, I wouldn’t even perceive it as “knowledge”. Like, this will not change, not at all, how I perceive liberals, Christians, conservatives and atheists.
The only knowledge that I would have gotten there is that a “researcher” wrote it, and managed to pass it as “science” to many people.
To a large extent, this is how I relate to most of my indirect knowledge.
At this point, whenever I read claims that are not tied to a specific operationalisation, I treat them as social claims. A form of emotivism.
Concretely, if I came to read the headline above today, it would purely register as “Hurray Liberal Christians! Boo Conservative Atheists!”
—
To be clear, there are things I do to internalise indirect knowledge.
For instance, I initially learnt about solid dynamics and mechanics from books.
But through many exercises, watching standard experiments, building things myself, talking to teachers about it, and more, I managed to make this knowledge mine.
When I talk about solid dynamics and mechanics now, I know what I mean. I can dig into the details.
I now consider it direct knowledge.
—
However, if I am asked about something that I only know indirectly, I make it clear to the person that the knowledge I am relaying is indirect.
If I have a good recollection of where I got it from, I’ll say something like “Word is on social media that […]”, “I have read in a book that […]”, “Some researcher wrote in a paper that […]”.
If I can’t, I’ll say “I can’t remember where, but I remember having heard/read that […]. I have never experienced it myself though.”
Or, the worst, “I can’t remember at all. But I think someone once said [X] or something similar. Given how little I remember about it, I don’t even know what they meant.”
At that point, is it even knowledge? I do not think I would purposefully change my decisions based on this, nor that any of my interlocutors ever would.
This is how I deal with indirect knowledge in the framework of Epistemic Precision. I have cultivated an ability to state what nature of knowledge I have, how indirect it is, rather than just a vague feeling of “I should be humble.”
Conclusion
Armed with all of these concepts, the core points of this essay can be neatly summarised.
1) Epistemic Humility doesn’t serve an epistemic function. Its main effect is to weaken one’s social bids.
2) It is thus a good fit in situations where people benefit from weakening their own bids and not standing for their beliefs. This makes it a convenient cover for Epistemic Cowardice.
3) Instead, the actual virtue is Epistemic Precision. Being clear and confident in one’s actual beliefs. While these beliefs may be intuitive or indirect, one should still be clear about them.
On this, cheers!
1. I am working on getting better at this, which is in large part why I am writing more :)
But if you want to help me with this, let me know!
2. One may wonder “But then, how can I confidently convey to the group that I only think that [X] has a 20% chance of being true?”
I don’t have a perfect solution. Group epistemics are not made for this.
If I mean that we can not decisively be confident in any option, I may say:
“I am making a strong statement. We do not understand the situation enough and certainty is unwarranted.”
If I mean to put the emphasis on the fact that [X] has at least a 20% chance of being true:
“I believe that although it is unlikely, we can not exclude [X] from our considerations and we should have contingencies ready for it.”
If, on the contrary, I want to state that it has at most a 20% chance of being true:
“At this time, it is unwarranted to give [X] too much attention. First, we must think more about the case where [X] is false.”
3. In the end, they stuck to claiming that they were measuring [the property].


