Epistemic status: an explainer and some thoughts
Epistemic status: …hang on a second.[1]
It’s common to see posts on the EA Forum (this platform) start with “Epistemic status: [something about uncertainty, or time spent on the post].”
This post tries to do three things:[2]
Briefly explain what “Epistemic status” means
Suggest that writers consider hyperlinking the phrase (e.g. to this explainer)
Discuss why people use “epistemic status”
What does “epistemic status” mean?
According to Urban Dictionary:[3]
The epistemic status is a short disclaimer at the top of a post that explains how confident the author is in the contents of the post, how much reputation the author is willing to stake on it, what sorts of tests the thesis has passed.
It should give a reader a sense of how seriously they should take the post.
I think that’s a surprisingly good explanation. If commenters can add to it, I’ll incorporate their elaborations here.
A bunch of examples:
Epistemic status: Pretty confident. But also, enthusiasm on the verge of partisanship
Epistemic Status: I have worked for 1 year in a junior role at a large consulting company. Most security experts have much more knowledge regarding the culture and what matters in information security. My experiences are based on a sample size of five projects, each with different clients. It is therefore quite plausible that consulting in information security is very different from what I experienced. Feedback from a handful of other consultants supports my views.
Epistemic status: personal observations, personal preferences extrapolated. Uses one small random sample and one hard data source, but all else is subjective.
Epistemic status/effort: I spent only around 5 hours on the work test and around 3 hours later on editing/adapting it, though I happened to have also spent a bunch of time thinking about somewhat related matters previously
Epistemic status: uncertain. I removed most weasel words for clarity, but that doesn’t mean I’m very confident. Don’t take this too literally. I’d love to see a cluster analysis to see if there’s actually a trend, this is a rough guess.
Epistemic status (how much you should trust me): Engaging with the Forum is my job, and I ran this by a few people, who all agreed with the argument. One person was surprised that this was an issue. So I’m more confident than usual.
Epistemic status: Writing off-the-cuff about issues I haven’t thought about in a while—would welcome pushback and feedback
Epistemic status: Divine revelation.
Epistemic Status: In this post I’m mainly referring to university group community builders. It’s possible that a lot of what I say will still apply to city / country / other groups, but I’m less confident of this. In my problem section, I give some percentage estimates of how much organizers are marketing (defined later) and how much they should be marketing. This is based off of some rough estimates, which I’m not confident in. I’d love to see someone better estimate this or run a survey.
Epistemic status: a rambling, sometimes sentimental, sometimes ranting, sometimes informative, sometimes aesthetic autobiographical reflection on ten years trying to do the most good.
Note: epistemic confidence is lower here, as not much time was spent looking into these relative to other areas.
Epistemic note: I am engaging in highly motivated reasoning and arguing for veg*n.
Epistemic Status: I am uncertain about these uncertainties! Most of them are best guesses. I could also be wrong about the inconsistencies I’ve identified. A lot of these issues could easily be considered bike-shedding. [This post also includes: “Effort: This took about 40 hours to research and write, excluding time spent developing Squiggle.”]
Most of these express a lot of uncertainty. I’d be excited to see more posts that lean into their beliefs — if that’s the real position of the author.
Valuable information to list in an epistemic status
Some things I think are especially useful to include, when relevant:
Biases you might have
E.g. “I’m funded by the main organization discussed in this post…” or “I’m arguing that this should be a priority area, but it’s also what I specialized in, so there might be some suspicious convergence.”
This can also include reasons the data you’re using might be biased (e.g. if you’re talking about events, but only have experience about certain types of events).
The main reasons (the cruxes) you believe what you write, especially if it might not be obvious from the body of the post (e.g. if you reference data or information that’s not actually crucial to your personal belief in the conclusion)
E.g. “I list some data in this post, but the strongest factor in me believing the conclusion I describe is my personal experience.” Or “The main reason I wrote this post is because of a conversation I had with a professional. Arguments 3-6 presented here are more like add-ons after I thought about it a little longer.”
This is very related to Epistemic Legibility.
Your qualifications
E.g. “This is not my field of expertise, but I read [this book] about it,” or “I have a Ph.D. in a related field, and have thought about this for at least 80 hours.”
The effort you put into this post and into making its claims very precise
E.g. “I wrote this up in 30 minutes, might have made mistakes or misstated my actual opinions,” or “I spent 40 hours researching this subject and writing this report.”
The number and type of people who gave feedback on the writing or project, and broadly what their feedback was
E.g. “I ran a sketch of this argument past 2 friends who are also in my field, and they broadly agreed.” Or “These are consolidated views. X, Y, and Z gave me feedback on this, and I’ve incorporated it.”
If a longer post makes several distinct claims, I think it’s also often very useful to add epistemic statuses, or “how much I believe this and how much you should trust me” notes, to the individual sections, as in this post.
Consider hyperlinking the phrase
Epistemic status for this section: I think the harm I describe is real, but I’m not sure it’s actually very big. I’m pretty confident more people should just hyperlink the term, though.
I think the phrase “epistemic status” can be confusing to some readers, and some of what Michael writes in his post, 3 suggestions about jargon in EA, applies here. In particular, while I think adding epistemic statuses is often very useful (see below), newcomers might be disoriented by it, especially at the very top of a post. This seems especially relevant for posts that are aimed at a wider or more general audience.
One solution: simply hyperlink this post (or some other explanation) when you use “epistemic status.”
(As a reminder, you don’t need to have an epistemic status note.)
Why have an “epistemic status”? (And are there reasons to not have one?)
Adding a field like this can help readers broadly understand how seriously they should take what’s been written, highlight biases of the writer that readers might not be aware of (but should be), make the post more legible, and clarify the purpose of the post. I think this is especially important if the author is someone whose views might get accepted because they have some status or authority.
Epistemic statuses can also help us discuss things more collaboratively. If I add “Epistemic status: just figuring things out, really uncertain” at the top of my post, commenters might feel more welcome to point out the flaws in my argument, and might do so more generously than if I had just straightforwardly argued for something incorrect.
And, importantly (and relatedly), epistemic statuses can help us avoid information cascades, a way of collectively arriving at false beliefs when people defer to each other without understanding the true reasons others believe something. (Here’s a silly toy example: imagine three people, A, B, and C, trying to understand a given subject. C knows that A and B both hold belief X and, deferring, decides that X is probably right. A and B notice that C also thinks X, and become more confident in X. In reality, B only thinks X because A thinks X. So everyone is depending on A’s belief in X, which could be the result of only one bit of independent information.)
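For the programmatically inclined, here’s a minimal sketch of that toy example in Python. The dictionary structure and field names are my own illustration, not anything from the post or the cascade literature; the point is just that counting believers overstates the evidence.

```python
# Toy model of an information cascade among three people.
# Each entry records whether the person holds belief X, and why.

beliefs = {
    "A": {"holds_X": True, "reason": "independent"},       # A looked at the evidence directly
    "B": {"holds_X": True, "reason": "defers to A"},       # B believes X only because A does
    "C": {"holds_X": True, "reason": "defers to A and B"}, # C sees two believers and joins them
}

# Apparent consensus: how many people hold X.
apparent_support = sum(1 for p in beliefs.values() if p["holds_X"])

# Actual evidence: how many hold X for independent reasons.
independent_support = sum(
    1 for p in beliefs.values() if p["reason"] == "independent"
)

print(f"People who believe X: {apparent_support}")               # 3
print(f"Independent sources of evidence: {independent_support}")  # 1
```

Running it prints three believers but only one independent source of evidence; that gap is exactly what an epistemic status like “mostly deferring to A here” would make visible.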
I’m probably missing some reasons for using epistemic statuses in Forum posts. I’d welcome more suggestions in the comments!
Note: an epistemic status communicating uncertainty doesn’t mean that all readers will fully process the uncertainty
One caveat to all of the above: an epistemic status doesn’t fully remove the danger of over-deferral, and it might give us false confidence that we’ve addressed that danger.
So if you’re very uncertain about what you’re claiming, or about specific claims you’re making, it’s worth stating that clearly (and repeatedly) in the body of the post.
As readers, we often less-than-critically accept the conclusions of a post (especially if it’s by someone who has some expertise or status), even if the post has a note at the top disclaiming: “epistemic status — these are just rough thoughts.” If misused, epistemic statuses might give people false confidence that they’ve caveated their writing enough. There’s a related discussion here.
(Note also that deferring isn’t always bad. I just think it’s important to know when and why we’re deferring.)
Further reading
Reasoning Transparency
[1] For real though: Epistemic status: this is a post I think could be useful to some people, but I’m not confident I captured the right reasoning for the different things I argue for, and hope the comments will supplement anything I missed or got wrong. A couple of people looked at a draft, and their feedback led to minor edits, but I’m not sure they endorse everything here.
[2] This post is not an attempt to get more people to use epistemic statuses. I think they can be useful, but don’t think they’re always necessary or even always helpful.
[3] Thanks to Lorenzo for discovering this! Some hyperlinks removed.
Thanks for this, I applaud the effort to make forum posts more legible to ~~newcomers~~ everyone. I would go even further and eliminate the jargon completely (or preserve it just to link to this post):
How sure I am of this: I’m typing this at 2:40 am, so not much
OR
How sure I am of this (epistemic status): I’m writing this at 2:40 am, so not much
Thank you for posting this; with respect to the first footnote, I think that even if the post is missing some parts or is slightly miscalibrated, having it on the forum might nonetheless help raise the forum’s epistemic standards.
Some considerations:
I find Epistemic Status notes more useful when the author includes the extent to which they researched or thought about something, which you mentioned under “The effort you put into this post and into making its claims very precise.” It might also be useful to include a statement on roughly what fraction of the sources referenced in the post the author skimmed and what fraction they engaged with deeply.
The point near the end of your post (“we often less-than-critically accept the conclusions of a post, especially if it’s by someone who has some expertise or status”) could probably be elaborated on in its own paragraph. Additionally, briefly addressing what people should take away, or how readers should update their beliefs, in the Epistemic Status note might help with this.
In my view, Epistemic Status falls under the category of “general transparency”. I highly recommend Reasoning Transparency, which you list in Further Reading, and want to quote what I consider a highly valuable and relevant part of its Motivation section:
In my own writing, I try to employ the following “canned transparency” list, which overlaps with both your list of valuable information to include under Epistemic Status and the above list from Reasoning Transparency. Of course, not all of these will be useful to include for every post; more speculative posts might benefit most from a simple Epistemic Status—the list below is geared more towards reports, reviews, and essays.
Why does this essay exist?
1-2 sentences about why there is a need for this writing or why the topic deserves attention
Who is this post for?
1 sentence explaining which people / communities the author thinks would benefit most from reading this (find it interesting, insightful, or useful for their research)
How good is this essay?
1-3 sentences about the time spent writing, the robustness of the claims in, and the evidence incorporated into this essay, including any shortcuts taken
This essay’s claims?
1-2 sentences on the types of claims made in the essay
What is my confidence?
1-2 sentences on the confidence you have in the claims you’ve made
Can you trust me?
1-2 sentences on your qualifications, your track record, and your expertise
What are my priors?
1 sentence on what you believed about the topic before writing this post / essay / review
My updated beliefs?
1 sentence on how you updated your beliefs
My sense of where you should update?
1 sentence on how you believe others should or might update their beliefs
My contribution?
1 sentence on what, if any, contribution you believe you made in the space you are writing in
This list is more of a blueprint for a standard transparency block at the beginning of substantive posts or reviews, so I am open to edits and other people’s thoughts on its comprehensiveness and utility.