We hope people in the EA community can benefit from some of our advice and programmes, and we welcome their engagement with and feedback on our ideas. But overall, we are not focused on career advice for members of the effective altruism community in particular.
This seems like it could mean different things:
“The 80k advice is meant to be great for a broad audience, which includes, among others, EAs. If we’d focus specifically on EAs it would be even better, but EAs are part of our target audience just like anyone else”, or
“The 80k advice is targeted at non-EAs. EAs might get some above-zero value from it, or they might give useful comments, and we don’t want to tell EAs not to read 80k, but we know it is often probably bad-fit advice for EAs. For example, we talk a lot about things EAs already know, and we only briefly mention things that EAs should consider at length.”
Or even: “… and we push people towards direction X while most EAs should probably be pushed towards NOT-X. For example, most non-EAs should think about how they could be having more impact, but most EAs should stop worrying about that so much because it’s breaking them and they’re already having a huge impact.”
Could you clarify what you mean?
This feels fairly tricky to me actually—I think between the two options presented I’d go with (1) (except I’m not sure what you mean by “If we’d focus specifically on EAs it would be even better”—I do overall endorse our current choice of not focusing specifically on EAs).
However, some aspects of (2) seem right too. For example, I do think that we talk about a lot of things EAs already know about in much of our content (though not all of it). And I think some of the “here’s why it makes sense to focus on impact”-type content does fall into that category (though I don’t think it’s harmful for EAs to consume that, just not particularly useful).
The way I’d explain it:
Our audience does include EAs. But there are a lot of different sub-audiences within the audience. Some of our content won’t be good for some of those sub-audiences. We also often prioritise the non-EA sub-audiences over the EA sub-audience when thinking about what to write. I’d say that the website currently does this the majority of the time, but sometimes we do the reverse.
We try to produce different content that is aimed primarily at different sub-audiences, but which we hope will still be accessible to the rest of the target audience. So for example, our career guide is mostly aimed at people who aren’t currently EAs, but we want it to be at least somewhat useful for EAs. Conversely, some of our content—like this post on whether or not to take capabilities-enhancing roles if you want to help with AI safety (https://80000hours.org/articles/ai-capabilities/), and to a lesser extent our career reviews—is “further down our funnel” and so might be a better fit for EAs; but we also want it to be accessible to non-EAs and put work into making that the case.
This trickiness is a downside of having a broad target audience that includes different sub-audiences.
I guess if the question is “do I think EAs should ever read any of our content” I’d say yes. If the question is “do I think all of our content is a good fit for EAs” I’d say no. If the question is “do I think any of our content is harmful for EAs to read” I’d say “overall no” though there are some cases of people (EAs and non-EAs) being negatively affected by our content (e.g. finding it demoralising).
Thanks
I was specifically thinking about career guides (and I’m most interested in software, personally).
(I’m embarrassed to say I forgot 80k has lots of other material too, especially since I keep sharing that other material with my friends and referencing it as a trusted source. For example, you’re my go-to source about climate. So totally oops for forgetting all that, and +1 for writing it and having it relevant for me too.)