This feels fairly tricky to me actually—I think between the two options presented I’d go with (1) (except I’m not sure what you mean by “If we’d focus specifically on EAs it would be even better”—I do overall endorse our current choice of not focusing specifically on EAs).
However, some aspects of (2) seem right too. For example, I do think that much of our content covers things EAs already know about (though not all of it). And I think some of the "here's why it makes sense to focus on impact"-type content does fall into that category (though I don't think it's harmful for EAs to consume that, just not particularly useful).
The way I’d explain it:
Our audience does include EAs. But there are a lot of different sub-audiences within the audience. Some of our content won't be good for some of those sub-audiences. We also often prioritise the non-EA sub-audiences over the EA sub-audience when thinking about what to write. I'd say that the website currently does this the majority of the time, but sometimes we do the reverse.
We try to produce different content that is aimed primarily at different sub-audiences, but which we hope will still be accessible to the rest of the target audience. So for example, our career guide is mostly aimed at people who aren't currently EAs, but we still want it to be at least somewhat useful for EAs. Conversely, some of our content—like this post on whether or not to take capabilities-enhancing roles if you want to help with AI safety (https://80000hours.org/articles/ai-capabilities/), and to a lesser extent our career reviews—is "further down our funnel" and so might be a better fit for EAs; but we also want that content to be accessible to non-EAs, and we put work into making that the case.
This trickiness is a downside of having a broad target audience that includes different sub-audiences.
I guess if the question is “do I think EAs should ever read any of our content” I’d say yes. If the question is “do I think all of our content is a good fit for EAs” I’d say no. If the question is “do I think any of our content is harmful for EAs to read” I’d say “overall no” though there are some cases of people (EAs and non-EAs) being negatively affected by our content (e.g. finding it demoralising).
I was specifically thinking about career guides (and I’m most interested in software, personally).
(I'm embarrassed to say I forgot 80k has lots of other material too, especially since I keep sharing that other material with my friends and referencing it as a trusted source. For example, you're my go-to source about climate. So totally oops for forgetting all that, and +1 for writing it and having it relevant for me too)