I’ll ask the obvious awkward question:
Staff numbers are up ~35% this year but the only one of your key metrics that has shown significant movement is “Job Vacancy Clickthroughs”.
What do you think explains this? Delayed impact, impact not caught by metrics, impact not scaling with staff—or something else?
Hey George, thanks for the question!
We haven’t yet completed a full annual review of 2023, and the complete data isn’t in, so I can’t give a thorough answer to your question. The drivers probably differ quite a bit from programme to programme, but here are a few thoughts that seem relevant to me:
On web:
Over the past couple of years, the biggest predictor of change in web engagement time appears to be changes in our marketing spending. In 2022 we substantially increased our marketing spend. In 2023 our marketing spend was not dramatically larger than in 2022. This is reflected in the web engagement time metrics. (We are actively investigating the cost-effectiveness of marginal marketing spending, and are not fundraising for marketing as part of this public fundraising round as it is already being covered by Open Philanthropy.)
We have also put more effort into driving off-site engagement time in 2023, e.g. via our AI video, improvements to our newsletter, etc. This is not included in the engagement time metrics in the chart, but we estimate that in 2023 we grew off-site engagement time notably more than we did on-site engagement time.
On podcast:
The drivers of engagement with the podcast are more mysterious to me, and I have trouble making accurate predictions of its future engagement time. Viewed on a quarterly basis, growth in the podcast appears to be healthy.
On advising:
In 2023 we focused more on growing and systematising headhunting and active outreach, and relatively less on increasing call numbers.
We didn’t make as many calls as we had hoped to, due in part to a manager on the team leaving.
We also put relatively more focus on improving call quality, for example by putting in place feedback systems. This was a focus because we grew the team in 2021 and 2022 and wanted more systems to keep everyone in sync and ensure continued quality.
On job board:
We’ve actually reduced our FTE input into the job board in 2023, but we are still seeing solid quarter-on-quarter growth.
Additional points:
Some of our staff growth came from hires to our internal systems team, which should strengthen our capacity over time but won’t result in direct improvements on these metrics.
We do expect some diminishing returns to staff growth over time. I’ll address this in another comment on this thread.
I’m equally curious about the push to grow the team if you’re not seeing a significant increase in impact, especially given the $2M marketing push this past year.
80K’s 2021–2022 Review mentioned:
(1) “we seem to be hitting diminishing returns in outreach encouraging more people to apply to advising...” (page 7 under current challenges)
and again
(2) “overall, we’d guess that 80,000 Hours continued to see diminishing returns to its impact per staff member per year.” (on page 10 under impact evaluation)
What is the strategy/argument for “expand the team” being the best intervention for increasing organizational outreach and subsequent impact? Is it really just a capacity issue or could it be a scope issue?
Thanks for the question. To be clear, we do think growing the team will significantly increase our impact in expectation.
We do see diminishing returns on several areas of investment, but having diminishing returns is consistent with significantly increasing impact.
Not all of our impact is captured in these metrics. For example, if we were to hire to increase the quality of our written advice even while maintaining the same number of website engagement hours, we’d expect our impact to increase (though this is of course hard to measure).
In our view, investments in 80k’s growth are still well above the cost-effectiveness bar for similar types of organisations and interventions in the problem areas we work on.
I think this comment would be more persuasive if it shared some evidence or reasoning as to why its claims are likely true.
Hey John, unfortunately a lot of the data we use to assess our impact contains people’s personal details or comes from others’ analyses that we’re not able to share. As such, it is hard for me to give a sense of how many times more cost-effective we think our marginal spending is compared with the community funding bar.
But the original post includes various details about assessments of our impact, including the plan changes we’ve tracked, placements made, the EA survey, and the Open Philanthropy survey. We will be working on our annual review in spring 2024 and may have more details to share about the impact of our programmes then.
If you are interested in reading about our perspective on our historical cost-effectiveness from our 2019 annual review, you can do so here.