Submission: “I facilitated and led Computer Science Education Week activities for 25 five-to-seven-year-olds at an elementary school in Santa Monica, CA. The event lasted for approximately 90 minutes and included roughly an hour of preparation time in advance.”
Our very crude evaluation:
This was pretty tough to compare to direct EA work, and it was a small project, but we did an admittedly ridiculous evaluation just to see how it turned out.
We tried to make the comparison by asking how many dollars of EA donations would be required to achieve a comparably good outcome (according to our values). To do this we considered a “scaled up” version of the activity that reached essentially all US youth from a single year and was repeated often enough to have a substantial effect on their education and view of computer science; we imagined this as about 2 million copies of the event (roughly 10 such sessions per student). We then compared that impact to the effect of targeted funding in the areas of science with the most leverage and altruistic impact. We guesstimated that the scaled-up impact would be about 1/100,000 the impact of an EA grant the size of the annual US budget for R&D and science (and then some), which is something like $1T. (The 100,000 came from multiplying estimates for the impact of marginal STEM education on research quality, enthusiasm, etc.; the relative importance of research quality vs. funding; and the extra bang-for-your-buck from targeting the best areas and spending the money effectively. We chose the annual US budget because the scaled-up intervention reached one year of US students. The estimate for the impact of marginal STEM education takes into account the fact that the intervention is just 10 sessions.)
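The arithmetic behind the estimate above can be checked with a quick sketch; all of the figures are the rough guesses from this post, not measured quantities:

```python
# Back-of-the-envelope check of the estimate above.
# Every input here is a guess taken from the text, not data.

us_science_budget = 1e12      # ~$1T annual US R&D/science spending
impact_ratio = 1 / 100_000    # scaled-up program vs. a $1T EA grant
scale_factor = 2_000_000      # scaled-up program vs. the actual 90-minute event

scaled_value = us_science_budget * impact_ratio   # value of the scaled-up version
event_value = scaled_value / scale_factor         # value of the actual event

print(f"${event_value:.2f}")  # → $5.00
```

Dividing by the 2-million scale factor is what brings the $10M scaled-up figure back down to the single 25-student session being evaluated.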
This all suggests a value of something like $5 of stimulated EA donations, which we wouldn’t take too seriously :)
(In case it’s not clear, we don’t endorse this procedure for prioritizing very different causes.)