My impression is that few people are researching new interventions in general, whether in climate change or other areas (I could name many promising ideas in global development that haven’t been written up by anyone with a strong connection to EA).
I can’t speak for people who individually choose to work on topics like AI, animal welfare, or nuclear policy, and what their impressions of marginal impact may be, but it seems like EA is just… small, without enough research-hours available to devote to everything worth exploring.
(Especially considering the specialization that often occurs before research topics are chosen; someone who discovers EA in the first year of their machine-learning PhD, after they’ve earned an undergrad CS degree, has a strong reason to research AI risk rather than other topics.)
Perhaps we should be doing more to reach out to talented researchers in fields more closely related to climate change, or students who might someday become those researchers? (As is often the case, “EAs should do more X” means something like “these specific people and organizations should do more X and less Y”, unless we grow the pool of available people/organizations.)
An example of what I had in mind was focusing more on climate change when running events like Raemon’s Question Answering hackathons. My intuition says that it would be much easier to turn up insights like the OP than insights of “equal importance to EA” (however that’s defined) in e.g. technical AI safety.