I think this is a potentially very large problem with a lot of EA work. For example, with AI research it seems like we're still at the far left end of the graph: no one is actually building anything resembling dangerous AI yet, so all research papers do is inform more research papers.

We should also really stop talking about neglectedness as a variable independent of importance and tractability. IMO it's more like a function of the other two, something you can use as a heuristic when estimating them. If a truly enormous problem has had a merely huge amount of resources put into it (e.g. climate change), it might still turn out to be relatively neglected.
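To sketch what I mean (my framing, using the rough 80,000 Hours-style decomposition, not anything from the post): the three factors only matter through their product, so whether a given stock of existing resources R makes a problem "crowded" depends entirely on how big importance times tractability is.

\[
\frac{\text{good done}}{\text{extra dollar}}
=
\underbrace{\frac{\text{good done}}{\%\text{ of problem solved}}}_{\text{importance}}
\times
\underbrace{\frac{\%\text{ of problem solved}}{\%\text{ increase in resources}}}_{\text{tractability}}
\times
\underbrace{\frac{\%\text{ increase in resources}}{\text{extra dollar}}}_{\text{neglectedness}\,\approx\,1/R}
\]

On this reading, "relatively neglected" just means R is small compared to what importance times tractability would justify, which is why something like climate change can absorb huge absolute resources and still be under-resourced at the margin.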