Among the EA community, the question "what are the most important, most effective causes?" is a crucial topic. There are even lists of the most important ones, so people can focus their efforts on them.
I assume that to build these lists, a lot of causes have been studied and compared. In the end, only the most neglected, important, and tractable ones were chosen, and the others were discarded.
In my opinion, knowing which causes are the most effective is a must, but knowing which causes have been discarded is also important. Here are a couple of reasons why:
If someone decides to spend their time searching for new promising areas, it would be a waste of that time to explore areas that have already been discarded.
In my opinion, for someone discovering EA (such as myself), some of the most important problems are counterintuitive. Explanations of why these problems matter therefore help build a better understanding of the EA way of thinking. I believe that understanding which problems are considered ineffective to work on, and why, would be an equally interesting window into that way of thinking.
So here is my question: Are there lists of causes (that seemed promising but are) known to be ineffective?
This seems to me like a good question/a good idea.
Some quick thoughts:
I can’t think of such a list (at least, off the top of my head).
There was a very related comment thread on a recent post from 80,000 Hours. I’d recommend checking that out. (It doesn’t provide the sort of list you’re after, but touches on some reasons for and against making such a list.)
I've now also posted a link to this question in that thread, to tie these conversations together.
I'd suggest avoiding saying "known to be ineffective" (or "known to be low-priority", or whatever). I think we'd at best create a list of causes we have reason to be fairly confident are probably low-priority. More likely, we'd just have a list of causes we have some confidence, but not much, are low-priority, because once they started to seem low-priority we (understandably) stopped looking into them.
To compress that into something more catchy, we could maybe say “a list of causes that were looked into, but that seem to be low-priority”. Or even just “a list of causes that seem to be low-priority”.
This sort of list could be generated not only for causes, but also for interventions, charities, and/or career paths.
E.g., I imagine looking through some of the “shallow reviews” from GiveWell and Charity Entrepreneurship could help one create lists of charities and interventions that were de-prioritised for specific reasons, and that thus may not be worth looking into in future.
In an old post, Michael Dickens writes:
I think this is a good example of something that seemed like a plausible idea for making the world better, but which turned out to seem pretty ineffective.