Neglectedness is not enough
Neglectedness is a heuristic for finding promising problems to work on. The heuristic, however, only applies when there is a cause for neglectedness that is independent of relevance and tractability. Furthermore, if you decide to work on a problem because it is neglected for a particular reason, that reason should inform the kind of work that needs to be done. Finally, the emphasis on neglectedness might be misleading and not conducive to doing the most good.
The consensus in the community is that the most pressing neglected problems are: existential risk (especially from artificial intelligence), global poverty, animal welfare, building effective altruism, and cause prioritization.
Our current understanding of the world suggests that a few things are consistently neglected: people in distant parts of the world, people in the distant future, non-human moral agents, preventing risks that are uncertain and of small probability, and making bets with a very small probability of paying off. All of these seem to require cognitive skills beyond what evolution prepared us for: the first three require an expansion of empathy, while the last two require very careful reasoning.
A problem may also be neglected because it is irrelevant or intractable. Without diving into the problem, it is hard to tell why it is neglected. It is possible, however, to come up with heuristics to inform this judgement, just as neglectedness was introduced as a heuristic for finding tractable problems.
I now turn to less robust arguments that nonetheless follow from this line of reasoning.
Empathy vs Reason
The difference between these two causes of neglectedness is fundamental. Although the literature on cognitive biases and imperfectly rational behaviour is well established, it should not be forgotten that the basic incentives to be rational are still there. That is, there are solid reasons to believe that neglectedness stemming from a lack of rationality is less stable than neglectedness stemming from a lack of empathy. We have seen this with climate change and artificial intelligence: as the threat becomes more real and progress is made on making it salient, society devotes more resources to it.
While at some margin this may mean that, by the neglectedness heuristic, effective altruists should not work on these problems, at another margin it may mean that these are the most pressing problems to work on. This is because the possibility of flipping a problem from neglected to not neglected makes it immensely more tractable. The key takeaway is to think about the kind of work that needs to be done. While a problem that is more fundamentally neglected may always need effective altruists working on it directly, superficially neglected problems may only require an effective altruist kickstart to push them into the mainstream. This kickstart may look like direct work on the problem, but it seems reasonable that advocacy and monetary contributions would have relatively higher impact.
Relevant, paradigmatic and neglected
Some of the problems targeted by effective altruists do not appear to have a good reason to be neglected. These include international cooperation and conflict, global priorities research, institutional decision making, and forecasting.
Scientific progress on these problems would be promptly rewarded in the disciplines of International Relations, Political Science, Economics, and so on. Putting that scientific knowledge to work would be aligned with the basic self-interest of the parties involved.
The conclusion seems to be that these problems are intractable. The best argument for working on a highly relevant but intractable problem is that its expected value might still be large, even as risk-averse and irrational agents continue to ignore it.
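To see the shape of this argument, here is a minimal expected-value sketch. The numbers are purely hypothetical and chosen only for illustration: p is the chance that focused work helps solve the problem, and V is the value of a solution in arbitrary units.

\[
\mathbb{E}[\text{value}] = p \cdot V = 0.001 \times 10^{6} = 1000
\quad\text{vs.}\quad
p' \cdot V' = 0.5 \times 100 = 50
\]

Even with a very low chance of success, the intractable but highly relevant problem (left) can dominate a tractable but less relevant one (right).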
It’s important to ground ourselves in our epistemic position here. The original idea of looking for neglected problems was to identify problems that are tractable. Deciding, then, to take up a problem that is intractable should at least be justified on grounds stronger than neglectedness.
The EA community seems far less interested in other intractable problems, like climate change or nuclear fusion, because they are not neglected. But shouldn’t the large amount of attention given to an intractable problem signal that it is the kind of intractable problem with high expected value? Their intractability should provoke the same cognitive aversion as that of more neglected problems.
Maybe, then, it is neglectedness itself that serves as a fundamental cause of neglectedness. Currently, it is feasible and socially acceptable to spend your working life failing to solve climate change by working on a particularly intractable margin of the problem. It is much harder to do so for a problem that is not widely seen as important. An effective altruist who chooses to tackle such a problem is a moral entrepreneur of the same kind as one who exercises a wider reach of empathy, but in this case their impact is leveraged on their ability to ignore social incentives.
While this argument may hold, there are two caveats. First, it must actually be true: for your comparative advantage to be best expressed as a willingness to ignore social incentives, you should really be a person who ignores them, not a person who is convincing themselves that they should ignore them. A person of the latter kind is better off working on a tractable problem. Second, if the situation is that the problem is socially undervalued, wouldn’t the greatest margin for impact be changing that, perhaps through advocacy and monetary contributions? Again, the key point is that the neglectedness argument fails here: it may be the case that this is the best problem to work on, but not just because it is neglected.
Final thoughts
Neglectedness is not everything. Another important component of impact is personal fit. I believe it is important for the EA community to have at the ready a set of problems that would fit a diverse range of people interested in doing the most good. Personal fit, although already an important concept in the community, is perhaps undervalued.
I expect that a smaller emphasis on neglectedness, together with a greater emphasis on more common-sense notions of impact (like personal fit), would increase the effectiveness of the EA community. It would also make the community more readily accessible and closer to the mainstream. In doing so, it may significantly contribute to growing effective altruism, an important problem in itself.
It would make the community’s ideas less unique and perhaps less exciting. It may weaken community ties. It may also make the community less cultish. We must keep in mind the distinction between doing the most good and doing what you want to do while flavouring it with rationalist arguments.
I’m afraid the emphasis on neglectedness that prevails in the community has less to do with positive impact and more to do with the social psychology of groups. The EA movement seems built to remain minoritarian. Aversion to the group from the outside, while surprising to some in the community, is built from within. Neglectedness is at the centre of this. It grounds the movement, at its core, in non-mainstream thinking.