Concerns about EA
I am not fully sold on the focus on creating as much value as possible through Expected Value calculations. I understand the concept, but as someone who works with data, I am more drawn to causes that have already been proven to deliver high value. Fringe ideas are therefore not among my immediate concerns; I keep feeling there are more pressing issues, visible and measurable, that we should focus on right now. The same reasoning lies behind my reluctance to advocate for longtermism: we already have plenty of important causes in the present that I would rather we focus on.
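For concreteness, here is a minimal sketch (with entirely made-up numbers) of the dynamic I am uneasy about: an expected value comparison can favour a speculative cause over a well-evidenced one on paper.

```python
# Minimal expected-value sketch. All numbers are illustrative, not real estimates.
# EV = probability of success * value if successful.

proven_cause = {"p_success": 0.95, "value": 1_000}        # well-evidenced, measurable
speculative_cause = {"p_success": 0.01, "value": 500_000}  # fringe, hard to verify

def expected_value(cause):
    """Expected value = probability of success times value if it succeeds."""
    return cause["p_success"] * cause["value"]

print(expected_value(proven_cause))       # 950.0
print(expected_value(speculative_cause))  # 5000.0 -- the speculative cause "wins" on paper
```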
Another area I am a bit skeptical about is the expansion of the empathy circle. For most of my life I have not had much empathy towards non-human animals. I would not want them tortured, but I would not say they deserve as much empathy as human beings. Again, I feel our circle of empathy still has a long way to go where humans are concerned, so expanding it to other sentient beings (and stretching the idea as far as algorithms) is not a cause that holds much importance for me.
Finally, one of my biggest concerns is the urgency placed on climate change in the community. When talking about the most important causes, I feel climate change is under-ranked as a massive danger to the future of our world. I find it misleading to place speculative causes such as AI risks, catastrophic pandemics, nuclear war, and great power conflicts ahead of climate change in importance, because climate change is something we are already experiencing and it is only bound to get worse. In my opinion, climate change is the top risk: unlike some of the other risks, it promises to destabilise society within the next few decades (the warming targets we have set are on course to be missed as soon as 2030). The further we slide into worse climate scenarios, the more social issues will arise, and these could very plausibly give rise to the other risks: AI, nuclear, or biological conflict triggered by the race for resources such as water, higher land away from the rising oceans, productive soil, and more. I therefore believe cause prioritisation should be reconsidered to account for these second-order effects of climate change.
You might want to review the idea of neglectedness when assessing impact. The point isn't necessarily that climate change is less important than other causes, just that there are already a ton of resources going into work on climate change, so adding more resources there will have less marginal impact.
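As a rough illustration of that point (the model and all numbers below are made up, purely to show the shape of the argument), the intuition is that the impact of one extra unit of funding shrinks as a cause area gets more crowded:

```python
import math

def total_impact(importance, funding):
    """Toy model: total impact grows like importance * log(funding)."""
    return importance * math.log(funding)

def marginal_impact(importance, funding):
    """Impact of one extra unit of funding under the toy model (~ importance / funding)."""
    return total_impact(importance, funding + 1) - total_impact(importance, funding)

# Made-up numbers: a crowded cause vs. a neglected one.
crowded = marginal_impact(importance=100, funding=1_000_000)
neglected = marginal_impact(importance=50, funding=10_000)

print(crowded)    # ~0.0001 -- very important, but each extra unit adds little
print(neglected)  # ~0.005  -- less important overall, yet each extra unit goes further
```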
This article addresses neglectedness among other things.
https://forum.effectivealtruism.org/s/x3KXkiAQ6NH8WLbkW/p/ER4gAtS5LAx2T3Y98