Very speculative and anecdotal:
I think I personally find myself emotionally tugged away from longtermism a little by these events. When there's so much destruction happening "right before my eyes" and in a short enough future that it can really emotionally resonate, it's like on some level my brain/emotions are telling me "How could you be worried about AI risk or a future bioengineered pandemic at a time like this! There are people dying right now. This already is a catastrophe!"
And it's slightly hard to feed into my emotions the fact that a very different scale of catastrophe, and a much more permanent type, could still possibly happen at some point. (Again, I'm not dismissing that the current pandemic really is a catastrophe, and I do believe it makes sense to reallocate substantial effort to it right now.)
On the other hand, this pandemic also seems to validate various things longtermists have been saying for a while, such as about how civilization is perhaps more fragile than people imagine, how we need to improve the speed at which we can develop vaccines, etc. And it provides an emotionally powerful reminder of just how bad and real a catastrophe can be, which might make it easier for people to feel how bad it is that we could have a catastrophe that's even worse, and that in fact destroys civilization as a whole.
I think I'd tentatively guess that this pandemic will make the general public slightly more "longtermist" in their values in general. I'd also guess that it'll make the general public substantially more in favour, for present-focused reasons, of things that also happen to be good from a longtermist perspective (e.g., increased spending on future pandemic preparedness in general).
But I'm not sure how it'll affect people who are already quite longtermist. From my sample size of 1 (myself), it seems it won't really change behaviours, but will slightly reduce the emotional resonance of longtermism right now (as opposed to just a general focus on GCRs).