That sounds right to me. (And Will, your drawbridge metaphor is wonderful.)
My impression is that there already is some grumbling about EA being too elitist/out-of-touch/non-diverse/arrogant/navel-gazing/etc., and discussions in the community about what can be done to fix that perception. Add to that Toby Ord’s realization (in his well-marketed book) that hey, perhaps climate change is a bigger x-risk (if indirectly) than he had previously thought, and I think we have fertile ground for posts like this one. EA’s attitude has already shifted once (away from earning-to-give); perhaps the next shift is an embrace of issues that are already in the public consciousness, if only to attract more diversity into the broader community.
I’ve had smart and very morally conscious friends laugh off the entirety of EA as “the paperclip people”, and others refer to Peter Singer as “that animal guy”. And I think that’s really sad, because they could be very valuable members of the community if we had been more careful to avoid alienating them. Many STEM-type EAs think of PR considerations as distractions from the real issues, but that attitude might mean leaving huge amounts of low-hanging utility fruit unpicked.
Explicitly putting present-welfare and longtermism on equal footing seems like a good first step to me.