I think EA is surely still pluralistic (“a question”), and I wouldn’t be at all surprised if longtermism gets de-emphasized or modified. (I’m uncertain, since I don’t live in a hub city and can’t attend EAG, but as EA expands, new people could gain influence even if EAs in today’s hub cities are getting a little rigid.)
In my fantasy, EAs realize that they’ve missed half of longtermism by focusing entirely on catastrophic risk while ignoring the universe of path dependencies. Consider the humble QWERTY keyboard: impossible to change, right? (Well, I’m not typing on a QWERTY keyboard, but I digress.) What if you had the chance to sell keyboards in 1910? There would still have been time to change which layout became dominant. Or what if you could have propped up the Esperanto movement in its heyday around that time? That’s the universe of interventions EAs didn’t notice. The world isn’t calcified in every way yet; if we’re quick, we can still make a difference in some areas. (Btw, before I discovered EA, that was my angle on the software industry, and I still think it’s important and vastly underfunded, since capitalism is misaligned with longtermism.)
In my second fantasy, EAs realize that many of the evils in the world are a byproduct of poor epistemics, so they work on things that either improve society’s epistemics or (more simply) work around the problem.