Some quick thoughts:
The suits who hear forecasts that AGI (or other such things) is powerful and doom-inducing might just hear that it's POWERFUL and doom-inducing, whereas the message we really want to get across (to the extent we want to get messages across at all) is that it's powerful and DOOM-INDUCING.
Altruistic actors may be more inclined to steer the world towards some plausible conceptions of utopia. In contrast, even if we avert doom, less altruistic actors might still be inclined, on the whole, to preserve existing hierarchies and the like, which could leave us many orders of magnitude short of optimality.
Also happy to chat further in person.