I mostly agree with you, but wanted to give my opinion on a few points:
This kind of Bostromian calculus is basically Pascal’s mugging. We have such huge uncertainty about whether we’ll spread to space, how many people there will be, when it will happen, and what our personal part in it will be, that trying to reason about it is meaningless.
Spending lots of money on EAs doesn’t attract people who value effectiveness. It attracts people who like having money spent on them.
But that doesn’t mean it’s negative to spend money like this, and I don’t necessarily think it is. We need a sustainable core in order to keep solving problems over the long term. I’m just not sure where the line lies.
Edit: after reading the article you linked to, I feel obliged to say that I remember interacting with the author on the forum, and I think there may be other factors contributing to his experience.