I’m still a little confused as to whether these certificates are intended to confer social status. If not, why should I value universes in which I own certificates more highly than universes in which I don’t?
Should I just look at the big picture and decide it’s beneficial to self-modify so as to give ownership of certificates intrinsic value in my utility function?
One possible use for certificates other than bragging rights is A/B testing—pick two EAs with similar skills and resources but different strategies, and see who ends up with more certificates.
You can think of it as a way of doing accounting for causal responsibility if you want. But yes, the argument is aiming for “if we did this, the outcome would be good,” and I’m leaving it up to your decision theory to justify doing things that lead to good outcomes.