Keeping the world around probably does that, so you should donate to longtermist charities (especially because they potentially increase the number of people ever born, thus giving more people a chance of getting into heaven).
I often get the sense that people into fanaticism think that it doesn’t much change what they actually should support. That seems implausible to me. Maybe you should support longtermist causes. (You probably have to contort yourself to justify giving any money to shrimp welfare.) But I would think the longtermist causes you should support will also be fairly different from ‘mainstream’ causes, and look rather weird close up. You don’t really care if the species colonizes the stars and the future is full of happy people living great lives. If some sort of stable totalitarian hellscape offers a marginally better (but still vanishingly small) chance of producing infinite value, that is where you should put your money.
Maybe the best expected value would be to tile the universe with computers trying to figure out the best way to produce infinite value under every conceivable metaphysical scheme consistent with what we know, and run them all until the heat death of the universe before trying to act. Given that most people are almost certainly not going to do that, you might think that we shouldn't be looking to build an aligned AI; we should want to build a fanatical AI.
Has your fanaticism changed your mind much about what is worth supporting?
It’s made me a bit more longtermist. I think that one of the more plausible scenarios for infinite value is that God exists and that actions in which we help each other out infinitely strengthen our eternal relationships, and that view will generally lead to doing conventionally good things. I also think that you should have some uncertainty about ethics, so you should want the AI to do reflection.