In particular, one that affects one’s whole life focus in a massive way.
I’m actually not very confident that dropping the first four claims would affect my actual behaviours very much. (Though it definitely could, and I guess we should be wary of suspicious convergence.)
To be clearer on this, I’ve now edited the post to say:
Perhaps most significantly, as noted in the spreadsheet, it seems plausible that my behaviours would stay pretty similar if I lost all credence in the first four claims.
Here’s what I say in the spreadsheet I’d do if I lost all my credence in the 2nd claim:
Maybe get back into video games, stand-up comedy, and music? But it feels hard to say, partly because currently I think spending lots of time on EA-aligned things and little time on video games etc. is best for my own happiness, since otherwise I’d have a nagging sense that I should be contributing to things that matter. But maybe that sense would go away if I lost my belief that there are substantial moral reasons? Or maybe I’d want to push that updated belief aside and keep role-playing as if morality mattered a lot.
And here’s what I say I’d do if I lost all my credence in the 4th claim:
If I lost belief in this claim, but thought there was a non-negligible chance we could learn about moral truths (maybe by creating a superintelligence, exploring distant galaxies, “breaking out of the simulation”, or whatever), I might try to direct all efforts and resources towards learning the moral truth, or towards setting ourselves up to learn it (and then act on it) in future.
This might look pretty similar to reducing existential risk and ensuring a long reflection can happen. (Though it also might not. And I haven’t spent much time on cause prioritisation from the perspective of someone who doesn’t act on those first four claims, so maybe my first thoughts here are mistaken in some basic way.)