In particular, one that affects one's whole life focus in a massive way.
I'm actually not very confident that dropping the first four claims would affect my actual behaviours very much. (Though it definitely could, and I guess we should be wary of suspicious convergence.)
To be clearer on this, I've now edited the post to say:
Perhaps most significantly, as noted in the spreadsheet, it seems plausible that my behaviours would stay pretty similar if I lost all credence in the first four claims.
Here's what I say in the spreadsheet I'd do if I lost all my credence in the 2nd claim:
Maybe get back into video games, stand-up comedy, and music? But it feels hard to say, partly because currently I think spending lots of time on EA-aligned things and little time on video games etc. is best for my own happiness, since otherwise I'd have a nagging sense that I should be contributing to things that matter. But maybe that sense would go away if I lost my belief that there are substantial moral reasons? Or maybe I'd want to push that updated belief aside and keep role-playing as if morality mattered a lot.
And here's what I say I'd do if I lost all my credence in the 4th claim:
If I lost belief in this claim, but thought there was a non-negligible chance we could learn about moral truths (maybe by creating a superintelligence, exploring distant galaxies, "breaking out of the simulation", or whatever), I might try to direct all efforts and resources towards learning the moral truth, or towards setting ourselves up to learn it (and then act on it) in future.
This might look pretty similar to reducing existential risk and ensuring a long reflection can happen. (Though it also might not. And I haven't spent much time on cause prioritisation from the perspective of someone who doesn't act on those first four claims, so maybe my first thoughts here are mistaken in some basic way.)