My philosophical axioms relevant to EA are largely utilitarian, so long as that doesn't interfere with truthfulness. To be clear, though, I am not a moral realist!
My interests are:
- forecasting
- animal welfare
- politics (unfortunately)
- intelligence research
In the row for "Nuclear conflict scales beyond Ukraine in the next month after the initial nuclear weapon use," the probability is given as 0.36%. I think that is a typo and should read 0.386%.