My philosophical axioms relevant to EA are largely utilitarian, so long as that doesn't conflict with truthfulness. To be clear, though, I am not a moral realist!
My interests are:
- forecasting
- animal welfare
- politics (unfortunately)
- intelligence research
As far as I can tell, this is his "apology" from back then.