I do independent research on EA topics. I write about whatever seems important, tractable, and interesting (to me).
I have a website: https://mdickens.me/. Much of my website's content gets cross-posted to the EA Forum, but I also write about some non-EA topics on the website itself.
I used to work as a software developer at Affirm.
Superforecasters tend to believe x-risk isn’t a big deal. Regardless of whether they’re using reasonable procedures, they’re getting the wrong object-level answer in this case. FRI’s consulting plausibly made the scaling policies worse, but it’s hard to say without knowing more details.
(I’m thinking in particular of the XPT, which is from 2023, so it may be outdated at this point. But that tournament had superforecasters predicting only a 1% chance of AI extinction, which is ludicrous and should not be used as the basis for decisions.)