I do independent research on EA topics. I write about whatever seems important, tractable, and interesting (to me).
I have a website: https://mdickens.me/. Much of the content on my website gets cross-posted to the EA Forum, but I also write about some non-EA stuff like [investing](https://mdickens.me/category/finance/) and [fitness](https://mdickens.me/category/fitness/).
My favorite things that I’ve written: https://mdickens.me/favorite-posts/
I used to work as a software developer at Affirm.
Not OP, but I would say that if we end up with an ASI that can misunderstand values in that way, it will almost certainly wipe out humanity anyway.
That is the same category of mistake as interpreting “please maximize the profit of this paperclip factory” as “convert all available matter into paperclip machines”.