Futurologist, public speaker, consultant https://danilamedvedev.com
NeyroKod architect http://neyrokod.info (knowledge management, exocortex, complex thinking)
NanoLab founder http://nanolabvr.com (VR chemistry education and nanomachine design)
KrioRus founder/director https://kriorus.com
Other interests: intelligence augmentation https://augmentek.online, innovation systems, R&D policy, aging, life extension, transhumanism
I am disappointed, though, by the thrust of the author's argument: nanotech may be important, therefore longtermist EAs should not work on it, should not talk about it, and should only study it in secret, getting paid through some EA foundation to just sit and “strategize” about its risks. Improving the lives of billions of people with APM/nanotech is not valuable, saving billions of lives is not valuable, increasing man’s power over matter is not valuable, preventing civilizational collapse due to resource depletion/climate change is not valuable.
I am starting to think that longtermism may indeed be a cognitive cancer consuming parts of EA and transhumanism. Let’s hope I am not put on some kill list by well-meaning longtermists for this comment...