I don’t think Eliezer Yudkowsky and the rationalists should be throwing stones here. Sam Altman himself claimed that “eliezer has IMO done more to accelerate AGI than anyone else”. They’ve spent decades trying to convince people of the miraculous powers of AI, and now are acting shocked that this motivated people to try and build it.
Well, they’re not claiming the moral high ground; they can consistently say that EA has been net negative, and been net negative themselves for human survival.
Yeah, IIRC EY does consider himself to have been net-negative overall so far, hence the whole “death with dignity” spiral. But I don’t think one can claim his role has been more negative than OPP/GV’s decision to bankroll OpenAI and Anthropic (at least setting aside the indirect consequences of his having influenced the development of EA in the first place).