Not really. I would guess 600, under a definition like “is currently working seriously on at least one alignment project”. (And I’m not counting the indirect work which I previously obsessed over.)
With “100-200” I really had FTEs in mind, rather than the looser “working seriously on at least one alignment project” threshold (and maybe I should edit the post to reflect this). What do you think the FTE number is?
Great, thanks—I appreciate it. I’d love a systematic study akin to the one Seb Farquhar did years back.
https://forum.effectivealtruism.org/posts/Q83ayse5S8CksbT7K/changes-in-funding-in-the-ai-safety-field
I wouldn’t want my dumb guess to stand with any authority. But: 350?