I think you actually shifted me slightly toward the ‘announcement was handled well’ side (even if not fully), with the point about their blunt honesty (their work was mainly AI-focused for the last year or so anyway) plus the very clear descriptions of what's changing.
I am a bit wary of a resource as prominent as 80k endorsing a sudden cause shift without first filling the gap it leaves. I know they don’t owe that to anyone, especially during such a tumultuous time for AI risk, and there are other orgs (Probably Good, etc.), but to me 80k seemed like a very good intro to ‘EA Cause Areas’, and I can’t think of a current substitute. The problem profiles no longer being featured/promoted is fine for people already aware of their existence, but when I first navigated to 80k, I saw the big list of problem profiles, and that’s how I actually started getting into them; it's also what led to my shift from clinical medicine to a career in biosec/pandemics.