Hmm the short answer is that the job markets aren’t necessarily efficient, so if it seems too good to be true for you, it might just be a really good option for you!
The longer answer is that the set of tradeoffs that are common in EA work may well sound appealing to you, but it’s not necessarily that appealing to other people. Some quick things that might make EA work less appealing for many people (especially when compared to academia):
The set of possible actions is vast; the subset of optimal actions is tiny.
Most of my EA-adjacent friends in academia do work that they think of as extremely interesting. In contrast, EA work necessarily (at least in theory) filters heavily on impact, and it’s unlikely that the same research questions will be both extremely interesting and extremely impactful.
So from an academic perspective, giving up intellectual freedom to do impactful work often feels like a huge sacrifice.
On the flip side, if you have the type of psychology that naturally finds (e.g.) corrigibility in AI alignment or timelines for alternative proteins maximally interesting, then this may not look like a sacrifice to you at all!
More realistically, most of us reorient ourselves to make impact itself seem interesting.
It’s harder to get external prestige for doing impactful EA work (though maybe this is changing)
EA work just doesn’t have a system of citations, promotions, etc. that’s as externally legible as some other career tracks like academia or the corporate world.
There’s a lot of responsibility in EA work, and this can be stressful or emotionally hard to deal with.