Nice, I hope you train this bundle well! Was it linked somewhere in the In-Depth Program?
Here are 4 hypotheses for what could be going on:
1. Any generally smart person, in an EA context, will develop a bunch of these traits, to their advantage.
2. There's some factor that's not general intelligence, but also not specific to EA, that causes these traits to be more correlated than expected; it would apply just as well to someone who went into another field like biotech.
3. There's some factor that causes these traits to be more correlated in the EA context in particular, such that noticing it lets you find people who are unusually good fits for doing EA-style work.
4. That factor is trainable.
Hypotheses 1 & 2 are kinda interesting, but you don't need a post on the EA Forum to tell you about intelligence and something like "competence for intellectual work."
Hypothesis 3 starts to get interesting, because then you can take someone's ability both to speak fluently about fish welfare and to use probabilities as a sign that they will be able to understand and improve your organization's strategy, and even that they'll be unusually cooperative.[1]
If that last clause sounds dangerous to you, I don't disagree. I think over-reliance on the cooperativeness of other people sharing these traits has caused more than one problem. I nevertheless think this is one of the things that makes EA extremely powerful as an idea.
Hypothesis 4 is where you might start getting really hyped about projects like an In-Depth Fellowship or an EA Forum. 🙂
That's what correlation means: learning that two facts are correlated lets you update your Bayesian prior about one of them when you observe the other.
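To make that updating step concrete, here's a minimal sketch with made-up numbers (the prior, likelihood, and false-positive rate are purely illustrative, not drawn from any real data): observing one trait shifts the probability of a correlated trait via Bayes' rule.

```python
def bayes_update(prior, likelihood, false_positive_rate):
    """Return P(B | A) given P(B), P(A | B), and P(A | not-B)."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical prior that a random applicant is a strong strategic thinker.
prior = 0.10

# If the traits are correlated, fluency with probabilities (trait A) is
# more common among strong strategists (trait B) than among everyone else.
posterior = bayes_update(prior, likelihood=0.8, false_positive_rate=0.2)
print(round(posterior, 3))  # → 0.308, well above the 0.10 prior
```

If the traits were uncorrelated (likelihood equal to the false-positive rate), the posterior would stay at the prior, which is exactly why an observed correlation is what licenses the update.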