Even if we grant the conclusions of this post as a premise, you and Gleb haven’t necessarily reached the point where you disagree with one another yet. Supposing that the primary purpose of the EA movement is to foster the very most effective EAs possible:
it could be that the very most effective EAs will not turn out to be the hardcore EAs, and most of the hardcore EAs are just spinning their wheels
it could be that Gleb’s post is a valuable indicator that the “EA talent pipeline” is leaky and we’re doing a bad job of inspiring softcore EAs to become hardcore EAs (perhaps we could e.g. emphasize self-sacrifice less and emphasize specialization of labor more, with everyone getting a distinct purpose & area of focus—what do successful groups do?)
if one assumes that moral behavior has its evolutionary origin in sending a signal to others about how virtuous a person you are, it’s possible that any given population of “softcore” EAs can only support a population of “hardcore” EAs of a certain size before signaling opportunities are used up (the 1% of Clearview High School’s students who are best at math identify as “the school math nerds”—if you take them away and put them in a magnet school, the top 1% of the remaining students will take up the “school math nerd” identity—you can see this phenomenon at work when a bright student who cruises through high school goes to an elite university and realizes they aren’t that bright by elite university standards and they have to reform their identity)
Overall, the idea that improving the experience of “softcore” EAs will significantly trade off against efforts to foster “hardcore” EAs hasn’t yet been supported. (Personally, the best argument I can think of would probably be something like: the “softcore” EAs will be more superficial in their judgements of which cause areas to pursue and will overwhelm the more carefully-thought-through judgement of the “hardcore” EAs—not sure how much to worry about this.)
It’s also worth noting that some of the world’s “most effective people” (e.g. Angela Merkel, or any successful politician or activist really) got that way by becoming popular with the masses, to the point where it’s not exactly obvious whether the masses elected Angela Merkel as their representative or Angela Merkel inspired the masses to elect her. See https://en.wikipedia.org/wiki/Great_Man_theory#Criticism