You’re bringing up a lot of questions that are core to the EA movement and have been debated in many different places. The links from CEA’s strategy page might interest you; they go into CEA’s models of how to build communities, and where “impact” comes from.
In general, there’s no simple answer to how much a person’s personal values matter for their potential impact. To give a simplistic example, value alignment with EA seems more important for a moral philosopher (whose work is all about their values) than for a biologist (if someone decides to work on anti-aging research because they want to win a Nobel Prize and think Aubrey de Grey has a cool beard, they may still do excellent, world-shaping work despite non-EA motives).
You may want to check your intuition that older generations are more value-driven against the data: older people tend to be more religious, but younger people tend to give “better” answers on many important moral questions (look up “the expanding moral circle” for more on this idea). Meanwhile, the extent to which people make sacrifices to act on their values seems to fluctuate from generation to generation: political protests go from popular to unpopular and back again, people worry less about pollution but more about eating meat, etc.
Thanks to modern communication systems and growing moral cosmopolitanism throughout the world, this is probably the best time in history to promote something like EA, and conditions are getting better every year.