I think I’m less worried about the risk of increased deception.
You won’t have the time for long 10-hour conversations when you hang out in the evening.
The analogy breaks down somewhat because the number of 10-hour conversations also scales with the size of the movement, right? And I think it’s relatively discernible whether somebody actually cares about doing good when you talk to them a lot. I don’t think you need to be a particularly senior EA to notice altruistic and impact-driven intentions.
If we could actually bring any measurable fraction of humanity’s ingenuity and energy to bear on preventing humanity’s extinction and steering us towards a flourishing future, most of those people would of course be more motivated by their own self-interest than by their altruistic motivation.
Additionally, I’m less worried because I think most people actually care about doing good and doing things efficiently. EA will still select for people who are less motivated to work in industry, where I expect wages to remain higher for somebody capable enough to scheme up a great grant proposal.