Appreciate the post. I agree that EA could probably be bigger, and this post updated me toward being more likely to act on that. It also got me thinking about possibilities like 'missing generations' of EAs (produced by swings in community-building effort) making the movement less likely to carry forward into the future.
To steelman the case against your bolded hypothetical conclusion, here's a counterexample I thought of: I have the impression that even the 1% of people who understood the principles would have their impact somehow diminished (imagine an EAG where only 1 in 100 attendees is highly valuable; it seems harder to trust and find people there). Or the majority of EAs might create spillover effects that make things worse for the better ones. One could look for proxy or adjacent groups where we do expect 1% to be genuinely impactful.
Thanks for posting :)
Thanks :3
This is my first post, and I'm so happy to see that you appreciate it. I'll try to address the "dilution" concern in more depth in later posts.