Yeah, I think you’re missing the flow-through effects of contributing to the EA hivemind. There’s substantial value in having a large number of reasonably well-informed EAs thinking about and discussing EA ideas. It’s possible this is insignificant compared to the contributions of a handful of prominent full-time thinkers, but it seems like an important and nontrivial question in its own right.
+1 to this. I’d say I’ve spent anywhere from 70 to 150 hours consuming EA content, whether through events, books, podcasts, or videos. That’s mainly because I’m a community builder and I’m earning to give, so I wanted to learn a lot about the different causes and concepts within EA.
However, even for people who aren’t community builders, I’d still recommend learning more about EA.
Diminishing returns probably set in past 50 hours of learning, but if you’re someone who wants to help convince other people to become EAs, or who wants to better understand and explain what EA is, you’ll need about 30-50 hours of immersion.
I think that continually reading the EA Forum and understanding EA concepts in more depth helps me convey various EA-related perspectives and concepts to my network.
For example, I can point people to resources on climate change, AI Safety, global health, and cause prioritization because of reading widely about EA and its causes. And I think more people should be aiming to do that.
Having a wide understanding of EA allows you to spread that knowledge more easily, and I believe there’s a lot of value in spreading EA knowledge.
Agreed with this. Some positive things that can come from it:
cross-pollination of ideas, especially across cause areas and academic disciplines
relatedly, preventing ideas from becoming too niche or insular, and keeping them accessible from different viewpoints within the community
encouraging people to explore ideas they find off-putting in more depth, which might ultimately change their actions in the long term. For example, many people come to longtermism after many years in the community.
reinforcing norms of critically engaging with ideas, learning, keeping an open mind, and so on.