Just gonna weigh in on some of these from my time researching this stuff at Nonlinear.
A common knowledge spreadsheet of directly responsible individuals for important projects.
Strongly agree. It’s logistically easy to do: one person could cover 80% of EA projects within a week. I’ve been using the AI Existential Safety Map (aisafety.world) a lot in my list of follow-ups for 1-on-1s.
In the long run, a well-maintained wiki similar to/synced with the EA Opportunities Board (which I also heavily recommend) could make this really comprehensive.
More “public good”-type resources on the state of different talent pipelines and important metrics (e.g., interest in EA).
I read every EA survey I see. They’re often quite interesting and useful. I wouldn’t say they’re neglected, since EAs do seem to love surveys, but they’re usually a net positive.
More coherent and transparent communication about the funding situation/bar and priorities.
I am of the opinion that every EA funder should be as transparent and detailed as possible about their funding bar and criteria. Unlike for-profits/VCs, I don’t see a strong reason for secrecy other than infohazards. Transparency helps applicants understand what funders look for, which benefits both funders and applicants. Applicant misconceptions about “what funders want” can hinder EA a lot in the long run due to mismatched incentives: I see a lot of compelling project directions censored or discarded in the early stages simply because applicants think they should be more generic (since being more generic works well in conventional success pathways).
More risk management capacity for EA broadly as a field and not just individual orgs.
Advanced 80K: Career advice targeted at highly committed and talented individuals.
Agree, but I never figured out how to scalably execute this. Usually, if someone has the skill set and motivation to do really well in EA, my priority is to (1) point them to helpful resources they can work through themselves and (2) try to link them with someone doing what they’re trying to do.
The problem is that it seems hard to predict in advance who they’d consider a valuable connection. I think none of my most valuable connections in EA so far would’ve been referred to me by someone else.
Tractable idea: A list of helpful links sent to EAGx and EAG attendees post-conference.
A survey to identify why high-value potential members “bounce off” EA.
I actually bounced off EA for three years myself (2019-2022). For me, the big reason was that I couldn’t find any follow-up steps to pursue (especially coming from Singapore). My experience within EA was very inspiring and exciting interactions followed by not much follow-up (guidance, next steps, opportunities to pursue, encouragement to start projects, etc.).
[just gonna agree with all the AI Safety points, they’ve all come up before in my discussions]
Casual observation that I can’t recall a single EA social media account that I browse simply because it’s fascinating and not because I wanna support EA on social media.
And I’m into weird stuff, too. I just binged hour-long videos on Soviet semiconductors and the history of Chang’an.
Incubators: One respondent stated that incubators are “super hard and over-done,” mentioning that they are too meta and often started by people without entrepreneurial experience.
I think it’s just hard to do well because there are so many points of failure, it takes a long time for any results to show, and it requires both social skills and technical expertise. That said, I do think a longtermist version of Charity Entrepreneurship seems promising to pilot (actually, I’m gonna bring this up to Kat Woods right now).
Also on the funding side, I really liked this post: Cash and FX management for EA organizations — EA Forum (effectivealtruism.org) by @JueYan.
On the bounce-off survey: shoutout to @Mo Putera, who is working on this.
On incubators: agree, this point has been discussed in detail before. See What we learned from a year incubating longtermist entrepreneurship — EA Forum (effectivealtruism.org).
I really like Manifund as a platform!
Thanks for adding these thoughts!