Community Organiser for EA UK
Organiser for EA Finance
I might have missed this but can you say how many people took the survey, and how many people filled out the FTX section?
It might have increased recently, but even in 2015 one survey found that 44% of the American public would consider AI an existential threat. It’s now 55%.
I think EA would improve with more competition as well; what I’m suggesting is more competition in the ‘larger’ orgs category. If someone disagrees with how things are run at one of the bigger EA orgs, there are very few other places within EA for them to go.
I don’t think the issues you linked to are caused by centralisation. There are plenty of badly run small organisations, and without larger organisations there often isn’t a way to hold bad actors accountable.
I’m talking about increasing the number of large organisations. I don’t think I can do much about how many different types of funders we have, which is a separate question.
On the first point, if FTX had happened and there had been more large EA organisations, it would have been easier to handle the fallout, with more places for smaller organisations and individuals to go to for support.
On the last point, that seems to be part of why DARPA had success: they ran lots of projects and focused on the best ones succeeding rather than maintaining failing ideas.
I think every cause can be presented normally or weirdly depending on how you do it. In that example Kelsey happened to be discussing global development, and I think a lot of people in EA assume more people are interested in global development than actually are, because they are just looking outside their bubble into a slightly larger bubble.
I would agree that it’s usually best to introduce people to ideas closer to their interests (in any cause area) before moving on to related ones. Although sometimes people are more interested in the ‘weird’ ideas before getting involved in EA, and EA helps them approach those ideas practically.
I’m not sure the metaphor holds up.
I imagine there are many more people interested in AI Safety, Biosecurity, Nuclear Risks who would be put off if they had to start by learning about the GWWC pledge.
Kelsey Piper writing about Vox analytics - ‘Global poverty stuff doesn’t do very well. This is something that makes me very sad, and it makes my mother very sad. She reads all my articles, and she’s like, “The global poverty stuff is the best, you should do more of that.” I also would love to do more of that. I think it’s a really important topic, but it doesn’t get nearly as many views or as much attention as both the existential risk stuff and sort of the animal stuff and the weird big ideas sort of content.’
It seems to me that the new Community section is more like what a traditional subforum would look like on a forum.
For the other subforums to have succeeded they probably should have been on topics that are getting lots of posts/comments already.
At the moment I think 80,000 Hours understands that it has one of the largest reaches of any org connected to EA, and so it spends some of its time on areas its staff don’t personally consider priorities.
If they decide to shift, it may just be to be clearer about their longtermism focus and to reduce the time they spend on other causes.
Maybe there is some confusion from thinking that 80,000 Hours represents EA and should cover causes more equally, but 80k have said (maybe not clearly or often enough) that they are focused on a few priority areas rather than trying to be the EA Careers Org.
This isn’t a new change.
There are other organisations attempting to cover other causes, such as Animal Advocacy Careers and Probably Good, but they are less well known.
I know a few people who have gone through EF and have said good things about the program. Also, one of the founders is interested in EA and has written about it on his blog.
No, mainly because if people interested in EA want to become entrepreneurs, Entrepreneur First (as well as other incubators) is already set up to help them do that.
There is a directory on the EA UK website with different identity/affinity groups.
This may be true in some cases, but there are examples of people who would prefer not to work on longtermism-related causes if they were just following their interests.
Also, depending on the career, longtermism-related roles often pay less than the private sector.
This was a link post so it’s probably best to add a comment to the original—https://srajagopalan.substack.com/p/altruism-and-development-its-complicated/comments
I also don’t like the language of ‘persuading’ and ‘convincing’.
See the 3rd point of this post.
3. Avoid naive EA outreach.
Outreach is an offer, not persuasion. It can be tempting to try and persuade as many people about EA and run events that tweak the message of EA in an attempt to appeal to certain people. From our experience, this is generally a dangerous approach as it leads to low-fidelity diluted or garbled messages. Instead, think about outreach efforts as an ‘offer’ of EA where people can get a taste of what it’s about and take it or leave it. It’s OK if someone’s not interested. A useful heuristic James used for testing whether to run an outreach event is to ask “to what extent would the audience member now know whether effective altruism is an idea they would be interested in”. It turned out that many speaker events that Oxford were running didn’t fit this test, and neither did the fundraising campaign.
I’ve generally had the opposite experience. I could spend ages talking to friends and family and not change their opinions, whereas just half an hour with someone who is already interested in EA can help accelerate their progress and get them connected to other people/projects/events.
I think for people interested in EA, they should mention it to their social network and see if there is any interest. But for community builders, there is much more value from finding people who are already interested in EA and are ready to get more involved.
Also, once you’ve spoken to your social network, you can’t keep doing that month after month.
I’ve tried to maintain a list here but it can get out of date.
Do you have examples/links?
I think the tricky part is finding where smaller donors can donate, similar to GiveWell.
Those organisations have suggestions for large sums of money, but there is a gap in advice for individuals who want to give to global development and are okay with the case not resting solely on RCT evidence.
That doesn’t seem to match with EA being a front-cover story last year, shown in a positive light.