Unfortunately I’m already deep in another project which I think is also very high value. Might look into this later on.
Wil Perkins
You really need to provide proof for these sweeping allegations. I know people are worried with the current situation and I agree fraud is likely, but I’m concerned that someone making such extreme claims with 0 links or evidence besides claiming to be an insider is so highly upvoted.
If you make an extreme claim, the burden of providing sources is on you.
I think it’s a great idea. I’m sure there are strong selection effects when it comes to who decides to go to EAG, and that if we hosted it for instance in Mexico or India, there would be a large number of qualified folks that decided to come.
Not only does it help make sure we are sourcing individuals with high potential impact from countries outside the U.S. and U.K., it would also be a great PR move for EA, and CEA specifically. I think it would show a strong commitment from EA as a whole that we care about actual ability and potential good, rather than selecting for people with high levels of wealth, academic credentials, or raw technical ability.
My personal, anecdotal take is that the above signals have been vastly overrepresented in EA to date, which may have been helpful to grow the movement. However, as we get bigger and look to sway mainstream opinion, we need to start becoming more appealing to those on the lower end of the income spectrum, people with more soft skills, and older people whose experience is desperately needed.
I agree that the name needs to be changed. I’m surprised there was so little consistent pushback after Scott Alexander’s article.
Especially now that travel/food funding is going to be curtailed, it is clear that EA Global is primarily designed to bring together elite, credentialed, or wealthy people in the movement who have connections and will use the event for networking. This is not necessarily a bad thing, as often those types of people can have outsized impact. However I’m a firm believer that to maximize our effectiveness, Effective Altruism as a whole needs to start making itself more attractive to the general populace.
Even though we’ve had a lot of success carving out a niche space in the altruism community, the worldwide market for charity and ‘do-gooding’ makes EA look like a tiny, unimportant blip. Even just in America, a total of $471 billion was given to charity last year. Instead of courting billionaires we need to be courting the folks who already donate, just ineffectively.
Since the initial posts are deleted, could you clarify whether these apologies came before or after MacAskill publicly posted a rebuttal he wrote based on the confidential draft?
I’d also say take it on. Someone objective can always rewrite it later, but if we don’t save it now we could lose a lot.
You should definitely prioritize it! What about creating an open source wiki of sorts to crowdsource information?
You could always double check / get citations later on.
I don’t understand the reasoning behind this ban. The poster strikes me as trying to be civil, on topic, and honest per the top three forum norms.
In fact the whole post is basically arguing we should be more honest it seems.
Could you explain why the moderation team felt the need to ban the user here?
Edit: To give some more context, this is a post which to me seems to be trying to provoke a negative reaction far more directly, and I would consider warning-worthy: https://forum.effectivealtruism.org/posts/f2bR6HgABuych2XAF/now-that-we-trust-senior-eas-less
Edit 2: To clarify, I assumed that the top-level post was being banned because I couldn’t see the comment.
Thanks for clarifying! Yeah I thought that the post was banned, based on your comment and the flag at the top of the post when I followed the forum norms link.
I’ve thought about it but I’m convinced my current startup is a more impactful area. Think it would be a great opportunity for someone without a lot of career capital.
I’d also like to point out this post as related to the topic of speaking your mind etc: https://forum.effectivealtruism.org/posts/qtGjAJrmBRNiJGKFQ/the-writing-style-here-is-bad
To my mind the artificial academic tone here makes people feel the stakes are much higher than they should be for an online discussion forum. Also, people who have English as a second language likely have much more insight into the blunders EA is making, especially when it comes to messaging our ideas to the public.
By selecting for people who talk in strict academic tones and with an overwrought style, I’d imagine we lose a lot of legitimate opinions that could help us course correct.
In addition to the other comment, I think he’s also indirectly pointing to the demographic trends (i.e. fertility rates) of social conservatives. Social conservatives have more kids, so they inherit the future. If EA is anti-natalist and socially liberal, we will lose out in the long run.
He fails to bring up the tension with short AI timelines which I think is important here. Lots of AI safety folks I’ve talked to argue that long term concerns about movement building, fertility trends, etc aren’t important because of AGI happening soon.
I think this tension underlies a lot of discussions in the community.
Smuggled assumptions in “declining epistemic quality”
I don’t endorse that view myself, but yeah, I’m pointing out that I think Tyler believes it.
I absolutely agree! To put it more plainly I intuit that this distinction is a core cause of tension in the EA community, and is the single most important discussion to have with regards to how EA plans to grow our impact over time.
I’ve come down on the side of social capital not because I believe the public is always right, or that we should put every topic to a sort of ‘wisdom of the crowds’ referendum. I actually think that a core strength of EA, and rationalism in general, is the refusal to accept popular consensus at face value.
Over time it seems from my perspective that EA has leaned too far in the direction of supporting outlandish and difficult to explain cause areas, without giving any thought to convincing the public of these arguments. AI Safety is a great example here. Regardless of your AI timelines or priors on how likely AGI is to come about, it seems like a mistake to me that so much AI Safety research and discussion is gated. Most of the things EA talks about with regard to the field would absolutely freak out the general public—I know this from running a local community organization.
In the end if we want to grow and become an effective movement, we have to at least optimize for attracting workers in tech, academia, etc. If many of our core arguments cease to be compelling to these groups, we should take a look at our messaging and try to keep the core of the idea while tweaking how it’s communicated.
I agree with most of what you’ve written here, and I actually think this is a much better framing on why open discussion of disagreement between the new guard and the old guard is important.
If we let this problem fester, a bunch of people who are newer to the movement will get turned away. If we can instead increase the number of talented and influential people who join EA while getting better at convincing others of our ideas, that’s where most of the impact lies to me.
A related topic is the youth and academic focus of EA. If we truly want to convince the decision makers in society then we need to practice appealing to people outside an academic setting.
I appreciate the feedback!
In an ideal situation I would definitely try to outline the uses I take issue with, and provide arguments from both sides. At the same time, this is my first top-level post, and I’ve held back on posting something similar multiple times due to the high standard of rigor here.
I suppose I decided that when it comes to community building especially, intuitions, moods and gut feelings are something EA should be aware of and respond to, even if they can’t always be explained rationally. My plan is to develop more on this idea in subsequent posts.
I think that many EA community members have a view of the EA community that makes it seem much more important than many current decision makers do.
Interesting, I actually feel that I have the alternative view. In my mind people who are decision makers in EA severely overestimate the true impact of the movement, and by extension their own impact, which makes them more comfortable with keeping EA small and insular. Happy to expand here if you’re curious.
Growth comes with a lot of costs. I think that recent EA failures have highlighted issues that come from trying to grow really quickly.
Would you mind throwing in a couple of examples? To my mind, the whole SBF/FTX fiasco was a result of EA’s focus on elite people who presented as having ‘high quality epistemics.’
Many people outside the rat sphere in my life think the whole FTX debacle, for instance, is ridiculous because they don’t find SBF convincing at all. SBF managed to convince so many people in the movement of his importance because of his ability to expound on and rationalize his opinions on many different topics very quickly. This type of communication doesn’t get you very far with normal, run-of-the-mill folks.
I’ve thought about this! Actually have a couple retirees in my local EA group looking to help.
Unfortunately I asked around at EAG DC and the general response seemed to be “great idea, but I don’t know anyone focused on that.”
I got a lot of general referrals to the standard career orgs, but I think an org focusing on retirees could be highly impactful.