I'm mostly just depressed about AI progress being so rapid and the 'safety gameboard' being in such a bad state. I'm angry at the people who contributed to this terrible situation (which includes a lot of longtermist orgs).
sapphire
My honest reaction was: This is finally being taken sort of seriously. If an EVF board member acted badly then the community can’t just pretend the Time article is about people totally peripheral to the community. At least we got some kind of accountability beyond “the same team that has failed to take sufficient action in the past is looking into things.”
It honestly does feel like the dialogue is finally moving in a good direction. I already knew powerful people in EA acted very badly. So it’s honestly a relief it seems like we might get real change.
A comment I made a few days ago said "But usually very little changes until someone goes public (at least anonymously). Nothing else remotely reliably creates the momentum to get bad actors out of power." That really aged quite well.
As always I would advise survivors who want change to be as public as possible. Anonymous public statements work fine. Of course prioritize your own safety. But private internal processes are not a vehicle for change. Owen would, as predicted, still be on the board if not for the Time article.
I think that's the public image, but it isn't how things actually work internally. I'd really recommend reading this comment by Buck about how "You've also made the (IMO broadly correct) point that a lot of EA organizations are led and influenced by a pretty tightly knit group of people who consider themselves allies". Notably the post is pretty explicit that any proposed changes should be geared toward getting this small group onboard.
It is less public (at this point), but some of the core EAs have definitely been capricious in terms of who they want to receive any kind of support or endorsement. And they feel quite willing to do this without any community buy-in.
The fact that this is true, despite issues being reported to the community health team, is a serious indictment.
Honesty, never mind radical openness, is usually impossible if one party is dependent on the other. This is honestly one reason I hate how intensely hierarchical the EA community is. Hierarchy destroys openness.
I agree that private processes are often better for survivors (though they can be worse). But usually very little changes until someone goes public (at least anonymously). Nothing else remotely reliably creates the momentum to get bad actors out of power. If the people in power weren't at least complicit, we wouldn't have these endemic problems. Notably this has already played out multiple times in rationalist and EA spaces. Brent was extremely egregious, but until public callouts nothing was seriously done about him. In fact community leaders like Eliezer praised his 'apologies'. Sadly this reality can put a burden on survivors. There isn't really a great approach as far as I can tell.
CEA pays a team, but their main allegiance is to the existing power structure, so of course they can't solve the root issues.
Thanks for all the work you’ve done. It’s not easy.
Working with official orgs to handle sexual abuse cases almost never goes well. For obvious reasons victims want to avoid backlash. And many victims understandably don't want to ruin the lives of people they still care about. I truly wish private processes and call-ins worked better. But the only thing that creates change is public pressure. I would always endorse being as public as you can without compromising victim privacy or pressuring victims to be more open about what happened. It is just a very unfortunate situation.
I basically agree, but following this advice would require lowering one's own status (relative to the counterfactual). So it's not surprising people don't follow the advice.
I’m extremely opposed to the culture of silence in EA/rat spaces. It is very extreme.
I will just push back on the idea, in a top-level post, that EAG admissions are not a judgment on people as EAs. CEA is very concerned with the most promising/influential EAs having useful conversations. If you are one of the people they consider especially promising or influential, you will get invited. Otherwise, they might let you in if EAG seems especially useful for shaping your career. But they will also be worried that you are lowering the quality of the conversations. Here are some quotes from Eli, the lead on EA Global at CEA:
EAG is primarily a networking event, as one-to-one conversations are consistently reported to be the most valuable experiences for attendees. I think there's less value in very new folks having such conversations — a lot of the time they're better off learning more about EA and EA cause areas first (similar to how I should probably learn how ML works before I go to an ML conference).

Very involved and engaged EAs might be less eager to come to EAG if the event is not particularly selective. (This is a thing we sometimes get complaints about but it's hard for people to voice this opinion publicly, because it can sound elitist). These are precisely the kinds of people we most need to come — they are the most in-demand people that attendees want to talk to (because they can offer mentorship, job opportunities, etc.).
I don’t think this is really what your post is about, but I wanted to clarify: EAG exists to make the world a better place, rather than serve the EA community or make EAs happy. This unfortunately sometimes means EAs will be sad due to decisions we’ve made — though if this results in the world being a worse place overall, then we’ve clearly made a mistake.
Scott Alexander: Is the concern that the unpromising people will force promising people into boring conversations and take up too much of their time? That they'll disrupt talks?

Eli Nathan: Hi Scott — it's hard to talk about these things publicly, but yes a big concern of opening up the conference is that attendees' time won't end up spent on the most valuable conversations they could be having.
It's also admitted that it is 'hard to discuss this publicly'. It's against EA style, but to me the posts about how this isn't a judgment are bordering on gaslighting. Even CEA's public comms state they have a 'bar for admission'.
I’m sorry you didn’t get invited.
You can ask Buck for his list of the core EAs.
FWIW, core EAs have openly said that a major reason to keep EAG small is the 'quality of conversation' at the event. This is a big reason they made EAG smaller again. So there is definitely a level of judgment going on.
I think it is pretty important that, by its own internal logic, longtermism has had negative impact. The AI safety community probably accelerated AI progress. OpenAI is still pretty connected to the EA community and has been starting arms races (at least until recently the 80K jobs board listed jobs at OpenAI). This is well known but true. Longtermism has also been connected to all sorts of scandals.
As far as I can tell, neartermist EA has been reasonably successful. So it's kind of concerning that institutional EA is dominated by longtermists. It would be nice to have institutions run by people who genuinely prioritize neartermism.
EVF is an umbrella org that manages community events, community health, public comms (like the EA Handbook and curriculum), the press interface, etc., largely through CEA. It handles these tasks for both longtermism and the rest of EA. This is suboptimal IMO. The solution here is not just a financial split.
I doubt she agrees with the accusations but I assume she knows they exist.
It would be extremely surprising if she didn't know about the sexual abuse allegations. They are very well known in her social circle. Despite this she has chosen to defend the fellow.
PM'd you
Less theoretical example: FWIW I'm not sold on 'more than anyone', but the top 2-3 current AI labs are all downstream of AI safety!