Very comprehensive report, Peter, thank you!
Vara Raturi
I love this, thank you so much for writing it up. I'm actually a bit mad at myself for not writing about this first, but so glad you did it so well! I've been thinking about VAWG and how little attention EA gives it. When I've raised the topic, I've often been shunned or politely asked to focus on bigger problems by many EAs, and I hadn't yet found the courage to sit down and write out everything I feel about this, along with the research that would articulate my point. I'd be very happy to play a big part in exploring this cause area within EA, if you already have next steps in mind.
Given that these community events help most with learning and career connections, how do you feel about opening them up to people who work on EA causes (AI safety, for instance) but are not well versed in the EA landscape? Familiarity with EA ideas, like longtermism, is a major part of the applications for EAGs and EAGx events. I think this gatekeeps opportunities from other talented, smart, and effective people who are working on EA causes without being affiliated with EA ideas. For instance, consider someone doing AI safety research who has never taken an EA course, attended an EA event, or read any of the books that define EA ideas, but who is still interested in finding job opportunities that would maximize their impact within their field. Perhaps I'm wrong in thinking that being EA-informed is given more weight in the selection process, but it certainly comes across that way!