There should be alternatives to EAGs/EAGxs—ones that are cause-area specific and/or aimed at people interested in EA ideas who don’t necessarily want to call themselves EAs.
Responding to the attention on Kathy’s specific case (I’m aware I’m adding more to it): I think we’re detracting from the key argument, which is that the EA community as a whole is neglecting to validate and support community members who experience bad things in the community.
In this post, it’s primarily women and sexual assault. But there are other posts (1, 2) exemplifying ways the EA community itself can and should prioritise internal community health. Arguing over the truth of one specific example might detract from recognising that this could be a systemic problem.
That’s a true point—but I don’t think it’s a good objective. EA should strive to include the best people, those highly aligned with doing good, and I think we need a culture that prioritises people’s lived experiences, feelings, and interactions for that to happen.
I’m strongly in favour of this—it often feels like what’s needed is to make this public so it becomes something the entire community is responsible for, as opposed to how it currently is (private, and something CEA’s community health team is mainly responsible for).
Your comment (at least how it reads, which may differ from your intentions) comes across as “that’s a particularly problematic location, just go to a different one”.
That doesn’t solve the problem. That doesn’t hold the Bay* or any community accountable or push for change in a positive direction. I think that sort of logic is a common response to what Maya writes about, and it doesn’t help or make anything better.
*and this is coming from an ex-Berkeley community builder
(Saying this as a friend, and in shock that I haven’t introduced you two already.) You two should definitely talk!
I love it! Also, having a permanent tag for drafts and unfinished thoughts would be nice too?
Thanks for sharing!
Is anyone working on AI and biotech regulation specifically, or have good readings on the state of the field / existing regulation?
Yes, definitely! Here’s a poorly formatted list of relevant papers (sorry, it doesn’t have links); there’s also this reading list, this one, and this forum post.
Ooh, a fun history lesson! Thanks! I’ll edit it :)
Thank you!!
Ooh, sorry! I meant to add more and not be so vague, but then I forgot I’d published this and didn’t edit it. I’ll hopefully update it over the next few days.
I’d recommend taking an Insights test if you can. I don’t usually believe in all the personality test hype, but we did Insights as a team at Atlas and it was super helpful (and like 95% accurate!) about each person’s working styles, communication profiles, etc. I used it to make my “work with me” doc.
I have a long-ish list of research paper ideas on biosecurity and ethics. It’d be really helpful if there were people interested in doing some of this research or taking on projects.
Edit: on hold, but thanks everyone :)
I’d support it! I’ll pass it along to the team
Hi welcome :)
On career advice—have you talked to 80k or GCP? I’m also happy to talk 1-1 if I can be of any help.
On your study—I’d love to hear more about what it’s on!
Welcome! Let me know if there’s anything specific I can help with.
Thanks for this! I’m organising EAGxBerkeley and we have a post-EAGx plan with some of the same stuff, but I’ll try to incorporate more of your document. The main bottleneck on my end (as an events organiser) is finding people who can organise retreats and serious post-events.
What role do you think bioethics and bioethicists have in biosecurity and AI regulation? I’ve been thinking a lot about how represented bioethics should be in the two fields, and whether advocating for their increased involvement would help reduce x-risks. Thoughts?