To share some anecdotal data: I personally have had positive experiences doing regular coaching calls with Kat this year and feel that her input has been very helpful.
I would encourage us all to put off updating until we also get the second side of the story—that generally seems like good practice to me whenever it is possible.
Thanks for the post!
A related question: Is LTFF more likely to fund a small AI safety research group than to fund individual independent AI Safety researchers?
So could we see a scenario where persons A, B, and C, applying individually for independent research grants, might not meet your funding bar, but where the same similarly impressive people with a similarly good research agenda, applying together as a research group, would be a more attractive funding opportunity for you?
Thanks for publishing this! I added it to this list of impactful org/project ideas.
Hi Rime, I’m not aware of any designated online space for independent alignment researchers either. Peer support networks are a central part of the plan for Catalyze so hopefully we’ll be able to help you out with that soon! I just created a channel on the AI Alignment slack called ‘independent-research’ for now (as Roman suggested).
As for the fiscal sponsorship, it should not place any constraints on the independence of the research. The benefits would be that fundraising can be easier, you can get administrative support, tax-exempt status, and increased credibility because you are affiliated with an organization (which probably sounds better than being independent, especially outside of EA circles).
I currently don’t see risks there that would restrict independent researchers’ independence.
Fair point, I understand what you meant now. I think these would be great resources to potentially connect the independent researchers we would incubate with as well.
The current plan is to run a pilot starting in July.
Great point! They are currently compiling their results on what people have been doing post-MATS; I’m also curious what the results are.
I understand it may look quite similar to different initiatives because I am only giving a very broad description in this post. Let me clarify a few things which will highlight differences with the other orgs/projects you mention:
-Catalyze’s focus is on the post-SERI MATS part of the pipeline (so targeting people who have already done a lot of upskilling—e.g. already done AI Safety Camp/SERI MATS)
-The current plan is not to fund the researchers but to support already funded researchers (‘hiring’ them is just another way of saying their funding would not be paid out to them directly but would first go through an org with tax-deductibility benefits, e.g. a 501(c)(3), and then go to them), so no overlap with LTFF there. One exception to supporting already funded researchers is helping not-yet-funded researchers with the fundraising process.
I don’t really see similarities with Nonlinear apart from us both naming ourselves ‘incubators’. The same goes for ENAIS, apart from them also connecting people together.
In short, I agree these interventions are not new. I think packaging them up together, making a few additions, and thereby making them easily accessible to this specific target group is most of the added value here.
Thanks for sharing! I skimmed through the things you linked but will read them in more detail soon.
I understand the tension you are describing. A question for clarification: I personally am not familiar with your description that “EA is already known for shoving women into community building/operations roles”. Where does this sense come from?
And I think that’s another tangible proposal you’re making here which I’d like to draw attention to and make explicit, to see what others think: creating quotas for how many spots have to go to women at conferences, organizations, fellowships, etc.
Hi Larks, thank you for taking the time to articulate your concerns! I will respond to a few below:
Concern 1: passing off evidential burden
• I agree it would have been preferable if we had made a solid case in this post for why gender diversity is important.
-> To explain this choice: we did not feel like we could do this topic justice in the limited time we had available for this so decided to prioritize sharing the information in this post instead. Another reason for focusing on the content of the post above is that we had a somewhat rare opportunity to get this many people’s input on the topic all at once—which I would say gave us some comparative advantage for writing about this rather than writing about why/whether gender diversity is important.
• As you specifically mention that you think “relying on posts that received a lot of justified criticism” is a bad idea, do you have suggestions for different posts that you found better?
Concern 2: “Some of your proposals, like adopting “the patriarchy” as a cause area, or rejecting impartiality in favour of an “ethics of care”, are major and controversial changes”
• Something I’d like to point out here: these are not our proposals. As we mention in the post, ‘The views we describe in this post don’t necessarily correspond with our (Veerle Bakker’s & Alexandra Bos’) own but rather we are describing others’ input.’ For more details on this process, I’d recommend taking a look at the Methodology & Limitations section if you haven’t already.
-> Overall, I think the reasons you mention for not taking on the proposals under ‘Adjusting attributes of EA thought’ are very fair and I probably agree with you on them.
• A second point regarding your concern: I think you are conflating the underlying reasons participants suspected are behind the gender gap with the solutions they propose.
Saying ‘X might be the cause of problem Y’ is not the same as saying ‘we should do the opposite of X so that problem Y is solved’.
Therefore, I don’t feel that, for instance, your claim that a proposal in this post was to adopt “the patriarchy” as cause area fairly represents the written content. What we wrote is that “One of these topics is how EA does not focus specifically on gender inequality issues in its thinking (e.g. ‘the patriarchy’ is not a problem recommended to work on by the EA community).” This is a description of a concern some of the participants described, not a solution they proposed. The same goes for your interpretation that the proposal is “rejecting impartiality in favour of an ethics of care”.
Thank you for the addition! I added it.
Good point, it also reminded me to add the tags to the main text of this post as another tip for where to look. Thank you!
Awesome, sounds like you have cracked the code :)
This makes me wonder: how did you make your way into 6 TEDx line-ups? Did you reach out to organizers as I described in the post, or did you take a different approach?
Edit: turned this comment into a separate post: https://forum.effectivealtruism.org/posts/TqNAgPpNwu6dCrycN/how-to-get-ea-ideas-onto-the-tedx-stage
Thanks for your post! I organized a TEDx event which took place in April of this year, so I’d like to add my insights into two ways in which more EA TEDx talks can be initiated (A. joining existing TEDx events & B. organizing TEDx events).
A. Get yourself (or someone suitable) into a TEDx event line-up
(NB: you don’t need to be a student! Nor someone related to the university at which the event is held!)
Step 1: spot event organizers
•Do you already know someone involved in organizing a TEDx event?
•Alternatively: take a look at this map https://www.ted.com/tedx/events and see which events are roughly 2-12 months from now and at a distance you’d travel to (at your own cost, though there may be EA funds available for an expense like this). Then go to the relevant event pages, check who the organizers are at the bottom of the page, and find their contact info (LinkedIn, social media DMs, whatever the internet can find you).
Step 2: contact the spotted event organizer(s)
•Send the organizer(s) a message introducing yourself, asking if they are still looking for speakers for their line-up, and pitching your idea for a TEDx talk. Contact as many as you can for optimal odds :)
(one of the people who ended up being a speaker at our event got himself into the line-up by finding me on LinkedIn and messaging me, and maybe 2-3 people tried in total, so it’s possible! And likely not something so many people do that you wouldn’t stand out by proactively trying.)
B. Organize a TEDx event and invite an EA speaker
•You can organize an event for your university or another type of event such as a ‘studio event’, a TEDx youth event (for schools), a business event (internally for a company), a library event and more. Anyone can take initiative and apply for a license with TED which (when approved) allows you to use the ‘TEDx’ platform in exchange for adhering to their rules.
note: one of these TEDx rules is diversity of topics of the talks at an event, so an event with several speakers on EA ideas might be difficult to get away with (though it may be possible if you approach it strategically).
•Organizing a TEDx event is a large time commitment, but it could be worthwhile if you want to gain skills/career capital while also offering a stage to EA ideas. I personally feel that I learned a lot (!) from organizing this (e.g. leading a team, finances, logistics, project management, etc.), and I think it has quite some CV value as well, as the TEDx name looks good.
•To get a better picture of what organizing a TEDx event looks like, check out the organizer’s guide linked below and feel free to contact me for questions/advice (email@example.com) https://www.ted.com/participate/organize-a-local-tedx-event/tedx-organizer-guide