Hi, I am a Physicist, Effective Altruist and AI safety student/researcher/organiser
Resume—Linda Linsefors—Google Docs
Linda Linsefors
If I calculated correctly, in the fully funded version, stipends would be 76% of the cost. Not quite >80%, but close. I think I agree that stipends are not much more than 20% of the value.
Basically I agree with you that stipends are the least cost-effective part of AISC. This is why stipends are the lowest funding priority.
However, it is possible for stipends to be less necessary than the rest but still worth paying. They are in the budget because, if someone wants to fund them, we would like to hand out stipends.
I think giving stipends to participants from low-income countries is probably cost-effective, but it’s probably better to prioritise runway for future camps than stipends for everyone else. If you know any donors who would like to earmark their donation this way, or any other way, tell them to contact us.
If you read this post and decide that the reasons why AISC is not getting funded are not good reasons for not funding AISC, then you have a donation opportunity!
Unless donors don’t care about optics at all, paying Remmelt’s salary is a difficult ask.
There is an easy fix to this. You can donate anonymously.
Donation link
Perhaps they could add an appendix to their funding proposal where they answer some common objections they would expect people to have
Correctly guessing what misconceptions others will have is hard. But discussions on earlier drafts of this post did inspire us to start drafting something like that. Thanks.
A colleague of mine said that [if you want to attract high-profile research leads], “you are only as strong as your weakest project”—which I thought was well put.
We’re not trying to attract high-profile research leads. We’re trying to start worthwhile projects and collaborations that would otherwise not have happened. If a high-profile researcher wants minions/mentees/collaborators, they don’t need AISC, and I don’t mind if they use some other resource (e.g. SPAR, MATS, posting on LW) to find people.
Most of these suggestions are based on speculation. I’d like a bit more evidence that it would actually make a difference before restructuring. Funders are welcome to reach out to us.
Responding to myself.
There is one thing (mentioned in the post) that we know is getting in the way of funding, which is Remmelt’s image. But there wouldn’t be an AISC without Remmelt. I don’t expect that pretending to be two different programs would help much.
However, donating anonymously is an option. We have had anonymous donations in the past from people who don’t want to entangle their reputation with ours.
Donors want to know that if they donate to keep it alive, you’re going to restructure the program towards something more financially viable
Most of these suggestions are based on speculation. I’d like a bit more evidence that it would actually make a difference before restructuring. Funders are welcome to reach out to us.
The funding situation is currently especially bad. It’s possible that if AISC can just survive a bit longer, things will get better.
AISC has survived each year since the program started in 2017, which means that just doing what we think is the best program has a pretty good track record of getting funded.
I think it would be valuable for AI Safety Camp to refresh its website in order to make it look more professional and polished. The easiest way to accomplish this would be to make it a project in the next round.
No, it wouldn’t. Leading a project is a lot of work, significantly more work than it’s worth putting into our website, and we’d almost certainly end up with something that is significantly more work to maintain. We recently moved from WordPress to Google Sites because it’s the lowest-effort platform to work with.
I just heard that CEA has a policy of not publishing events (in this case an EAGx) too early on their webpage. If true, this is absolutely idiotic. As an organiser I’d like to avoid collisions, and I naively assumed I’d be able to find the dates for upcoming EAGs and EAGxs on the EAG website, but apparently not.
I haven’t verified with CEA that this is in fact their policy and that it isn’t just a misunderstanding. But it is the case that EAGxNordics 2025 is currently announced by EA Sweden (since at least Dec 6), but not yet by CEA.
I think EA should have leadership that at minimum doesn’t block the flow of information. But I don’t know what to do about this. I don’t expect CEA to listen. If they were the type of org that did, things would look different.
Funding Case: AI Safety Camp 11
AI Safety Camp 10
At the time of writing, www.aisafety.camp goes to our new website while aisafety.camp goes to our old website. We’re working on fixing this.
If you want to spread information about AISC, please make sure to link to our new webpage and not the old one.
Invitation to lead a project at AI Safety Camp (Virtual Edition, 2025)
I can’t find the disclaimer. I’m not saying it isn’t there, but it should be obvious from just skimming the page, since that is what most people will do.
I don’t think it’s too ‘woo’/new-age-y. Lots of EAs are meditators. There are literally meditation sessions happening at EAG London this week.
Also, Qualia Research Institute (qri.org) is EA or at least EA adjacent.
(Which orgs are or aren’t EA is pretty vague.)
Also, isn’t enlightenment notoriously hard to reach? I.e. it takes years of intensive meditation. Most humans probably don’t have both the luxury and the discipline to spend that much time. Even if it’s real (I think it is), there is probably lower-hanging fruit to pick.
My guess is that helping someone go from depressed to normal is a bigger step in suffering reduction than going from normal to enlightened. The same goes for lifting someone out of poverty.
However, I have not thought about this a lot.
I know there are also a few people thinking about current human mental health, but I don’t think that group is very large.
Isn’t most of the current suffering in the world animal suffering?
I’d expect most suffering focused EAs to either focus on animals or S-risk prevention.
I agree with this comment.
If EA and ES both existed, I’d expect the main focus areas to be very different (e.g. political change is not a main focus area in EA, but would be in ES), but (if harmful tribalism can be avoided) the movements don’t have to be opposed to each other.
I’m not sure why ES would be against charter cities. Are charter cities bad for unions?
Scandinavia didn’t become wealthy and equitable through marginal charity. Societal transformation comes from uprooting oppressive power structures.
I expect a serious intellectual movement that aims to uplift the world to Scandinavian standards to actually learn about Scandinavian society and what makes it work.
“Real socialism hasn’t been tried either!” the Effective Samaritan quips back. “Every attempt has always been co-opted by ruling elites who used it for their own ends. The closest we’ve gotten is Scandinavia which now has the world’s highest standards of living, even if not entirely socialist it’s gotta count for something!”
I’m guessing that “socialism” here means something like Marxism? Since this is the type of socialism that “has not been really tried” according to some, and also the type of socialism that usually ends up in dictatorship.
Scandinavian socialism did not come from Marxism.
Source: How Denmark invented Social Democracy (youtube.com)
I’m not a historian, and I have not fact-checked the above video in any way. But it fits with other things I’ve heard, and with my own experience of Swedish vs. US attitudes.
AISC9 has ended and there will be an AISC10
I misunderstood the order of events, which does change the story in important ways. The way OpenPhil handled this is not ideal for encouraging other funders, but there were no broken promises.
I apologise and I will try to be more careful in the future.
One reason I was too quick on this is that I am concerned about the dynamics that come with having a single overwhelmingly dominant donor in AI safety (and other EA cause areas), which I don’t think is healthy for the field. But this situation is not OpenPhil’s fault.
Below is the story from someone who was involved. They have asked to stay anonymous; please respect this.
The short version of the story is: (1) we applied to OP for funding, (2) in late 2022/early 2023 we were in active discussions with them, (3) at some point, we received 200k USD via the SFF speculator grants, (4) then OP got back confirming that they would fund us with the amount for the “lower end” budget scenario minus those 200k.
My rough sense is similar to what e.g. Oli describes in the comments. It’s roughly understandable to me that they didn’t want to give the full amount they would have been willing to fund without other funding coming in. At the same time, it continues to feel pretty off to me that they let the SFF speculator grant replace their funding 1:1, without even talking to SFF at all, since this means that OP got to spend a counterfactual 200k on other things they liked, but SFF did not get to spend additional funding on things they consider high priority.
One thing I regret on my end, in retrospect, is not pushing harder on this, including clarifying to OP that the SFF funding we received was partially unearmarked, i.e. it wasn’t restricted to funding only the specific program that OP gave us (earmarked) funding for. But, importantly, I don’t think I made that sufficiently clear to OP, and I can’t claim to know what they would have done if I had pushed for that more confidently.
I’ve asked for more information and will share what I find, as long as I have permission to do so.
What caused the restriction?
I’m noticing I’m confused. I have no hypothesis for what could cause that sort of restriction.