I wrote a bunch of comments over on the LessWrong side going into detail on various aspects of our plans for the future, including spending plans, so probably good to check those out (e.g. this comment).
To give a very rough high-level overview of how I expect this money to be spent:
~$1.7M for interest payments and Rose Garden Inn upkeep, which includes running an office space, hosting a lot of events, running a lot of events ourselves, and providing space for visiting scholars to sleep and work
~$1.5M for core salaries. We are likely to reshuffle our internal team structure a bit, but right now around 50% of those salaries go to the LessWrong team and the other half goes to the campus team. I expect we will focus more on online things in the next year, so more of that will likely go into things like developing LessWrong and the AI Alignment Forum.
$0–$4M in FTX clawbacks. We sure got a bunch of money from FTX, and at some point the creditors will probably come asking for it. I currently think we morally owe back around $1.5M, though there is some messy game theory here that makes me a bit more confused about that number, and legally it’s very unclear how much we owe back.
~$500k on some projects in the reference class of printing LessWrong books, running a big AI Alignment conference, paying research prizes, and funding whistleblowers and investigations into potential bad actors in AI Alignment/EA/Rationality/X-Risk. I don’t have super specific plans here; it’s just on the order of what we’ve spent in the past on some set of projects like this.
I currently think we morally owe back around $1.5M
This might be morally accurate, but perhaps you shouldn’t be saying it online in case it emboldens people to sue you looking to claw back lots of money related to this case? I’m wondering if there’s a group of attorneys analogous to ambulance-chasers but for the FTX scandal.
Yeah, I thought about my policy here for a few hours and talked to some of the relevant stakeholders about the tradeoffs here (like our fiscal sponsor CFAR) and made a conscious decision to talk about this specific thing.
I currently think the incentives don’t really change that much, and I am willing to not pay that money back if someone is trying to do something extortive, though yeah, even the game theory of talking about this stuff feels kind of messy and makes me sad.
I appreciate Lightcone’s willingness to mention its potential legal and ethical clawback exposure to prospective donors. (At the same time, I recognize there could potentially be valid reasons for other organizations not to do so in certain circumstances.)
Would you like the domain aisafety.global for this? It’s one of the ones I collected on ea.domains which I’m hoping someone will make use of one day.
Possibly! I’ll reach out if our plans for this ever materialize; I was mostly just trying to give an example of a thing we’ve considered.