Views expressed are my own.
lukasb
Regarding AI lab coordination, it seems like the governance teams of major labs are a lot better placed to help with this, since they will have an easier time getting buy-in from their own lab as well as being listened to by other labs. Also, the Frontier Model Forum seems to be aiming at exactly this.
Thanks for writing this! If I don’t want to sign up for the Finders Course, are there any resources you would recommend for doing the one-hour loving-kindness sessions?
Agreed. My view here is pretty uninformed.
If taking your lawyer’s advice, in this case, means being silent for 5-7 years, it seems like some people should speak openly and bear the costs.
Why are so many people disagreeing with this?
berglund’s Quick takes
I’d be interested to hear what you think is going wrong with Paul’s writing style, if you want to share.
Hm, yeah, I guess my intuition is the opposite. To me, one of the central parts of effective altruism is that it’s impartial, meaning we shouldn’t put some people’s welfare over others’.
I think in this case it’s particularly important to be impartial, because EA is a group of people that benefitted a lot from FTX, so it seems wrong for us to try to transfer the harms it is now causing onto other people.
Maybe I’m misunderstanding bank runs, but as I understand it, they happen because:

1. the institution holding other people’s money doesn’t keep all of it in liquid form,
2. so it is unable to give the money back if everybody tries to withdraw it at once, and
3. when this happens, the institution runs out of money and the people who didn’t withdraw their cash in time lose their deposits.
I think the reason Richard listed #2 as a preference is that there might still be hope that FTX doesn’t run out of money in the first place and no one loses their deposits.
However, it might be that FTX will run out of money either way. In that case, speeding up the bank run will let some people get more of their money back, but only because they withdraw before other people do. In the end it’s a zero-sum game, because FTX only has a limited amount of liquid currency. If my model is correct, then there is no net benefit in speeding up the bank run.
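The zero-sum claim can be illustrated with a toy model. Everything here is a simplifying assumption of mine (fixed liquid reserves, withdrawals paid in order until the money runs out), not a description of FTX’s actual balance sheet:

```python
# Toy model: an institution with fixed liquid reserves pays out
# withdrawal requests in order until the reserves are exhausted.

def run_bank(liquid_reserves, withdrawal_requests):
    """Return the amount each depositor recovers, in request order."""
    recovered = []
    for amount in withdrawal_requests:
        paid = min(amount, liquid_reserves)  # pay what's left, if anything
        liquid_reserves -= paid
        recovered.append(paid)
    return recovered

# Four depositors are owed 430 in total, but only 250 is liquid.
# Run the same requests in two different orders.
order_a = run_bank(250, [100, 50, 200, 80])
order_b = run_bank(250, [80, 200, 50, 100])

# Changing who withdraws first changes *who* gets paid,
# not *how much* is paid out in total.
assert sum(order_a) == sum(order_b) == 250
```

Under these assumptions, speeding up the run just reshuffles the same capped pool of liquid money among depositors.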
Or better yet, at Y Combinator.
For Level 3: Machine Learning, this document might be useful. It provides a quick summary/recap of a lot of the math required for ML.
This looks exciting! I plan to apply.
One reaction I have, looking at the syllabus, is that the beginning is too theoretical for me. I feel like it would be better to have an applied component from the start. Maybe the first two weeks could pair theory with writing a short distillation in the first week, getting feedback, and then refining it. The feedback loop of actually writing is probably by far the best way to improve distillation skills. This is just an impression I have though, so I could be wrong.
Nice work! Just wanted to flag that the eaopp.com link is down for me.
Personally, I don’t have a problem with the title. It clearly states the central point of the post.
Regarding the example, spending $5k on EA group dinners is really not that much if it has even a 2% chance to cause one additional career change.
How much of the impact generated by the career change are you attributing to CEA spending here? I’m just wondering because counterfactuals run into the issue of double-counting (as discussed here).
Thanks!
I agree that there is an analogy to animal suffering here, but I think there’s a difference in degree. To longtermists, the importance of future generations is many orders of magnitude higher than the importance of animal suffering is to animal welfare advocates. Therefore, I would claim, longtermists are more likely to ignore non-longtermist considerations than animal welfare advocates would be to ignore other causes.
Thanks for writing this! It seems like you’ve gone through a lot in publishing this. I am glad you had the courage and grit to go through with it despite the backlash you faced.
Tyler Cowen again