I don’t quite know how I feel about this perspective. On one hand, everyone has ways to improve, so if you aren’t finding them you probably aren’t looking hard enough. On the other hand, just because some number of people say something doesn’t mean they’re correct.
What are the changes that you think should be made that have the strongest case?
I’ve written a bunch of stuff on this recently, so in that sense I’m biased. But my suggestions have generally been:
More transparency from core EA orgs (GiveWell seems to set a good benchmark here, and even more so the charities it recommends)
More division of responsibility among EA orgs (i.e. more orgs doing fewer things each) - especially (but not exclusively) having separate orgs for EA PR and special ops
More carrots and/or more sticks to incentivise staff at EA orgs to perform their jobs adequately
Less long-term reliance on heuristic reasoning (e.g. ITN, the orthogonality thesis, existential risk, single-author assessments of particular events)
What are the changes that you think should be made that have the strongest case?
The next red-teaming competition should include a forecasting contest: “What is the worst thing to happen to EA in 2023?” First, two winners would be selected for “best entries ex ante”. Then, in January, we see if anyone actually predicted the worst hazard that happened.
Give this person a prize.
If it were my choice, I’d likely give it a small prize, but not a large one, since in my view it is only vaguely in the general vicinity of what actually happened.