Thanks!
I don’t know much about LW/ESPR/SPARC, but I suspect a lot of their impact flows through convincing people of important ideas and/or the social aspect, rather than through improving community epistemics/integrity?
Some of the sorts of outcomes I have in mind are just things like altered cause prioritisation, different projects getting funded, generally better decision-making.
Similarly, if the goal is to help people think about cause prioritisation, I think fairly standard EA retreats/fellowships are quite good at this? I’m not sure we need an intermediary step like “improve community epistemics”.
Appreciate you responding and tracking this concern though!
Well said!
Unfortunately, I think the uncertainty we all face goes even deeper. There’s no EA sorting hat, and there’s also no one who can tell you whether you really made the right call or had the kind of impact you wanted to have. No one will find you after your career, shake your hand and offer you an impact scorecard.
Maybe it’s a bit easier to figure this out than sorting, because you can look at the work you’ve done, estimate some counterfactuals and weigh it up yourself. But I also see some people saying things like “I got the EA-aligned job someone recommended, so hooray, I’m now having a bunch of impact”. Maybe! But it’s hard to say, and I recommend getting comfortable with that uncertainty.