Will MacAskill appears to be ignoring these questions. For example, he was recently interviewed about FTX by Sam Harris¹ and made no mention of any whistleblowing in his account. He also gave the impression that he barely knew SBF, describing only a few fairly shallow interactions (not at all the impression I’d received while SBF was still in favour).
The interview portion of the episode was 80 minutes, so it wasn’t for lack of time.
I’ve been waiting for a response from Will – a full explanation and (if things are as they seem) a sincere mea culpa. I would expect no less of myself, and I expect more from someone who has held such responsibility.
Based on public information, it seems to me that Will exercised very poor judgement and a lack of moral leadership. And he now appears to be avoiding responsibility with a distorted retelling of events.
I hope and expect that his role in EA in future is restricted to that of a philosopher, and not in any sense that of a leader.
(These arguments might also apply in varying degrees to other leaders who were involved with SBF, or who ignored/hushed whistleblowers – however I’m less familiar with their roles.)
If this continues, and if Will’s lack of spine is matched by that of the rest of the EA leadership², I’ll sadly continue to drift away from EA.
¹ Making Sense podcast, “#361 Sam Bankman-Fried & Effective Altruism”, 2 April 2024.
² Edited to add this note: I’m using “EA leadership” here as shorthand for the leaders of influential EA orgs – of course EA itself is not an organisation. And some or many in leadership are no doubt working or lobbying in ways we can’t see – my strongly worded comment isn’t intended as a blanket criticism.
Edit: I meant to add that I imagine Will to have acted in the best interests of EA as he saw it. I imagine him not as corrupt himself, but as giving space to corruption through poor judgement. I wouldn’t want to expel him from the community; I just wouldn’t want to see him in any position of leadership. (And that’s all based on the events as I understand them.) I will listen with interest to the new Clearer Thinking episode Will mentions in his comment.
This works best in the sense of “stretch goals” and “scrunch goals”.
For a lot of EAs, their stretch goals are audacious and world-changing, and I support this! I wouldn’t want to lower that achievement bar.
But audacious plans are risky, and health problems and personal limitations get in the way. A scrunch goal might be “minimise harm and be kind.”