To use your plane analogy, there have been 3 planes (billionaire donors) and 2 have crashed. I don’t know exactly what to do to solve the problem, but I do think that EA needs to be more open to external pragmatism.
I upvoted this because your plane analogy is fantastic, and epistemic-downvoted this because “EA needs to be more open to external pragmatism” could mean a lot of things, including the obvious “EA needs to get better at unknown unknowns and people who understand them” but simultaneously also a dog whistle for “underdogs like me should be in charge instead” or “EA should be more in-line with status quo ideologies that already have 100 million people”.
I also do weasel words a lot, so I know what I’m talking about.
It’s fair to criticise the weasel words. What I meant by external pragmatism was more about operations than cause prioritisation — e.g., we should learn the lessons of decades of governance, managerial practice, and evidence about how to actually get things done.
There are more than 3 billionaire donors.