There are writing issues and I’m not sure the net value of the post is positive.
But your view seems ungenerous; ideas in paragraphs like this one seem relevant:
This isn’t a snide jab at Will MacAskill. He in fact recognized this problem before most and has made the wise choice of not serving as CEO of CEA for a decade now, even though he could have kept the job indefinitely had he wanted to.
This is a general problem in EA: many academics have had to repeatedly learn that they have little to no comparative advantage, if not a comparative disadvantage, in people and operations management.
Some of the individuals about whom there is the greatest concern of ending up in a personality cult, information silo, or echo chamber, such as Holden, are putting significant effort into avoiding becoming out of touch with reality and into minimizing any negative, outsized impact of their own biases.
Yet it’s not apparent that Musk makes any similar effort. So what reasons, if any, specific to Musk’s personality cause him to be so inconsistent in the ways effective altruists should care about most?
I understood the heart of the post to be in the first sentence: “what should be of greater importance to effective altruists anyway is how the impacts of all [Musk’s] various decisions are, for lack of better terms, high-variance, bordering on volatile.” While Evan doesn’t provide examples of what decisions he’s talking about, I think his point is a valid one: Musk is someone who is exceptionally powerful, increasingly interested in how he can use his power to shape the world, and seemingly operating without the kinds of epistemic guardrails that EA leaders try to operate with. This seems like an important development, if for no other reason than that Musk’s and EA’s paths seem more likely to collide than diverge as time goes on.
What you said seems valid. Unfortunately, however, it seems low-EV to discuss this subject at length. The new EA communications and senior staff may already be paying attention to these issues, and for a number of reasons that seems best in this situation. If that’s not adequate, it seems reasonable to push them on it or ask them about it.
I’m thinking of asking such people what they’re doing, but I also intend to request feedback from them and others in EA on how to communicate related ideas better. I’ve asked this question to check whether there are major factors I might be missing, as a prelude to a post with my own views. That post would be high-stakes enough that I’d put in the effort to write it well, effort I didn’t put into this question post. I might title it something like “Effective Altruism Should Proactively Help Allied/Aligned Philanthropists Optimize Their Marginal Impact.”
Other than at the Centre for Effective Altruism, who are the new/senior communications staff it’d be good to contact?
As I write in my answer above, I think high-variance and volatile decisions are kind of just the name of the game when you are trying to make billions of dollars and change industries in a very competitive world.
Agreed that Musk is “operating without the kinds of epistemic guardrails that EA leaders try to operate with”, and that it would be better if Musk were wiser. But it is always better if people were wiser, stronger versions of themselves! The problem is that people can’t always change their personalities very much, and furthermore it’s not always clear (from the inside) which direction of personality change would be an improvement. The problem of “choosing how epistemically modest I should be” is itself a deep and unsettled question.
(Devil’s advocate perspective: maybe it’s not Musk that’s being too wild and volatile, but EAs who are being too timid and unambitious—trying to please everyone, fly under the radar, stay apolitical, etc! I don’t actually believe this 100%, but maybe 25%: Musk is more volatile than would be ideal, but EA is also more timid than would be ideal. So I don’t think we can easily say exactly how much more epistemically guard-railed Musk should ideally be, even if we in the EA movement had any influence over him, and even if he had the capability to change his personality that much.)
I agree that Musk should have more epistemic guardrails, but also that EA should be more ambitious and less timid, while remaining tactful. Trying to always please everyone, stay apolitical, and fly under the radar can constitute extreme risk aversion, which is a risk in itself.
Strongly upvoted. You’ve put my main concern better than I knew how to put it myself.