I disagree with this. I think we should receive money from basically arbitrary sources, but I think that money should not come with associated status and reputation from within the community. If an old mafia boss wants to buy malaria nets, I think it’s much better if they can than if they cannot.
I think the key thing that went wrong was that in addition to Sam giving us money and receiving charitable efforts in return, he also received a lot of status and in many ways became one of the central faces of the EA community, and I think that was quite bad. I think we should have pushed back hard when Sam started being heavily associated with EA (and e.g. I think we should have not invited him to things like coordination forum, or had him speak at lots of EA events, etc.)
I guess it also depends on where the funding is going. If a bloody dictator gives a lot of money to GiveDirectly or another charity that spends it on physical goods (anti-malaria nets) which are obviously good, then it’s still debatable but there’s less concern. But if the money is used in an outreach project to spread ideas, then it’s a terrible outcome. It’s similarly dangerous for research institutions.
What’s the specific mistake you think was made? Do you think e.g. “being very good at crypto / trading / markets” shouldn’t be on its own sufficient to have status in the community?

Edit: Answered elsewhere.
“Old mafia boss”? How about Vladimir Putin?
I tend to lean in your direction, but I think we should base this argument on the most radioactive relevant modern case.
I would be glad to see Putin have fewer resources and to see more bednets being distributed.
I do think the influence angle is key here. If Putin were running a random lottery where he chose any organization in the world to receive a billion dollars from him, and it happened to be my organization, I think I should keep the money.

It gets trickier if we think about Putin giving money directly to me, because presumably he wants something in return. But if there were genuine proof he didn’t want anything in return, I would be glad to take the money, especially if the alternative is that it fuels his war in Ukraine.
Right, I agree that it’s good to drain his resources and turn them into good things. The problem is that right now, our model is “status is a voluntary transaction.” In that model, when SBF, or in this example Putin, donates, they are implicitly requesting status, which their recipients can choose to grant or not.
I don’t think grantees—even whole movements—necessarily have a choice in this matter. How would we have coordinated to avoid granting SBF status? Refused to have him on podcasts? But if he donates to EA, and a non-EA podcaster (maybe Tyler Cowen) asks him, SBF is free to talk about his connection and reasoning. Journalists can cover it however they see fit. People in EA, perhaps simply disagreeing, perhaps because they hope to curry favor with SBF, may self-interestedly grant status anyway. That wouldn’t be very altruistic, but we should be seriously examining the degree to which self-interest motivates people to participate in EA right now.
So if we want to be able to accept donations from radioactive (or potentially radioactive) people, we need some story about how that avoids granting them status in ways that are out of our control. How do we prevent journalists, podcasters, a fraction of the EA community, and the donors themselves from constructing a narrative of the donor as a high-status EA figure?
I think my favorite version of this is something like “you can buy our scrutiny and time.” If you donate to EA, we will pay attention to you, and we will grill you in the comments section of our forum. In some sense this is an opportunity for you to gain status, but it’s also an opportunity to lose a lot of status, if you don’t handle yourself well in those situations.
I think a podcast where someone grilled SBF on his controversial stances would have been great. Indeed, I was actually planning a public debate with him in February, where I intended to bring up his reputation for dishonesty and his involvement in politics that seemed pretty shady to me, but some parts of EA leadership actively requested that I not do it, since it seemed too likely to explode somehow and reflect really badly on EA’s image.
I also think repeatedly saying that we don’t think he is a good figurehead for the EA community, not inviting him to the coordination forum and other leadership events, etc., would have been both good and possible.
Indeed, right now I am talking to a bunch of people about similar situations: we are associated with a number of AI capabilities companies, and there are people in policy whom I don’t want to support, but who are working on things relevant to us and useful to coordinate with (and sometimes give resources to). I think we could just make a public statement like “despite the fact that we trade with OpenAI, we also think they are committing a terrible atrocity, and we don’t want you to think we support them.” I think this would help a lot, and doesn’t seem that hard. And if they don’t want to take the other side of that deal, and will only trade with us if we say we think they are great, then we shouldn’t trade with them.
This is an issue I have with optimizing for image: you aren’t able to speak out against a thought leader because they’re successful, and EA optimizing for seeming good is how we got into this mess in the first place.
I support these actions, conditional on them becoming common-knowledge community norms. However, trading with bad actors while publicly projecting that we don’t support them is strictly less likely than just trading with bad actors.