And on a personal note, I aspire to create a lot of value for the world, and direct it towards doing lots of good. Call me overconfident, but I expect to be a billionaire someday. The way EA treats SBF here sets a precedent: if the EA community is happy to accept money when the going is good, but then is ready to cut ties once the money dries up… you can guess how excited I would be to contribute in the first place.
This is a weird paragraph. If your goal were doing the most good, why would it matter how you expect EA to treat you in the case of failure? It kinda sounds like your goal is social status among the EA community.
This isn’t to say that you don’t have a good point. If people are donating to EA because they want social status, that’s still money going towards good causes, and perhaps we should reward them for that in order to encourage more people to do so. But I’d have a hard time calling that “altruistic behavior” on their part.
If your goal were doing the most good, why would it matter how you expect EA to treat you in the case of failure?
Because he’s a human being, and human beings need social support to thrive. I think it’s false to equate this perfectly fine human need with a lower motive like status-seeking. If we want people to try hard to do good, we as a community should still be there for them when they fall.
I don’t think it’s either/or. I think it’s consistent for Austin’s philanthropy to be primarily motivated by altruism and for him to also feel scared of the prospect of his community turning on him when he makes a mistake, perhaps to the point of putting him off the whole idea completely. And I’d expect most EAs to have a similar mix of motivations.
Yeah, idk, it’s actually less of a personal note than a comment on decision theory among future and current billionaires. I guess the “personal” side is where I can confidently say “this set of actions feels very distasteful to me”, because I get to make claims about my own sense of taste; and I’m trying to extrapolate that to other people who might become meaningful donors in the future.
Or maybe: this is a specific rant at the “EA community”, separate from “EA principles”. I hold my association with the “EA community” quite loosely; I only actually met people in this space this year, as a result of Manifold, whereas I’ve been donating to and reading about EA for 6ish years. The EA principles broadly make sense to me either way; I guess I’m trying to figure out whether the EA community is composed of people I’m happy to associate with.