This is an extremely rich guy who isn’t donating any of his money. I wouldn’t call him ‘aligned’ at all to EA.
I would also just be careful about taking him at his word. He’s only started talking in this framing recently (I’ve followed him for a while because of a passing interest in Kernel). He may well just be a guy who’s very scared of dying, with an incomprehensible amount of money to spend on it, who’s looking for some admirers.
FWIW, I totally don’t consider “donating” a necessary component of taking effective altruistic action. Most charities seem much less effective than the most effective for-profit organizations, and most of the good in the world seems achieved by for-profit companies.
I don’t have a particularly strong take on Bryan Johnson, but using “donations” as a proxy seems pretty bad to me.
Hmm, I think having the mindset behind effective altruistic action basically requires you to feel the force of donating. It’s often correct to not donate because of some combination of expecting {better information/deconfusion, better donation opportunities, excellent non-donation spending opportunities, high returns, etc.} in the future. But if you haven’t really considered large donations or don’t get that donating can be great, I fail to imagine how you could be taking effective altruistic action. (For extremely rich people.) (Related indicator of non-EA-ness: not strongly considering causes outside the one you’re most passionate about.)
(I don’t have context on Bryan Johnson.)
“Most charities seem much less effective than the most effective for-profit organizations”
This is a big discussion, but I would be interested to see you justify this. I would say many of the biggest GHD achievements and much important work are driven by not-for-profit organizations like charities and governments (the global vaccine alliance, university research institutions, etc.), but obviously it’s a complicated discussion.
Obviously a market economy drives much of it, but I consider this more the water we swim in than the capitalist system doing the good itself.
I would be interested to hear which for-profit businesses you think are counterfactually doing the most good on the margin.
I take a very longtermist and technology-development focused view on things, so the GHD achievements weigh a lot less in my calculus.
The vast majority of world-changing technology was developed or distributed through for-profit companies. My sense is that nonprofits are also more likely to cause harm than for-profits (for reasons that would require an essay of their own to go into, but are related to their lack of feedback loops).
On a separate claim, I find it really hard to discount the rough period since ~1800, in which a huge amount of new technological development took place in academic or other non-profit contexts (including militaries). When you add pre-production research to that, I think you’d be hard-pressed to find a single world-changing technology since the Enlightenment that doesn’t owe a lot of its existence to non-profit research. Am I misunderstanding your claim?
Academia before the mid-20th century was a for-profit enterprise. It did not receive substantial government grants and was indeed often very tightly intertwined with the development of industry (much more so than today).
Indeed, the degree to which modern academia is operating on a grant basis and has adopted more of the trappings of the nonprofit space is one of the primary factors in my model of its modern dysfunctions.
Separately, I think the contribution of militaries to industrial and scientific development is overrated, though that also would require a whole essay to go into.
I disagree-voted because the latter sounds like a very extraordinary claim. I know you don’t have the time to go into an essay on this, but do you mind sketching the rough logic?
“Most charities seem much less effective than the most effective for-profit organizations, and most of the good in the world seems achieved by for-profit companies.”
I disagree, but even if I did agree: per dollar of investment, I think the best charities far outperform the best for-profit companies in terms of social impact, and we can do a reasonable job of identifying the best charities, such that donating a lot of money to these charities should be seen as a necessary component of being EA-aligned if you’re rich.
I think Peter might be hoping people read this as “a rich and influential guy might be persuadable!” rather than “let’s discuss the minutiae of what constitutes an EA”. I’ve watched quite a few of Bryan’s videos and honestly I could see this guy swing either way on this (could be SBF, could be Dustin, honestly can’t tell how this shakes out).
Yeah, I think that’s part of it. I also thought it was very interesting how he justified what he was doing as important for the long-term future, given the expected emergence of superhuman AI. E.g., he is running his life by an algorithm in the expectation that society might be run in a similar way.
I will definitely say that he comes across as hyper-rational and low-empathy in general, but there are also some touching moments here where he clearly cares a lot for his family and really doesn’t want to lose them. It could all be an act, of course.
Thanks for the input!
I think of EA as a cluster of values and related actions that people can hold and practice to different extents: for instance, caring about social impact, seeking comparative advantage, thinking about long-term positive impacts, and being concerned about existential risks including AI. He touched on all of those.
It’s true that he doesn’t mention donations. I don’t think that discounts his alignment in other ways.
Useful to know he might not be genuine though.
But cf. the “stages of change” in the transtheoretical model of behavior change. A lack of action suggests he has not reached the action stage, but could be in the contemplation or preparation stages.
Yeah, he could be planning to donate money once his attempt to reduce or overcome mortality is resolved.
He said several times that what he’s doing now is only part one of the plan, so I guess there is an opportunity to withhold judgment and see what he does later.
Having said all that, I don’t want to come across as trusting him. I just heard the interview and was really surprised by all the EA themes that emerged and the narrative he proposed for why what he’s doing is important.
That’s not falsifiable
Edit: I stand by this; it was a quick way to explain the problems with Jason’s comment. I don’t think we should be too mean to people for not donating (so as not to dissuade them from doing it in the future), but this particular model could be used to excuse basically any behaviour as ‘they might be a potential EA one day’. I don’t think it’s a good defence and wouldn’t want to see it trotted out more often.
Definitely not all of them, but most EAs are extremely rich guys who aren’t donating any of their money.
Thanks for sharing your opinion. What’s your evidence for this claim?
https://forum.effectivealtruism.org/posts/nb6tQ5MRRpXydJQFq/ea-survey-2020-series-donation-data#Donation_and_income_for_recent_years, and personal conversations which make me suspect the assumption of non-respondents donating as much as respondents is excessively generous.
Not donating any of their money is definitely an exaggeration, but they’re not donating more than the median rich person: https://www.philanthropyroundtable.org/almanac/statistics-on-u-s-generosity/
Thanks for following up! The evidence you offer doesn’t persuade me that most EAs are extremely rich guys, because it isn’t arguing that. Did you mean to claim that most EAs who are rich guys are not donating any of their money, or not donating more than the median rich person?
I also don’t feel particularly persuaded by that claim based on the evidence shared. What are the specific points in the links that are persuasive? I couldn’t see anything particularly relevant from scanning them, i.e., nothing I could use to make an easy comparison between EA donors and median rich people.
I see that “Mean share of total (imputed) income donated was 9.44% (imputing income where below 5k or missing) or 12.5% without imputation” for EAs, versus “around 2-3 percent of income” for US households, which seems opposed to your position. But I haven’t checked carefully, and I am not the kind of person who makes these sorts of careful comparisons very well.
I don’t have evidence to link to here, or time to search for it, but my current belief is that most of EA’s funding comes from rich and extremely rich people (often men) donating their money.