Nice! This is great pushback! I think most of my would-be responses are covered by other people, so I'll add one thing just on this:
Even absent these general considerations, you can see it just by looking at the major donors we have in EA: they are generally not lottery winners or football players, they tend to be people who succeeded in entrepreneurship or investment, two fields which require accurate views about the world.
My experience isn’t this. I think that I have probably engaged with something like ~15 >$1M donors in EA or adjacent fields. Doing a brief exercise in my head of thinking through everyone I could, I got to something like:
~33% inherited wealth / family business
~40% seems like they mostly “earned it” in the sense that it seems like they started a business or did a job well, climbed the ranks in a company due to their skills, etc. To be generous, I’m also including people here who were early investors in crypto, say, where they made a good but highly speculative bet at the right time.
~20% seems like they did a lot of very difficult work, but also seem to have gotten really, really lucky—e.g. grew a pre-existing major family business a lot, were roommates with Mark Zuckerberg, etc.
Obviously we don’t have the counterfactuals on these people’s lucky breaks, so it’s hard for me to guess what the world would look like if they hadn’t had them, but I’d guess they’d at least be at a much lower giving potential.
~7% I’m not really sure about.
So I’d guess that even trying this approach, only something like 50% of major donors would pass the filter—and it seems possible that luck also played a major role for many of those 50% and I just don’t know about it. I’m surprised you find the overall claim bizarre, though, because to me it often feels somewhat self-evident from interacting with people at different wealth levels within EA: the best-calibrated people often seem to be, like, mid-level non-executives at organizations, who don’t have the information distortions that come with power but do have deep networks, expertise, and a sense of the entire space. I don’t think ultra-wealthy people have worse views, to be clear—just that wealth and having well-calibrated, thoughtful views about the world seem unrelated (or, to the extent they are correlated, those differences stop being meaningful below the wealth of the average EA donor), and certainly a default of “cause prioritization is directly downstream of the views of the wealthiest people” is worse than many alternatives.
I strongly agree about the clunkiness of this approach, though, and with many of the downsides you highlight. In my ideal EA, lots and lots of things like this would be tried, the good ones would survive and iterate, and EAs would generally experiment with different models for distributing funding—so this is my humble submission to that project.