What did I say warranted this comparison? I am not saying anything anyone has said is irrelevant to EA. I am saying that the argument Will made, that for-profit businesses can't do much harm because their worst outcome is bankruptcy, is misleading. I am also saying that encouraging people to earn to give without considering the harm that can be done by earning is one of the ways the EA movement could end up having a net negative impact.
To be clear, stating that I was confused was a polite way of indicating that I think this reasoning itself is confused. Why should we evaluate for-profit businesses only from a profit-maximizing perspective? Having profitability as the primary goal of an enterprise doesn't preclude that enterprise from doing massive amounts of harm. If a for-profit enterprise does harm in its attempts to make a profit, should we ignore that harm simply because it has succeeded in turning a profit? If your interpretation of Will's reasoning is what he intended, then he is asking us to compare aiming to do good and aiming to make a profit by evaluating each on different criteria. Generally, this is a misleading way of making comparisons.
This is important because this sort of reasoning is used to justify an uncircumspect version of encouraging people to earn to give that I see coming from this community. As I've seen it, the argument goes that in your career you should focus on acquiring wealth rather than doing good, because you can then use that wealth to do good by being an effective altruist. But this ignores that you can potentially do more harm in acquiring great wealth than you can compensate for by using your wealth altruistically.
As I mentioned, I think there are good reasons to believe the money Bankman-Fried and Moskovitz are contributing to the EA community was acquired in ways that may have caused significant harm. This doesn't mean the EA community should reject the money, and if it is used judiciously, Bankman-Fried's and Moskovitz's successes in acquiring wealth may well turn out to be net positives for humanity. But it is also not hard to think of examples where so much harm was done in acquiring a charitable organization's funding that even if the charity does a lot of good, it won't be a net positive. For example, consider the Sackler family, who, in the process of acquiring their wealth, were happy to ignore that they were creating tens of thousands of opioid addicts. Their charitable work (mostly donating to already well-funded museums, as far as I know) probably also wouldn't be evaluated well by the EA community. But the real harm happened even before they attempted to clean up their reputations through charitable donations.
In light of the harms that can be caused in acquiring wealth, I think the EA community should be more circumspect about encouraging people to earn to give. Succeeding in business to maximize your positive impact through charitable work isn't necessarily bad advice, but you do need to account for the harms businesses can do.
Since we agree scale is a key part of this, I don't know how you can be confident that an imagined fake charity that disrupts medical services or the food supply would ever be large enough to equal the scale of the harms caused by some of the most powerful global corporations. In this era of extreme wealth inequality, it's plausible that some billionaire could accrue massive personal wealth and then transfer that wealth to a non-profit, which, freed from the shackles of having to make a profit, could focus solely on doing harm. But equally, that billionaire could found an institution focused on turning a profit while doing harm and use the profits to grow the institution to a scale, and concomitant harm, that far exceeds what they would have been able to achieve with a non-profit.
For example, if we are going to imagine a fake charity that disrupts the delivery of a medical service, why don't we imagine that it does this by acting as a middleman that resells medical supplies at an unconscionable markup? That profit, in turn, enables it to grow and slip its "services" between more people and their medical providers. While this may seem like a criminal enterprise, for many companies that exist today this is basically their business model, and they operate at scales that eclipse those of most medical non-profits I know of.
Being a non-profit does provide good cover for operating in a harmful fashion, but growth through the accumulation of capital is a very powerful mechanism. I don't think we should be surprised to find that the largest harm-causing institutions use it as their engine.
My point wasn't that charities are incapable of doing harm. There are many examples of charities doing harm, as you point out. The point is that reasoning that non-profits have more potential to cause harm than for-profits seems to ignore that many for-profit enterprises operate at a much larger scale than any non-profit and do tremendous amounts of harm.
In a real business, overwhelming effort is being made to make sure the business is successful. In the hundred trillion dollar world economy, almost no one is paying money to help or harm people.
Yes, most successful businesses are primarily focused on making a profit rather than doing good or harm. But this doesn't mean they aren't willing to do harm in the pursuit of profit! If someone dedicates an amount of money to a business that then grows large enough to do lots of harm, even if only as a side effect, it's quite conceivable they could accomplish more total harm than someone simply dedicating that same money to a directly harmful (but not profitable) venture.
There’s one huge difference between aiming to do good and aiming to make profit. If you set up a company aiming to make money, generally the very worst that can happen is that you go bankrupt; there’s a legal system in place that prevents you from getting burdened by arbitrarily large debt. However, if you set up a project aiming to do good, the amount of harm that you can do is basically unbounded.
I am very confused by this reasoning. It seems clear that a for-profit enterprise can cause far worse harm than simply going bankrupt. What about weapons manufacturers or fossil fuel and tobacco companies? There are many industries that profit from activities that many people would consider a net harm to humanity.
The key difference I see with a non-profit enterprise aiming to do good is that its scale is driven by an external factor: the choices of donors. The harm a non-profit can cause is bounded simply because the funding it receives from its donors is bounded. In contrast, a successful for-profit enterprise has a mechanism to scale itself up by using its profits to grow. The practical implication is that for-profit corporations do end up growing to scales where they have great potential to do a lot of harm.
None of which is to say that the effective altruism movement, which as you indicate is expecting many billions of USD in funding, doesn't have great potential to do harm. It does need to take that responsibility seriously. Perhaps more importantly though, seeing as the EA movement is encouraging people to earn to give, it behooves the movement to consider the harms caused in the process of earning. Moskovitz's wealth derives from Facebook, which has arguably done great harm globally by, among many other things, helping organize the genocide of Rohingya Muslims in Myanmar and the January 6 Capitol insurrection in the US. Bankman-Fried's wealth derives from arbitrage on bitcoin sales and other crypto-related ventures. Cryptocurrency wealth currently has a massive externality: the CO2 emissions produced by running energy-intensive proof-of-work algorithms on fossil power. Bankman-Fried isn't responsible for emissions equivalent to what he would have produced had he generated all his wealth by mining bitcoin on fossil power, but he is certainly responsible for some fraction (50%? 20%?) of that.
Maybe if the EA community is judicious in allocating the capital Moskovitz and Bankman-Fried are planning to provide, it will become quite clear that the benefits of that earning outweighed its harms. The funding of carefully considered public health measures and of efforts to give directly to people living in poverty raises my confidence that the EA community has a chance of achieving this. However, funding efforts to mitigate the existential risk of imagined future AIs while ignoring institutes like the Algorithmic Justice League, which seeks to understand the harms already being caused by existing AI algorithms, lowers my confidence.
In his post announcing the newfound wealth of the EA movement stemming from FTX, Will included the argument quoted above for why charitable enterprises are more dangerous than for-profit companies.
At the time, I remarked on how wrongheaded this seemed to me. Of course for-profit companies can do a large amount of harm! In fact, because for-profit companies have the ability to use their profits to increase their scale, they have the potential to do immense harm.
Hopefully, the FTX fallout makes abundantly clear the original point I was trying to make and encourages some deeper reflection in this community about how the "earn" part of earn to give has the potential to cause great harm.