It seems to me that there’s a difference between financial investment and EA bets: returns on financial bets can then be reinvested, whereas returns on most EA bets are not more resources for the EA movement but direct positive impact that helps our ultimate beneficiaries. So we can’t get compounding returns from these bets.
So, except for when we’re making bets to grow the resources of the EA movement, I don’t think I agree that EA making correlated bets is bad in itself—we just want the highest EV bets.
Does that seem right to you?
Hmm, I don’t think I agree.
I think the most powerful form of compounding in the EA movement context is of people and reputation, which are upstream of money and influence. Great people + great reputation → more great people + more great reputation.
Most endeavours over long periods of time have some geometric/compounding aspects, and some arithmetic aspects.
But usually, I think compounding is more important: it’s how you avoid ruin, and it’s how you get really big returns. (Outside a compounding context, ruin isn’t a big deal unless you use log utility, which is equivalent to caring about compounding.)
Successful countries weren’t built in a day. Successful charities weren’t built in a day. Many things have to go right, and some things must not happen, for a movement to succeed. That’s essentially just compounding.
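The ruin point can be illustrated with a toy simulation (the payoff numbers here are made up for illustration): a repeated bet can have positive arithmetic expected value per round yet negative expected log growth, so under compounding almost every path decays toward zero.

```python
import random

# Hypothetical repeated bet: each round the bankroll is multiplied by 1.5
# (win) or 0.6 (loss) with equal probability. Arithmetic EV per round is
# positive (0.5*1.5 + 0.5*0.6 = 1.05), but expected log growth is negative
# (0.5*ln(1.5) + 0.5*ln(0.6) ≈ -0.053), so compounding ruins the typical path.
def median_final_bankroll(rounds: int, trials: int) -> float:
    """Median bankroll after compounding the bet `rounds` times."""
    finals = []
    for _ in range(trials):
        bankroll = 1.0
        for _ in range(rounds):
            bankroll *= 1.5 if random.random() < 0.5 else 0.6
        finals.append(bankroll)
    finals.sort()
    return finals[len(finals) // 2]

random.seed(0)
# The typical (median) outcome ends far below the starting bankroll of 1.0,
# even though the arithmetic EV grows as 1.05 ** rounds.
print(median_final_bankroll(rounds=100, trials=10_001))
```

This is the Kelly-style intuition: when outcomes compound, maximizing expected log growth (not raw expected value) is what avoids ruin.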
Good point
One might counter that the majority of decisions EAs make affect the reputation of EA, which can then be drawn on later. Though I doubt most orgs’ cost-benefit analyses include the movement’s reputation change.
Also, maybe there is some mechanism whereby the world getting better on certain dimensions unlocks EA paths that didn’t exist before. But in most cases this doesn’t seem super plausible.
I agree the argument doesn’t work, but there are at least two arguments for investing in charities with sub-optimal expected values that critically depend on time.
Going bust. Suppose you have two charity investments with expected values E[X_t] = x_t and E[Y_t] = y_t. Here x_1 > y_1, but there’s a potential for y_t > x_t in the future, for instance because you receive better information about the charities. If you invest only once, investing everything in X is the correct answer, since x_1 > y_1. Now suppose that each time you don’t invest in Y, it has a chance of going bust. Then, if you invest more than once, it would be best to invest something in Y, provided the probability of Y going bust is high enough and y_{t+1} > x_{t+1} with sufficiently high probability.
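A minimal two-period sketch of this argument, with all numbers invented for illustration: funding Y a little in period 1 keeps the option of switching to Y alive in period 2, which can beat going all-in on X despite X's higher period-1 expected value.

```python
# Hypothetical two-period illustration of the "going bust" argument.
# Period 1: X yields x1 = 1.0 per unit donated, Y yields y1 = 0.8.
# Period 2: with probability p_good, new information reveals y2 = 2.0
# (otherwise y2 = 0.8); x2 stays at 1.0. If Y receives nothing in
# period 1, it goes bust with probability p_bust and is gone in period 2.
def total_ev(share_to_y: float, p_good: float = 0.5, p_bust: float = 0.9,
             x1: float = 1.0, y1: float = 0.8, x2: float = 1.0,
             y2_good: float = 2.0, y2_bad: float = 0.8) -> float:
    """Expected impact of splitting a unit budget each period."""
    period1 = (1 - share_to_y) * x1 + share_to_y * y1
    # Y survives for sure if funded at all; otherwise only with prob 1 - p_bust.
    p_survive = 1.0 if share_to_y > 0 else 1 - p_bust
    ev_y2 = p_good * y2_good + (1 - p_good) * y2_bad
    # In period 2, put everything into whichever option has higher EV.
    period2 = p_survive * max(x2, ev_y2) + (1 - p_survive) * x2
    return period1 + period2

print(total_ev(0.0))  # all-in on X both periods
print(total_ev(0.1))  # give 10% to Y in period 1 to keep it alive
```

Under these (made-up) parameters, keeping Y alive wins: the small period-1 sacrifice (0.02 in EV) buys the guaranteed option of funding Y in period 2, whose unconditional expected value (1.4) exceeds X's (1.0).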
Signaling effects. Not investing in the charity Y may signal to charity entrepreneurs that there is nothing to gain by starting a new charity similar to Y, thus limiting your future pool of potential investments. I can imagine this being especially important if your calculation of the expected value is contentious, or if E[Y_t] has high epistemic uncertainty.
Edit: I think the “going bust” example is similar in spirit to the Kelly criterion, so I suppose you might say the argument does work.