Hey Greg, an estimate can be very highly correlated with reality and still be terribly biased. For instance, I can estimate that I will save 100, 200 and 300 lives in each of the next three years, respectively. In reality, I will actually save 1, 2 and 3 lives. My estimates explain 100% of the variance there, but there’s a huge upward bias. This could be similar to the situation with GiveWell’s estimates, which have dropped by an order of magnitude every couple of years. Does your approach guard against this? If not, I don’t think it would deserve to be taken literally.
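For what it’s worth, here’s a minimal sketch (plain Python/NumPy, using the toy numbers from my example above) showing how the correlation can be perfect while the bias is still enormous:

```python
import numpy as np

# Toy numbers from the example above: estimated vs. actual lives saved per year.
estimates = np.array([100.0, 200.0, 300.0])
actuals = np.array([1.0, 2.0, 3.0])

# Pearson correlation is exactly 1: the estimates track reality perfectly in shape...
correlation = np.corrcoef(estimates, actuals)[0, 1]

# ...yet they overshoot by roughly two orders of magnitude.
mean_bias = np.mean(estimates - actuals)   # 198 lives/year overstated on average
mean_ratio = np.mean(estimates / actuals)  # 100x too optimistic

print(f"correlation: {correlation:.3f}")   # 1.000
print(f"mean bias:   {mean_bias:.1f}")     # 198.0
print(f"mean ratio:  {mean_ratio:.1f}x")   # 100.0x
```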
which have dropped by an order of magnitude every couple of years
This seems like a really important point, and I wonder if anyone has blogged on this particular topic yet. In particular:
How should we expect this trend to continue?
Does it increase the activation energy for getting involved in EA? (my interest in EA was first aroused by GW and how cheap they reckoned it was to save a life via VillageReach)
Does it affect the claim that a minority of charities are orders of magnitude more effective than the rest?
If we become able to put numbers to the effectiveness of a new area, such as xrisk or meta, would we expect to see the same exponential drop-off in our estimates even if we’re aware of this problem?