Is this actually counter-intuitive? Most people seem to have quite a strong intuition that very very broad things like ‘speeding up technological progress/discoveries’, ‘general medical research’, ‘good policy’ and so on are very good things to work on or contribute to, and they mostly work by the many small benefits dynamic. In fact, I think if you asked non-EAs with an amenable mindset what the best ways of improving the world are, they’d come up with many more broad and relatively un-targeted things than narrow and relatively targeted things.
I think this is actually a good instinct, and the description of ‘global public goods’ sums up why. It only happens to be wrong, in my view, because the multipliers on human welfare you get just by transferring money from rich countries to poor countries are currently so high; an investment in (e.g.) American internet infrastructure that delivers 50x returns basically straight up loses to GiveDirectly in my book*, and I think GiveDirectly is some way from the most effective intervention if you’re willing to be risk-neutral.
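To make that comparison concrete, here is a minimal sketch of the standard log-utility reasoning I have in mind; the income figures are illustrative assumptions of mine, not data from anywhere:

```python
# A minimal sketch of the log-utility argument behind the rich-to-poor
# transfer multiplier. The income figures below are illustrative
# assumptions, not real data.

rich_income = 50_000  # hypothetical rich-country annual income, USD
poor_income = 500     # hypothetical cash-transfer recipient income, USD

# Under log utility, the welfare gain from an extra dollar is roughly
# proportional to 1/income, so the welfare multiplier on a direct
# transfer is approximately the income ratio.
transfer_multiplier = rich_income / poor_income  # ~100x

domestic_return = 50  # a domestic investment returning 50x in dollar terms

print(f"transfer multiplier: ~{transfer_multiplier:.0f}x")
print(f"50x domestic investment beats the transfer: "
      f"{domestic_return > transfer_multiplier}")  # False
```

On these (made-up but not unreasonable) numbers, even a genuinely excellent 50x domestic investment delivers only about half the welfare of simply moving the money to someone a hundred times poorer.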
Also, a trivial observation:
“When you’re at a crosswalk and a bus approaches, you can either wait for the bus to pass and then cross, or cross and make the bus wait for you. If you wait, you save one minute of time. If you make the bus wait, each person on that bus waits for one minute of time. If the bus has sixty people in it, that’s an hour that was just spent waiting. That sounds like a lot when framed in those terms, but it’s not something we ever think about when crossing the street.”
I actually routinely do think about this and act on it, mostly by almost never pressing the traffic light button unless not doing so would leave me waiting indefinitely because the road is too busy. It’s normally just not worth holding up the whole queue of traffic if a gap will present itself shortly, and I’ve considered this pretty obvious for as long as I can remember.
*I’m over-simplifying for effect here, but I have seen cases within EA where this point seems to just get missed or underrated. 100x payoffs are really hard to find. They should be hard to find. You should be very surprised if you stumble across one.