Superforecasters can predict more accurately when they make predictions in 1% increments rather than 2% increments. Whether they can make meaningful use of even finer increments either hasn’t been studied or has turned up negative evidence. 0.01% increments are far below anything people regularly forecast on; there’s no way to develop calibration at that level. In my comment, I meant to point out that anyone who thinks they’re calibrated enough to talk about 0.01% differences, or anything close to that, is clearly not a fantastic researcher, and we probably shouldn’t give them lots of money.
A separate point that makes me uneasy about your specific example (but not about generally spending more money on some people on the rationale that impact is likely extremely heavy-tailed) is the following. I think even people with comparatively low dark personality traits are susceptible to corruption by power. Therefore, I’d want people to have mental inhibitions against developing tastes that are too extravagant. It’s a fuzzy argument, because one could say the same thing about spending $50 on an Uber Eats order, and for that sort of example my intuition is “Obviously it’s easy to develop this sort of taste, and if it saves people time, they should do it rather than spend willpower on changing their food habits.”

But on a scale from $50 Uber Eats orders to spending $150,000 on a sports car, there’s probably a point somewhere where someone’s conduct becomes too dissimilar to the archetype of “person on a world-saving mission.” I think someone you can trust with a lot of money and power would be wise enough that, if they ever form the thought “I should get a sports car because I’d be more productive if I had one,” they’d sound a mental alarm and start worrying they’d been corrupted. (And maybe they’ll end up buying the sports car anyway, but they certainly won’t be thinking “this is good for impact.”)