I want to be clear that I am endorsing not only the sentiment but the drastic framing. At the end of the day, a few $100k here and there is literally a rounding error on what matters, and I would much rather top researchers were spending this money on weird things that might help them slightly than we had a few more mediocre researchers working on things that don't really matter.
I certainly wouldn't say this about just any researcher; if they could work in constellation/lightcone, they have a 30% chance of hitting my bar. I am much more excited about this for the obvious top people at constellation/lightcone.
(If someone actually talked about things that make them 0.01% more productive, that would suggest they have lost the plot.)
I don't really like this. Presumably, if impact is extremely heavy-tailed, we can get a lot of value from finding these activities, and a general aversion to them because they might waste mere money seems very bad. Things like optics are more of a reason to be careful, but idk, maybe we should just make anonymous forum accounts to discuss these things and then actually take our ideas seriously.
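(As a toy illustration of why the heavy-tailed point dominates here; the Pareto shape and all the numbers below are made up, not an empirical estimate of researcher impact:)

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: researcher "impact" drawn from a classical Pareto
# distribution with shape alpha = 1.5 (an arbitrary "extremely
# heavy-tailed" choice, not an empirical estimate).
alpha = 1.5
impact = rng.pareto(alpha, size=10_000) + 1  # support starts at 1

top_1pct = np.sort(impact)[-100:]
print(f"share of total impact from top 1%: {top_1pct.sum() / impact.sum():.0%}")

# A small productivity boost for the top 1% vs. hiring 100 more
# median-impact researchers.
print(f"5% boost to the top 1%:       {0.05 * top_1pct.sum():,.0f}")
print(f"100 extra median researchers: {100 * np.median(impact):,.0f}")
```

Under that made-up distribution, a 5% boost to the top 1% beats adding 100 median researchers, which is the intuition behind not sweating a few $100k on weird productivity experiments.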
Superforecasters predict more accurately when they make predictions at 1% increments rather than 2% increments. Whether they can usefully predict at finer increments either hasn't been studied or has produced negative evidence. 0.01% increments are far below anything people regularly predict on; there's no way to develop calibration at that granularity. In my comment, I meant to point out that anyone who thinks they're calibrated enough to talk about 0.01% differences, or even anything close to that, is clearly not a fantastic researcher, and we probably shouldn't give them lots of money.
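(To put a rough number on it, here's a toy simulation, not the actual superforecasting studies: even a perfectly calibrated forecaster loses almost nothing by rounding to 1% increments, so there is essentially no signal left at 0.01% granularity to calibrate against.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# A perfectly calibrated forecaster: the true probability p is uniform,
# the outcome is Bernoulli(p), and the forecast is p rounded to a
# fixed increment.
p = rng.uniform(size=n)
y = (rng.uniform(size=n) < p).astype(float)

for inc in (0.02, 0.01, 0.0001):
    forecast = np.round(p / inc) * inc
    brier = np.mean((forecast - y) ** 2)
    print(f"increment {inc:>6}: Brier = {brier:.6f}")
```

The rounding penalty is about w²/12 for increment w, so moving from 1% to 0.01% increments improves the Brier score by under 10⁻⁵, a difference no realistic track record could detect.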
A separate point that makes me uneasy about your specific example (but not about generally spending more money on some people on the rationale that impact is likely extremely heavy-tailed) is the following. I think even people with comparatively low dark personality traits are susceptible to corruption by power. Therefore, I'd want people to have mental inhibitions against developing tastes that are too extravagant. It's a fuzzy argument, because one could say the same thing about spending $50 on an Uber Eats order, and on that sort of example, my intuition is "Obviously it's easy to develop this sort of taste, and if it saves people time, they should do it rather than spend willpower on changing their food habits."

But on a scale from $50 Uber Eats orders to spending $150,000 on a sports car, there's probably a point somewhere where someone's conduct becomes too dissimilar to the archetype of "person on a world-saving mission." I think someone you can trust with a lot of money and power would be wise enough that, if they ever form the thought "I should get a sports car because I'd be more productive if I had one," they'd sound a mental alarm and start worrying they got corrupted. (And maybe they'll end up buying the sports car anyway, but they certainly won't be thinking "this is good for impact.")