Interesting ideas! A hypothesis I found relevant to this phenomenon, similar to yours:
The problem “maximize impact per resources spent” is not well-defined a priori.
For instance, it depends on the time frame and scale: there could be very cost-effective smallish interventions that can’t scale much, versus very large-scale interventions that require massive coordination, investment, “stubbornness”, etc.
[Of course, you should try to see if such things actually exist in the real world; FWIW, I suspect they do]
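To make the scale-dependence concrete, here is a toy sketch with made-up numbers (both intervention types and their impact-per-dollar figures are purely hypothetical): a small intervention with high marginal impact that caps out, versus a large one with lower marginal impact but no practical cap. Which one “maximizes impact per resources spent” flips depending on the budget you consider.

```python
# Toy illustration with hypothetical numbers: the impact-per-dollar
# ranking of two interventions can flip with the budget considered.

def small_intervention(budget):
    # High marginal impact (5 units per $), but absorbs at most $10k.
    return min(budget, 10_000) * 5.0

def large_intervention(budget):
    # Lower marginal impact (2 units per $), but scales without a cap.
    return budget * 2.0

for budget in (5_000, 100_000):
    s, l = small_intervention(budget), large_intervention(budget)
    best = "small" if s > l else "large"
    print(f"budget=${budget:,}: small={s:,.0f}, large={l:,.0f} -> {best}")
# budget=$5,000: small wins (25,000 vs 10,000)
# budget=$100,000: large wins (50,000 vs 200,000)
```

So “which intervention is more cost-effective?” has no single answer until you fix the scale of resources in question.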
It also depends on the entity you consider: is it you as an individual? The small group of people who are willing to listen and do a project with you? The whole EA community? Humanity? You might, though, be able to build a coherent system that takes these various levels into account.
Another remark, more about execution than general principles, which you also touch on: sharing all the information you have is not always a good idea. Unfortunately, the possible fixes (restricting information access to trusted people/groups) seem to go against the [EA/rationalist/...] culture of truth-seeking, open communication, etc.