Thoughts on the unintended consequences of hyper-optimizing for "impact"? Personally, I think it can lead to a lot of collateral damage that gets rationalized in the name of "impact".
It can be problematic when an EA defines impact in terms of productivity metrics, rather than really thinking about how they positively affect people and the world. So how exactly do we positively define impact in our lives? And how can we prevent collateral damage in pursuit of it?
Keen to hear thoughts.
My impression of the general consensus is that maximization is very costly: if you try to maximize one metric, everything else tends to suffer. "Moderation" isn't exactly a cool and sexy idea, but it is very helpful and practical.
You might be interested in reading "EA is about maximization, and maximization is perilous".