I’m afraid I don’t remember the specific name or formula (and a cursory Google search hasn’t been able to jog my memory), but there is also a concept within operations management of not optimizing a system too much, because doing so decreases effectiveness. If my memory serves, the rough idea is that a highly optimized system is rigid and fragile: it lacks the slack/flexibility to deal with unexpected but inevitable shocks.
I posted this to LessWrong as well, and one of the commenters there mentions the “performance / robustness stability tradeoff in controls theory”. Is that the same as what you’re thinking of?
Reminds me of the result in queueing theory where, in the simplest queue model (M/M/1), pushing utilisation of your capacity above roughly 80% leads to rapidly escalating waiting times.
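For concreteness, here’s a minimal sketch of that result using the standard M/M/1 formula for mean time spent waiting in queue, W_q = ρ / (μ(1 − ρ)), where ρ is utilisation and μ is the service rate. The choice of μ = 1 (one job per time unit) is arbitrary, just for illustration:

```python
# M/M/1 queue: mean wait in queue W_q = rho / (mu * (1 - rho)),
# which blows up as utilisation rho approaches 1.

mu = 1.0  # service rate: jobs completed per unit time (arbitrary choice)

for rho in (0.5, 0.8, 0.9, 0.95, 0.99):
    wq = rho / (mu * (1 - rho))  # mean time a job spends waiting in queue
    print(f"utilisation {rho:.0%}: mean queue wait = {wq:5.1f} time units")
```

This prints waits of 1.0 at 50% utilisation, 4.0 at 80%, 9.0 at 90%, 19.0 at 95%, and 99.0 at 99%, so the last ~20 percentage points of utilisation cost far more in waiting time than everything before them.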