I have not seen anything remotely convincing that the EA community has any insight into how to improve human decision making in group settings, which is how I believe "improving institutional decision making" should be phrased. Institutions do not make decisions; the people within them do. Getting the phrasing wrong is, in my opinion, part of the problem. The underlying assumption, correct me if I am wrong, is that making better decisions is a matter of having better information. Ironically, this assumption appears to ignore what we know about how humans actually make decisions in groups. For example, people tend strongly to make decisions that align with their existing beliefs and group commitments. Humans also have an evolved capacity for self-deception, which lets us avoid knowing information that conflicts with our beliefs. And then there is office politics, where the source of an idea largely or entirely determines whether it is accepted or rejected.
As for improving global coordination: to achieve what, exactly? If we want to improve global coordination, I believe we need to propose plans that benefit all the stakeholders and power holders, in addition to achieving whatever benefit we are aiming for. I do not believe anyone in EA has an interest in developing such ideas, correct me if I am wrong. I tried to share ideas along these lines a few years ago, and the reaction was fairly hostile.