Thanks a lot for raising this, Geoffrey. A while back I mentioned some personal feelings and possible risks related to the current Western political climate, from one non-Westerner’s perspective. You’ve articulated my intuitions very nicely here and in that article.
From a strategic perspective, it seems to me that the longer AGI takes to develop, the more likely it is that decision-making power will end up shared globally. EAs should consider that they might find themselves in that world, and it might not be a good idea to create and enforce easily-violated, non-negotiable demands on issues that we're not prioritizing (e.g. it would be quite bad if a Western EA ended up repeatedly reprimanding a potential Chinese collaborator simply because the latter speaks in a way that seems blunt from the former's perspective). To be clear, China has some sensitivities of this kind as well (mostly relating to its geopolitical history), and I think feeling less strongly about those issues could be beneficial.