Focusing just on the quoted text, I’m not sure “happy medium” is the right message to take from these two incidents. AI and blockchain involve two entirely different ways of thinking about risk control.
AI risk involves frequent events with poorly defined causes, whereas a digital-currency collapse is a rare event with overdetermined causes. The first calls for lots of communication in order to establish a causal sequence; the second requires carefully controlled communication in order to prevent false narratives from taking hold.
Let me try to steelman this fear (which I mostly disagree with):
Social media was originally thought to be a radical force for democratic change—see the Arab Spring, for instance.
The objective of disinformation was never to change minds, but to reduce trust in anonymous online interactions. See Russia’s human-based propaganda methods.
Thus, disinformation blunts the value proposition of social media platforms in allowing individuals to coordinate political action.
So what we’re really talking about is an opportunity cost: disinformation prevents social media from achieving its full potential—which may have been oversold in the first place.
My own view is that very few actors will attempt to target “political trust” as an abstract force. Instead, we should be far more concerned about financially motivated scams targeting individuals.