[Question] Does the idea of an AGI that benevolently controls us appeal to EA folks?

Could we get the benefits of AGI by letting it control us benevolently: keeping us from making mistakes while guiding us toward life on exoplanets, a massive population increase, a techno-utopia? Would that count as a worthwhile solution to the alignment problem?