[Question] What would you do if you had a lot of money/power/influence and you thought that AI timelines were very short?

To make things more specific:
A lot of money = $1B+; a lot of power = CEO of a $10B+ org; a lot of influence = 1M+ followers, or being an advisor to someone with a lot of money or power.
AI timelines = time until an AI-mediated existential catastrophe
Very short = a ≥ 10% chance of it happening within 2 years.

Please don’t use this space to argue that AI x-risk isn’t possible/likely, or that timelines aren’t that short. There are plenty of other places to do that. I want to know what you would do conditional on being in this scenario, not whether you think the scenario is likely.