I was imagining someone who thinks that, say, there’s a 90% risk of unaligned AI takeover, and a 50% loss of EV of the future from other non-alignment issues that we can influence. So EV of the future is 5%.
I’m not understanding—if there’s no value in the 90%, and then 50% value in the remaining 10%, wouldn’t the EV of the future be 5%?
Argh, thanks for catching that! Edited now.
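The corrected arithmetic from the exchange above can be sketched as follows (variable names are mine, chosen for illustration):

```python
# Expected value of the future under two independent discounts:
# a 90% chance of unaligned AI takeover (assumed to leave zero value),
# and a 50% loss of value from other influenceable issues in the
# remaining 10% of outcomes.
p_takeover = 0.90            # probability of takeover (no value in these worlds)
value_if_no_takeover = 0.50  # fraction of value retained absent takeover

ev = (1 - p_takeover) * value_if_no_takeover  # 0.10 * 0.50 = 0.05, i.e. 5%
```

So the value lost to takeover is excluded first, and the 50% discount applies only to the surviving 10% of outcomes, giving 5% rather than 25%.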