Awesome news, thanks! Looking forward to hearing more about the operationalization and logistics.
Wondering if there could be a way to incorporate the fact that doom is conditional on the year TAI is developed (i.e., how well developed AI alignment/strategy/governance is by the time TAI is possible)? Both P(doom | TAI in year 20xx) and P(10% chance of TAI by year 20xx) are important questions.