I think that without knowing people’s assessment of extinction risk (e.g. chance of extinction over the next 5, 10, 20, 50, 100 years)[1], the answers here don’t provide a lot of information value.
I think a lot of people on the disagree side would change their mind if they believed (as I do) that there is a >50% chance of extinction in the next 5 years (absent further intervention).
It would be good if there were a short survey to establish such background assumptions behind people’s votes.
[1] And their assessment of the chance that AI successors will be morally valuable, as per footnote 2 of the statement.