Thanks for writing this—it was useful to read the pushbacks!
As I said below, I want more synthesis of these sorts of arguments. I know that some academic groups are preparing literature reviews of the key arguments for and against AGI risk.
I really think we should be doing that ourselves as a community, so that we can present busy, smart people with more compelling content than a range of arguments spread across many different forum posts.
I don't think that is going to cut it for many people in the policy space.
Agree. But at the same time, we need to do this fast! The typical academic paper review cycle is far too slow for this. We probably need groups like SAGE (and Independent SAGE?) to step in. In fact, I'll try to get hold of them (they are for "emergencies" in general, not just Covid[1]).
Although it looks like they are highly specialised in viral threats, so entirely new teams would need to be formed for AI. Maybe Hinton should chair?