Yes, arguably this prize doesn’t require any original research (or STEM breakthroughs), so could be won just by convincing argumentation based on existing knowledge. Prizes relating to (non-fiction) writing seem like a more relevant reference class than scientific prizes. And this prize seems correspondingly a lot more accessible (and lower effort to enter) on the face of it.
On the face of it, sure, but I don’t think this should fool people into thinking it is easy. The authors of the prize have been heavily informed by reports and conversations from smart people who have spent years thinking about AI risk. IMO you need at least a month or two to understand the jargon, norms, and core ideas of the subculture well enough to write something that will be understood. After this, you need to come up with original ideas, concepts, or frames, which may of course build upon existing ones. And on top of that, a lot of alignment research does not look like, or follow all the norms of, science as it is usually practiced. This makes the task both easier and harder in different ways.
I agree it’s easier than winning a Nobel or Breakthrough Prize, though.