Like a lot of this post, this is a bit of an intuition-based ‘hot take’. But some quick things that come to mind: i) iirc our initial intuitions didn’t seem very different from the weighted factor model (WFM) results, ii) when we filled in the weighted factor model I think we had a pretty limited understanding of what each project involved (so you might not expect super useful results), iii) I’ve become more convinced that it just matters a lot that central-AI-x-risk people have a lot of context (and that this more than offsets the risk of bias and groupthink), so understanding their view is very helpful, iv) having a deep understanding of the project and the space just seems very important for figuring out what, if anything, should be done and what kinds of profiles might be best for the potential founders.