I hadn’t expected EAs to have any unusual skill at spotting risks.
EAs have, however, been unusual in distinguishing risks by their magnitude. The risks from FTX didn’t look much like the risk of human extinction.
But half of our resources for combating human extinction were exposed to the risks facing FTX. Why didn’t we take that more seriously?
The community’s reputation was also at risk, to a very significant degree. This was arguably the biggest mistake EA has made thus far (or at least the biggest one that has become obvious; we could be making other mistakes that aren’t yet apparent).