Ok, that’s more of a semantic issue with the definition of AGI then. FTX Future Fund care about AI that poses an existential threat, not about whether such AI counts as AGI, or strong AI, or true AI, or whatever. Perhaps Transformative AI, or TAI (as per OpenPhil’s definition), would be the better term to use in this case.
I’m not sure what Future Fund care about, but they do go to some lengths defining what they mean by AGI, and they do care about when this AGI will be achieved. That is what I am responding to.