Quick thoughts:
I’ve previously been frustrated that “AI forecasting” has focused so heavily on “when will AGI happen?” rather than on other strategic questions, of which I think there are many interesting ones. That said, things have improved in the last 1-2 years; I’ve been impressed by a lot of Epoch’s recent work, for instance.
My guess is that a lot of our community is already convinced of short timelines. But I don’t think we’re the target audience for much of this work.
Interestingly, OP really does not seem to be convinced. Or rather, they have a few employees who are convinced of short timelines, but their broader spending really doesn’t look very AGI-pilled to me (tons of non-AI spending, for instance). I’d be happy for OP to spend more money investigating these questions, if only to inform how much it should spend in this area going forward.
It sounds like you have some specific individuals in mind in passages like “If your intervention is so fragile and contingent that every little update to timeline forecasts matters, it’s probably too finicky to be working on in the first place.” I’m really not sure who you’re referring to here.
I’d agree that the day-to-day of “what AI came out today” gets too much attention, but that doesn’t seem like an “AI timelines” problem to me so much as an over-prioritization of recent news.
On ai-2027.com: I see this as doing dramatically more than answering “when will AGI happen?” It’s trying to be very precise about what a short-timeline world would look like, which raises a lot of relevant strategic questions and discussions.
Yeah, I tried to exempt AI 2027 from my critique. They are doing a lot more, and doing it well.