To be clear, I think there is absolutely no intention of doing this. EA existed before AI became hot, and many EAs have expressed concerns about the recent, hard pivot towards AI. It seems in part, maybe mostly (?), to be a result of funding priorities. In fact, one feature of EA that hopefully makes it more resistant than many impact-focused communities to donor influence (though far from totally immune!) is the value placed on epistemics: decisions and priorities should be argued clearly and transparently, including why AI should take priority over other cause areas. Glad to have you engage skeptically on this!