Omega does acknowledge the value of public communications efforts:
Moreover, in recent years Connor has been a vocal public advocate for safety: although we disagree in some cases with the framing of the resulting media articles, in general we are excited to see greater public awareness of safety risks.[5]
I think they don’t emphasize this more in this piece because of their concerns about Connor/Conjecture’s particular style of communications:
We think there is a reasonable risk that Connor and Conjecture’s outreach to policymakers and media is alarmist and may decrease the credibility of x-risk.
...
^In particular, Connor has referred to AGI as god-like multiple times in interviews (CNN, Sifted). We are skeptical that this framing is helpful.
What if alarmism, i.e. raising the alarm, is actually the appropriate response? The current situation is highly alarming, in my opinion.

I can see where you’re coming from. However, I think it’s worth noting that “raise the alarm” isn’t straightforwardly the appropriate response to “the situation is alarming, according to my inside view,” for unilateralist’s-curse-type reasons. (I imagine this has been discussed in more depth elsewhere.)

What is the appropriate response? (This is not a rhetorical question; I genuinely want to know.) There may be some risk of alarmism being negative, but I don’t think there is much risk of it being net negative, given that on the default no-action path we all just get killed in a few years. It is also ironic that the EA community talks a lot about the unilateralist’s curse, yet is arguably responsible for the worst cases of it (supporting DeepMind, OpenAI, and Anthropic, and thus kicking off and accelerating the race to uncontrollable AGI).