Post-FTX, I think core EA adopted a “PR mentality” that (i) has been a failure on its own terms and (ii) is corrosive to EA’s soul.
I find it helpful to distinguish two things, one which I think EA is doing too much of and one which EA is doing too little of:
Suppressing (the discussion of) certain ideas (e.g. concern for animals of uncertain sentience): I agree this seems deeply corrosive. Even if an individual could theoretically hold onto the fact that x matters, and even act to advance the cause of x, while not talking publicly about it, the collective second-order effects mean that not publicly discussing x prevents many other people from forming true beliefs about, or acting in service of, x (often with many further downstream effects on their beliefs and actions regarding y, z, ...).
Attending carefully to the effect of communicating ideas in different ways: how an idea is communicated can make a big difference to how it is understood and received, even if all the expressions of the idea are equally accurate. For example, if you talk about “extinction from AI”, will people even understand this to refer to literal extinction, rather than the metaphorical extinction of job losses? Or, per your recent useful example, if you talk about “AI Safety”, will people understand this to mean “stop all AI development”? I think this kind of focus on clear and compelling communication is typically not corrosive, but is often neglected by EAs (and often undertaken only at the level of intuitive vibes, rather than by testing how people receive communications under different framings).
Thanks—I agree the latter is important, and I think it’s an error if “Attending carefully to the effect of communicating ideas in different ways” (appreciating that most of your audience is not extremely high-decoupling, etc) is rounded off to being overly focused on PR.