The previous point notwithstanding, people’s attention spans are extremely short, and the median outcome of a news story is ~nothing. I’ve commented before that FTX’s collapse had little effect on the average person’s perception of EA, and we might expect a similar thing to happen here.
I think this is an oversimplification. The effect is largely caused by competing messages: the modern internet optimizes information for memetic fitness, e.g. by maximizing emotional intensity or persuasive effect, and people have so much routine exposure to content pulling their minds in different directions that they become wary (or come to see strong reactions to anything at all as immature, since outcries on the internet come disproportionately from teenagers). This is the main reason people take things with a grain of salt.
However, Overton windows can still undergo big and lasting shifts (and this process could be engineered deliberately long before generative AI emerged, e.g. via clown attacks, which exploit social-status instincts to consistently hijack a person’s impressions of any targeted concept). The 80,000 Hours podcast with Cass Sunstein covered how Overton windows are dominated by vague impressions of which ideas are acceptable or unacceptable to talk about (note: that podcast is from 2019). This dynamic could plausibly strangle EA’s access to fresh talent, and AI safety’s access to mission-critical policy influence, for several years (which would be far too long).
It can be frustrating to feel that a group you are part of is being judged by the actions of a couple of people you’ve never met and have no strong feelings about.
On the flip side, johnswentworth actually had a pretty good take on this: that the human brain is instinctively predisposed to over-focus on the risk of one’s in-group becoming unpopular with everyone else:
First, [AI safety being condemned by the public] sure does sound like the sort of thing which the human brain presents to us as a far larger, more important fact than it actually is. Ingroup losing status? Few things are more prone to distorted perception than that.
Thanks for the helpful comment – I had not seen John’s dialogue and I think he is making a valid point.
Fair point that the lack of impact might not be due to attention span but instead to things like competing messages.
In case you missed it: Angelina Li compiled some growth metrics about EA here; they seem to indicate that FTX’s collapse did not “strangle” EA (though it probably wasn’t good).
Upvoted, I’m grateful for the sober analysis.