Of course it is socially acceptable to disagree with “traumatized” people in EA.
Well, you previously wrote:
I think writing such quoted passages in passive response to a disclosure of sexual assault is cruel and disgusting in my personal opinion
Insofar as others share this opinion of yours, it won’t be socially acceptable to express those particular disagreements.
Assuming we are talking about the Any Community That Tolerates Trauma Junkies Is Unsafe For Everyone Else post, it’s not something I would have predicted in advance would be considered “cruel and disgusting”. So it remains the case that I personally have some uncertainty about which opinions will be considered “cruel and disgusting”, to the point where it seems socially safer to just avoid expressing much disagreement at all.
Whose job is it to identify EA questions which could benefit from better forecasts?
Consider two different hypotheses:
Forecasting is only helpful for AI
Forecasting is helpful outside of AI, but AI has captured much more forecasting interest than other cause areas
How much time are non-AI org leaders spending trying to think up decision-relevant forecasts related to their cause areas?
If leaders are not spending any time trying to think up such forecasts, maybe there is low-hanging fruit here. Maybe EA has latent forecasting capability which can be tapped to improve organizational decision-making. Or maybe such forecasting capability will free up in a few years if AI turns out to be a nothingburger.
If leaders have spent a lot of time trying to think up useful forecasts, and failed, maybe forecasting really is fairly useless outside of AI.
If I were leading a non-AI EA organization, and I had a forecast I really wanted to see the result of, who would I even talk to? Which forecasting organizations are actively soliciting ideas for EA-related forecast questions?
It seems to me that a lot of what EA does is implicit forecasting in some sense: giving someone a grant, for example, is an implicit forecast about the probability that they will be able to accomplish something with it. EA is often critiqued for neglecting “systemic change”, and if you want to pursue systemic change, being able to forecast the effects of various systemic changes is really useful. Any action you take carries an implicit forecast that it will lead to a good outcome and not backfire somehow. Wouldn’t it be better to make these forecasts explicit? And all else equal, wouldn’t it be good to get some perspective from people outside the organization, who are perhaps forecasting in their free time as a replacement for watching TV or other downtime activities?
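To make the “explicit forecast” idea concrete, here is a minimal sketch of what it could look like in practice, with entirely made-up grantee names, claims, and probabilities (this is an illustration of the general technique, not a description of what any EA org actually does): record a probability when the grant is made, then score the stated probabilities against outcomes once the claims resolve.

```python
from dataclasses import dataclass

@dataclass
class GrantForecast:
    grantee: str            # hypothetical example name
    claim: str              # what the grant is supposed to accomplish
    p_success: float        # explicit probability of success, in [0, 1]
    resolved: bool = False  # has the claim resolved yet?
    outcome: bool = False   # filled in once the claim resolves

def brier_score(forecasts):
    """Mean squared error of stated probabilities vs. outcomes (lower is better)."""
    resolved = [f for f in forecasts if f.resolved]
    if not resolved:
        return None
    return sum((f.p_success - float(f.outcome)) ** 2 for f in resolved) / len(resolved)

# Example usage with made-up numbers:
forecasts = [
    GrantForecast("Org A", "publishes promised report within 12 months", 0.7, True, True),
    GrantForecast("Org B", "field trial reaches 1,000 participants", 0.4, True, False),
]
print(brier_score(forecasts))  # (0.3**2 + 0.4**2) / 2 = 0.125
```

Even something this simple would let an organization (or outside forecasters reviewing the same questions) compare their stated probabilities against reality over time, rather than leaving the forecasts implicit in the funding decisions themselves.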