I am writing here because the EA community should know now that sentiment in global health and poverty, and in animal welfare, is extremely low, especially among the limited talent in those areas.
As EAs know, the FTX money favored longtermist causes. In the aftermath of the FTX collapse, EA as a whole has been harmed, further disadvantaging the causes already in the shadow of this money.
The departure of this talent could be a wholesale disaster for EA and leave it permanently weakened. This is not being discussed, just as other dangers, such as the risk posed by FTX, were not, due to the dynamics of EA discourse, which is easily dominated by full-time influencers like Yudkowsky.
In this vulnerable state, undue attempts to associate Peter Singer with “EA”, and undue attempts to dissociate “LW” and “rationality” from it, are an incredibly uncooperative defection.
Another example of this uncooperative behavior is the LW treatment of Gopalakrishnan’s post, which, while it received a mixed reception, claims to point out serious misconduct.
On balance, the negative claims about this supposed behavior are highly associated with SF and the EA/LW communities there.
The author implicates LW as well as EA; for example, she writes:
LessWrong style jedi mindtricks while they stand to benefit from the erosion of your boundaries.
My experience resonates with a few other women in SF I have spoken to. They have also met red pilled, exploitative men in EA/rationalist circles. EA/rationalism and redpill fit like yin and yang. Akin to how EA is an optimization of altruism with “suboptimal” human tendencies like morality and empathy stripped from it, red pill is an optimized sexual strategy with the humanity of women stripped from it.
This is the response by a member of the LW team, which reads more like an attempt to dissociate this conduct from LW and place it squarely with EA, rather than stating that the post seems tendentious or unproductive.
The above comment is not intellectually honest.