Wow.
Don’t have time to write a full article, but there is a lot, a lot to unpack.
First, it’s worth pausing to unpack the object-level claims in the parent comment; the reasoning is pretty wild, even at the surface level:
So absolutely, yes, someone working on AI safety in an EA-funded AI safety job, or at OpenAI/Anthropic/DeepMind, can donate more than someone working a mere neartermist job at one of the neartermist orgs... ummm, I think there is more than one issue in that last sentence if you stop to think about it.
I would probably challenge the main claim: that the associated community provides a dominant fraction of the working-income funding for bednets.
Also, money shouldn’t be a criterion for deciding whether to discuss or waive serious issues. Donating any amount of money does not mean a community gets a pass on bigotry or prejudice, or gets shielded from investigation of, or concern about, serious crime or misconduct.
Like, so much to unpack. Some of the points above, especially the third point, really make a person wonder what people really believe about SBF and FTX, versus what they say.
Getting to the big stuff:
Even before November, i.e. before the FTX fraud, the Bostrom/Tegmark “Peter Pan” management, the concentrated Bay Area sexual abuse, and the “castles”, I don’t think that even the longtermists of, say, 2012, thinking about EA, would have found the late-2022 state of longtermism ideal.
The effect of longtermism (or rather the current instantiation/culture of “longtermists”) on the rest of “neartermist” EA is not publicly known, even in EA circles.
The truth is, uh, much different from what the parent comment suggests.
The main complaint from neartermists is not something like “bednet funding should be 2x or 10x”. It’s not even clear that most global health people want more bednets.
They want something else, or even a host of other things, but they might not be fully aware of it. It would be the most brutal and sad thing to crisply elaborate on this, and IMO dangerous, for the reasons below.
If they knew the truth, or were given a few anecdotes, the neartermists would probably riot.
I can’t give specific statements and I don’t have time to explain, but like, trust me, we do not want this poked at, and I strongly recommend not writing more about this or making comments like the parent one.