Working in healthcare technology.
MSc in applied mathematics/theoretical ML.
Interested in increasing diversity, transparency and democracy in the EA movement. Would like to know how algorithm developers can help “neartermist” causes.
I think you’re mostly right, especially about LLMs and the current hype (though I do think a couple of innovations beyond current technology could get us AGI), but I want to point out that AI progress has not been entirely fruitless. The most salient example in my mind is AlphaFold, which is actually used for research, drug discovery, etc.
Thanks for correcting me. I do believe they’re much less involved in these things nowadays, but I might be wrong.
I indeed haven’t seen any expression of racism from either, but I chose carefully to write “racist/eugenicist” before for exactly this kind of reason. I personally believe that even discussing such interventions in the way they have been discussed in EA carries risks (of promoting racist policies by individuals, organizations, or governments) that far outweigh any benefits. Such a discussion might be possible privately among people who all know each other very well and can trust each other’s good intentions, but otherwise it is too dangerous.
I appreciate you sharing your experience. It’s different from mine and so it can be that I’m judging too many people too harshly based on this difference.
That said, I suspect that it’s not enough to have this aversion. The racism I often see requires a degree of indifference to the consequences of one’s actions and discourse, or maybe a strong naivety that makes one unaware of those consequences.
I know I can’t generalize from one person, but if you see yourself as an example of the different mindset that might lead to the behaviour I observed: notice that you yourself seem to be very aware of the consequences of your actions, and everything I’ve seen you express has been the opposite of what I’m condemning.
Edit: for those downvoting, I would appreciate feedback on this comment, either here or in a PM.
Maybe not consciously. Does that make it any better?
I don’t think the movement can be ascribed a stance on this. What I said, rather, is:
many EAs are racists/eugenicists and want to have such opinions around them
And I stand behind this. They just aren’t the people responsible for the interventions you mentioned.
some people here might think that EA should be grappling with racism outside of this incident, in which case opportunities like this are helpful for creating discourse
I think sort of the opposite. Even though I commented elsewhere that I think there’s a strong racist/eugenicist element in EA, I think Manifest has little to do with EA and could probably be ignored here if it weren’t for the Guardian article.
But the problem is that once it came to be discussed here, the discussion itself proved much more damning to EA than that not-really-EA event was in the first place. This isn’t the first time that has happened. I guess it’s better to know than not to know, but it’s really weird to need this outside trigger for it.
I disagree with much of this, but I edited my very-downvoted comment to make clear that it wasn’t about the Manifest team, whom I know basically nothing about.
I maintain that it’s neither, but I’m particularly curious to hear why you think it’s implausible.
One of the reasons I’ve been distancing myself from EA in the last year or so is that it feels like the much-celebrated ‘openness to all new ideas’ is a fake cover for the fact that many EAs are racists/eugenicists and want to have such opinions around them. In other words, they don’t really champion free speech, but rather just pretend to in order to keep the eugenics talk going.
Edit: while there are some forum users who I’m sure are in the group I described, I don’t know any of the Manifest team and am not ascribing any particular beliefs or intentions to them.
Most Israeli Jews would call the phrase “From the river to the sea” antisemitic. Being relatively far left within that group myself, and having spoken a lot with Palestinians online before the war, I’d argue that it’s antisemitic/calls for the ethnic cleansing of Jews around 50% of the time. I would not prosecute or boycott someone based on it alone.
Edit: but most Israelis might choose not to come to a conference that would platform such a person, I guess. I think this is a different situation from the current real controversy, but make of it what you will.
What helps us know what politics we should enact?
Even in this question you put the political action as an end goal and the truth-seeking as only an instrumental one. This means truth-seeking is (and, in my view, really should be) secondary, and should sometimes give way to other priorities.
I think most of these issues stem from the fact that I am not a very good writer or a communicator
FWIW I found your writing in this post better and more honest and to-the-point than most of what’s on the forum.
I didn’t know the site and have just looked at it for the first time. I like it, and I think it offers news that is far more relevant than usual.
One comment I have is that the ‘general news’ section covers the ‘popular’ wars in Gaza and Ukraine while neglecting other ongoing conflicts that threaten harm to many people, e.g. in Sudan. I would have liked to hear about more of them.
This is extremely encouraging to hear. Thanks for the report!
Treating each new person as a separate investment and trying to optimize for their marginal utility for EA, instead of looking at the aggregate effect on the movement of all the community building efforts.
Specifically, in your comment: justifying diversifying investment in groups by saying “high quality group members” are the goal but that top universities have bottlenecks which can’t be easily solved by just pouring more money into them, instead of arguing that it’s better to have a new group in Chile than a new group at Harvard, even if hypothetically the people there were less qualified for existing EA jobs.
I’m on the one hand happy to hear that the groups team isn’t as elite-focused as I had thought; on the other hand, I’m still troubled by the margin-based reasoning.
Thanks, I’ve never used shortform, but I’ll try tomorrow.
top universities are the places with the highest concentrations of people who ultimately have a very large influence on the world
I think this as a piece of reasoning represents a major problem in the perceptions of EA. While it might be factually true, there are two problems with relying on it:
It means surrendering ourselves to this existing state as opposed to trying to change it and create a more equal world.
It means the goal of EA community building is regarded as a funnel trying to get individuals into existing positions determined by the system already in place. There is an alternative: building not a pool of individuals, each of which is separately regarded as a marginal talent contribution—but rather a diverse community that could think more robustly about how to change the world for better, and not be mostly confined to rich, white, technological, western perspectives. IMO this alternative is much more important than the funnel.
Looking for people (probably from US/UK) to do donation swaps with. My local EA group currently allows tax-deductible donations to:
GiveWell—Top Charities Fund
Animal Charity Evaluators—Top Charities Fund
Against Malaria Foundation
Good Food Institute
<One other org that I don’t want to include here>
However, I would like to donate to the following:
GiveWell—All Grants Fund (~$1230)
GiveDirectly (~$820)
The Humane League (~$580)
If anyone is willing to donate these sums and have me donate an equal sum to one of the funds mentioned above—please contact me.
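For anyone unfamiliar with how a donation swap balances out, here is a minimal illustrative sketch (the amounts are the approximate figures listed above; the charity names are just dictionary keys, not any real API):

```python
# Donation swap sketch: I donate my partner's intended total to a charity
# that is tax-deductible for me, and they donate my intended total to my
# preferred (non-deductible-for-me) charities. Both sides give equal sums.

my_intended = {
    "GiveWell All Grants Fund": 1230,  # approximate USD amounts
    "GiveDirectly": 820,
    "The Humane League": 580,
}

# Total my swap partner would donate on my behalf; I donate the same
# total to one of their tax-deductible options in return.
swap_total = sum(my_intended.values())
print(swap_total)  # 2630
```

The point of the swap is that each party's out-of-pocket total is identical, so only the tax deductibility changes hands.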