Working in healthcare technology.
MSc in applied mathematics/theoretical ML.
Interested in increasing diversity, transparency and democracy in the EA movement. Would like to know how algorithm developers can help “neartermist” causes.
Maybe not consciously. Does that make it any better?
I don’t think the movement can be ascribed a stance on this. What I said, rather, is:
many EAs are racists/eugenicists and want to have such opinions around them
And I stand behind this. They just aren’t the people responsible for the interventions you mentioned.
some people here might think that EA should be grappling with racism outside of this incident, in which case opportunities like this are helpful for creating discourse
I think sort of the opposite. Even though I commented elsewhere that I think there’s a strong racist/eugenicist element in EA, I think Manifest has little to do with EA and could probably be ignored here if it weren’t for the guardian article.
But the problem is that once it came to be discussed here, the discussion itself proved much more damning to EA than that not-really-EA event was in the first place. This isn’t the first time that has happened. I guess it’s better to know than not to know, but it’s really weird to need this outside trigger for it.
I disagree with much of this, but I edited my very-downvoted comment to make clear that it wasn’t about the Manifest team, whom I know basically nothing about.
I maintain that it’s neither, but I’m particularly curious to hear why you think it’s implausible.
One of the reasons I’ve been distancing myself from EA in the last year or so is that it feels like the much-celebrated ‘openness to all new ideas’ is a fake cover for the fact that many EAs are racists/eugenicists and want to have such opinions around them. In other words, they don’t really champion free speech, but rather just pretend to in order to keep the eugenics talk going.
Edit: while there are some forum users that I’m sure are in the group I described, I don’t know any of the Manifest team and am not ascribing them in particular any beliefs or intentions.
Most Israeli Jews would call the phrase “From the river to the sea” antisemitic. As someone relatively far on the left within that group, who spoke a lot with Palestinians online before the war, I’d argue that it’s antisemitic/calls for ethnic cleansing of Jews around 50% of the time. I would not prosecute or boycott someone based on it alone.
Edit: but most Israelis might choose not to come to a conference that would platform such a person, I guess. I think this is a different situation from the current real controversy, but make of it what you will.
What helps us know what politics we should enact?
Even in this question you put the political action as an end goal and the truth-seeking as only an instrumental one. This means truth-seeking is (and, in my view, really should be) secondary, and should sometimes give way to other priorities.
I think most of these issues stem from the fact that I am not a very good writer or a communicator
FWIW I found your writing in this post better and more honest and to-the-point than most of what’s on the forum.
I didn’t know the site and have looked at it just now for the first time. I like it and think it offers news that is far more relevant than usual.
One comment I have is that the ‘general news’ section covers the ‘popular’ wars in Gaza and Ukraine while neglecting other conflicts which are currently taking place and threatening harm to many people, e.g. in Sudan. I would’ve liked to hear about more of them.
This is extremely encouraging to hear. Thanks for the report!
Treating each new person as a separate investment and trying to optimize for their marginal utility for EA, instead of looking at the aggregate effect on the movement of all the community building efforts.
Specifically, in your comment you justify diversifying investment in groups by saying that “high quality group members” are the goal but that top universities have bottlenecks which can’t easily be solved by pouring more money into them, instead of arguing that it’s better to have a new group in Chile than a new group at Harvard, even if the people there were hypothetically less qualified for existing EA jobs.
I’m on the one hand happy to hear that the groups team isn’t as elite-focused as I had thought; on the other hand, I’m still troubled by the margin-based reasoning.
Thanks, I’ve never used shortform, but I’ll try tomorrow.
top universities are the places with the highest concentrations of people who ultimately have a very large influence on the world
I think this as a piece of reasoning represents a major problem in the perceptions of EA. While it might be factually true, there are two problems with relying on it:
It means surrendering ourselves to this existing state as opposed to trying to change it and create a more equal world.
It means the goal of EA community building is regarded as a funnel trying to get individuals into existing positions determined by the system already in place. There is an alternative: instead of building a pool of individuals, each of whom is separately regarded as a marginal talent contribution, we could build a diverse community that could think more robustly about how to change the world for the better, and not be mostly confined to rich, white, technological, western perspectives. IMO this alternative is much more important than the funnel.
I feel like I am one of the most engaged EAs in my local community, but the beliefs Torres ascribes to EA are so far removed from my own
This might have to do with “how local” your local community is. It seems to me that the weirder sides of EA (which I usually consider bad, but others here might not) are common in the EA hubs (Bay Area, Oxbridge, London, and the cluster of large groups in Europe) but not as common in other places (like here in Israel).
You’re describing a religious belief that, for some unknown reason, many EAs seem to share. A belief in a mystical state of being never scientifically documented. And you ask why there’s no activity around this, in a community supposedly organized around following evidence to find good ways to improve the world. And that’s your answer: a shared belief is not evidence. Same as a shared belief in God, even by billions of people, is not evidence.
It’s good that nobody’s talking about this. It would be no more sane than e.g. trying to make everyone religious because then God would eliminate suffering.
I definitely agree. But I think we’re far from it being practically useful for dedicated EAs to do this themselves.
I appreciate you sharing your experience. It’s different from mine and so it can be that I’m judging too many people too harshly based on this difference.
That said, I suspect that it’s not enough to have this aversion. The racism I often see requires a degree of indifference to the consequences of one’s actions and discourse, or maybe a strong naivety that makes one unaware of those consequences.
I know I can’t generalize from one person, but if you see yourself as an example of a different mindset that might lead to the behaviour I observed: notice that you yourself seem to be very aware of the consequences of your actions, and every bit of expression from you I’ve seen has been the opposite of what I’m condemning.
Edit: for those downvoting, I would appreciate feedback on this comment, either here or in a PM.