Working in healthcare technology.
MSc in applied mathematics/theoretical ML.
Interested in increasing diversity, transparency and democracy in the EA movement. Would like to know how algorithm developers can help “neartermist” causes.
Do you maybe want to voice your opinion of the methodology in a top-level comment? I'm not qualified to judge it myself, and I think it'd be informative.
I downvoted and disagree-voted, though I waited until you replied before reassessing.
I did so because I see absolutely no gain from doing this, I think the opportunity cost makes it net negative, and I oppose the hype around prediction markets: the movement seems obsessed with them, but in practice they haven't led to any good impact.
Edit: regarding 'noticing we are surprised': one would think this result is surprising; otherwise, wouldn't there already be voices against the high amount of funding for EA conferences?
I admire the boldness of publishing a serious evaluation which shows a common EA intervention to have no significant effect (with all the caveats, of course).
What do you think can be gained from that?
Looking for people (probably from US/UK) to do donation swaps with. My local EA group currently allows tax-deductible donations to:
GiveWell—Top Charities Fund
Animal Charity Evaluators—Top Charities Fund
Against Malaria Foundation
Good Food Institute
<One other org that I don’t want to include here>
However, I would like to donate to the following:
GiveWell—All Grants Fund (~$1230)
GiveDirectly (~$820)
The Humane League (~$580)
If anyone is willing to donate these sums and have me donate an equal sum to one of the funds mentioned above, please contact me.
I think you're mostly right, especially about LLMs and the current hype (though I do think a couple of innovations beyond current technology could get us to AGI). But I want to point out that AI progress has not been entirely fruitless. The most salient example in my mind is AlphaFold, which is actually used for research, drug discovery, etc.
Thanks for correcting me. I do believe they’re much less involved in these things nowadays, but I might be wrong.
I indeed haven't seen any expression of racism from either, but I chose carefully to write "racist/eugenicist" before for exactly this kind of reason. I personally believe that even discussing such interventions in the way they have been discussed in EA carries risks (of promoting racist policies by individuals, organizations, or governments) that far outweigh any benefits. Such a discussion might be possible privately between people who all know each other very well and can trust each other's good intentions, but otherwise it is too dangerous.
I appreciate you sharing your experience. It’s different from mine and so it can be that I’m judging too many people too harshly based on this difference.
That said, I suspect that it’s not enough to have this aversion. The racism I often see requires a degree of indifference to the consequences of one’s actions and discourse, or maybe a strong naivety that makes one unaware of those consequences.
I know I can't generalize from one person, but if you see yourself as an example of the different mindset that might lead to the behaviour I observed, notice that you yourself seem very aware of the consequences of your actions, and everything I've seen you express has been the opposite of what I'm condemning.
Edit: for those downvoting, I would appreciate feedback on this comment, either here or in a PM.
Maybe not consciously. Does that make it any better?
I don’t think the movement can be ascribed a stance on this. What I said, rather, is:
many EAs are racists/eugenicists and want to have such opinions around them
And I stand behind this. They just aren’t the people responsible for the interventions you mentioned.
some people here might think that EA should be grappling with racism outside of this incident, in which case opportunities like this are helpful for creating discourse
I think sort of the opposite. Even though I commented elsewhere that I think there's a strong racist/eugenicist element in EA, I think Manifest has little to do with EA and could probably be ignored here if it weren't for the Guardian article.
But the problem is that once it came to be discussed here, the discussion itself proved much more damning to EA than that not-really-EA event was in the first place. This isn’t the first time that has happened. I guess it’s better to know than not to know, but it’s really weird to need this outside trigger for it.
I disagree with much of this, but I edited my very-downvoted comment to make clear that it wasn’t about the Manifest team, whom I know basically nothing about.
I maintain that it’s neither, but I’m particularly curious to hear why you think it’s implausible.
One of the reasons I've been distancing myself from EA in the last year or so is that the much-celebrated 'openness to all new ideas' feels like a fake cover for the fact that many EAs are racists/eugenicists and want to have such opinions around them. In other words, they don't really champion free speech, but rather just pretend to in order to keep the eugenics talk going.
Edit: while there are some forum users who I'm sure are in the group I described, I don't know any of the Manifest team and am not ascribing any particular beliefs or intentions to them.
Most Israeli Jews would call the phrase "From the river to the sea" antisemitic. Being relatively far on the left within that group myself, and having spoken a lot with Palestinians online before the war, I'd argue that it is antisemitic, or a call for ethnic cleansing of Jews, around 50% of the time. I would not prosecute or boycott someone based on it alone.
Edit: but most Israelis might choose not to come to a conference that would platform such a person, I guess. I think this is a different situation from the current real controversy, but make of it what you will.
What helps us know what politics we should enact?
Even in this question you put the political action as an end goal and the truth-seeking as only an instrumental one. This means truth-seeking is (and, in my view, really should be) secondary, and should sometimes give way to other priorities.
I think most of these issues stem from the fact that I am not a very good writer or a communicator
FWIW I found your writing in this post better and more honest and to-the-point than most of what’s on the forum.
I didn't know the site and have just looked at it for the first time. I like it, and I think it offers news that is far more relevant than usual.
One comment I have is that the 'general news' section covers the 'popular' wars in Gaza and Ukraine while neglecting other conflicts that are currently taking place and threatening harm to many people, e.g. in Sudan. I would have liked to hear about more of them.
I’m an Israeli Jew and was initially very upset about the incident. I don’t remember the details, but I recall that in the end I was much less sure that there was anything left to be upset about. It took time but Tegmark did answer many questions posed about this.