Working in healthcare technology.
MSc in applied mathematics/theoretical ML.
Interested in increasing diversity, transparency and democracy in the EA movement. Would like to know how algorithm developers can help “neartermist” causes.
A positive suggestion, but the post's title is confusing.
Zero effect is not the worst case.
Upvoted because I’m glad you answered the question (and didn’t use EA grant money for this).
Disagreevoted because, as an IMO medalist, I don't think science olympiad medalists are a particularly useful audience, nor do I see any value in disseminating said fanfiction to potential alignment researchers.
Personally, I don’t believe in a “trusted person” as a concept. I think EA has had its fun trying to be a high-trust environment where some large things are kept private, and it backfired horribly.
I’ll take <agree> <disagree> votes to indicate how compelling this would be to readers.
That was the aim of my comment as well, so I do hope more people actually vote on it.
I was initially impressed and considered donating to the fund in the future, but then noticed the ~$300K grant without a public report. I can’t see myself donating to a fund that doesn’t say what it’s doing with almost 30% of its disbursed funds.
I came to this discussion by following a link from the Animal Welfare Fund’s report where it gave out a large grant which isn’t publicly disclosed.
He really did love carrots. I seem to remember him saving me from (what was possibly) hypoglycemia in the middle of a hike by giving me a carrot.
May his memory be a blessing.
Personally, since my grant was probably too small to justify the effort of a clawback and the statute of limitations had passed, I donated all of it, divided between the GiveWell top charities fund and GiveDirectly.
I somehow missed that 🤦🏼‍♂️.
Looks like the number is just for 2024; it doesn’t say what the previous numbers were (e.g. before the FTX scandal, when most attendees could be reimbursed for flights and accommodation).
Full disclosure: I was rejected from an EAG, in 2022 I think (after attending one the year before).
Having previously criticised the lack of transparency in the EAG admissions process, I’m happy to see this post. Strongly upvoted.
With all the scandals we’ve seen in the last few years, I think it should be very evident how important transparency is. See also my explanation from last year.
...some who didn’t want to be named would have not come if they needed to be on a public list, so barring such people seems silly...
How is it silly? It seems perfectly acceptable, and even preferable, for people to be involved in shaping EA only if they agree for their leadership to be scrutinized.
The EA movement absolutely cannot carry on with the “let’s allow people to do whatever without any hindrance, what could possibly go wrong?” approach.
Just a reminder that I think it’s the wrong choice to allow attendees to leave their name off the published list.
I haven’t listened to that many episodes—in fact, of those you listed I’ve only listened to the one with Howie Lempel (which also resonated with me). But I think the episode I found most interesting is the one with Mushtaq Khan about effectively fighting corruption in developing countries.
I think it is irrelevant, and in every context where I’ve seen it presented as ‘on topic’ in EA, the connection between it and any positive impact was simplistic to the point of being imaginary, while at the same time promoting dangerous views—just like in the post you quoted.
As an Ashkenazi Jew myself, I feel that saying “we’d like to make everyone like Ashkenazi Jews” is just a mirror image of Nazism, and it very clearly should not appear on the forum.
I’m an Israeli Jew and was initially very upset about the incident. I don’t remember the details, but I recall that in the end I was much less sure that there was anything left to be upset about. It took time but Tegmark did answer many questions posed about this.
Do you maybe want to voice your opinion of the methodology in a top level comment? I’m not qualified to judge myself and I think it’d be informative.
I downvoted and disagreevoted, though I waited until you replied to reassess.
I did so because I see absolutely no gain from doing this, I think the opportunity cost makes it net negative, and I oppose the hype around prediction markets—it seems to me like the movement is obsessed with them, but in practice they haven’t led to any real impact.
Edit: regarding ‘noticing we are surprised’ - one would think this result is surprising; otherwise, wouldn’t there already have been voices against the high amount of funding for EA conferences?
I think it was “will replace” when I wrote the comment, but now it’s “must replace”? If that’s the case, it’s better now.