Experienced quant trader, based in London. Formerly a volunteer at Rethink Priorities, where I did some forecasting research. Interested in most things, donations have been primarily to longtermism, animal welfare and meta causes.
Charles Dillon
I think this is an irresponsible ad hominem to be posting without any substance or link to substance whatsoever. There are many EAs who know a lot about crypto and read the forum—if there are substantial criticisms to be made I think you can expect them to make them without this vague insinuation.
“The EA community has little awareness of their privilege.”
This strikes me as straightforwardly untrue, unless you are holding the community to a standard which nobody anywhere meets. The EA community exists largely because individuals recognised their outsized (i.e. privileged) position to do good in the world, given their relative access to resources compared to e.g. those in poverty and non-human animals, and strove to use that privilege for good.
That EA doesn’t, e.g., make it as easy for you to attend EA conferences as it is for a Western citizen is not because EA doesn’t know that some people have difficulty travelling. It is because doing that costs resources that have been allocated to something else. It might be a mistake not to use those resources to help you travel to a conference vs whatever the opportunity cost of that decision would be, but that is a very different question, I think.
Just a note on Jane Street in particular—nobody at Jane Street is making a potentially multi year bet on interest rates with Jane Street money. That’s simply not in the category of things that Jane Street trades. If someone at Jane Street wanted to make betting on this a significant part of what they do, they’d have to leave and go elsewhere and find someone to give them at least hundreds of millions of dollars to make the bet.
What was this distinct reason? If this was mentioned in the post, I didn’t see it.
If it wasn’t mentioned in the post, it feels disingenuous of you to not mention it and give the impression that you were left in the dark and had to come up with your own list of hypotheses. It’s quite difficult for a third party to come to any conclusions without this piece of information.
Holden Karnofsky and Elie Hassenfeld founded GiveWell as a charity club at Bridgewater while they were working there, so Dalio definitely knows about EA.
I think the layout of this post is quite reader unfriendly.
I strongly suggest you start with a full summary rather than just an intro, and don’t bury your conclusions midway between the post and some very long appendices which are unlikely to be very useful to 90% of readers.
As it is, anyone wishing to respond in depth would basically have to do the work of summarizing the post themselves, which increases the friction on feedback.
I don’t have many strong opinions on this topic, but one I do have and think should be standard practice is recusing oneself from decisions involving current or former romantic partners.
That means not being involved in hiring processes and grantmaking decisions involving them, and not giving them references without noting the conflict of interest. This is very standard in professional organisations for good reason.
Well done for doing this! I think attempted replications or re-examinations of existing work are under-done in EA and wish more were conducted.
Minor point, but I disagree, at least a little, with the unqualified claim of being well calibrated here except for the 90% bucket.
Weak evidence that you are overconfident in each of the 0-10, 10-20, 70-80, 80-90 and 90%+ buckets is decent evidence of an overconfidence bias overall, even if those errors are mostly individually within the margin of error.
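The aggregation point above can be sketched as follows. The bucket figures here are hypothetical, chosen only to illustrate the mechanism: each individual bucket's error may sit within its margin of error, but when every error points "toward 50%", that consistent direction is itself evidence of overconfidence.

```python
# Hypothetical bucket data (illustrative only, not the post's actual numbers):
# each tuple is (forecast bucket midpoint, observed frequency of resolution).
buckets = [
    (0.05, 0.08),  # 0-10% bucket: events happened a bit more often than forecast
    (0.15, 0.19),  # 10-20% bucket
    (0.75, 0.71),  # 70-80% bucket: events happened a bit less often than forecast
    (0.85, 0.80),  # 80-90% bucket
    (0.95, 0.91),  # 90%+ bucket
]

# An error "toward 50%" is the overconfident direction: low-probability
# forecasts resolving true too often, high-probability ones too rarely.
overconfident = sum(
    1 for p, freq in buckets
    if (p < 0.5 and freq > p) or (p > 0.5 and freq < p)
)

# Under the null of no directional bias, each bucket errs either way with
# probability 1/2, so 5 out of 5 in one direction has probability 1/32 ~ 3%.
print(f"{overconfident}/{len(buckets)} buckets err in the overconfident direction")
```

A simple sign test like this is exactly why several individually-insignificant errors in the same direction can still be decent evidence of an overall bias.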
Thanks for posting this—on a quick read it looks pretty accurate to me and I’ll be glad to have this as a resource to point people to when they seem not to understand exactly why what FTX did was so bad.
Interesting post. I haven’t done the in-depth research needed to verify most of the figures, but I do find the idea that you have a 55% chance of success with a $208k one-year advocacy campaign pretty implausible, and suspect there’s something dubious going on with the method of estimating P(success) here.
I think appropriate facts to incorporate, which I did not see, would be the “actual costs of lobbying in the US” and the “frequency of novel regulations passing”, on which I presume there is quite a bit of data available.
I strongly dislike claiming that “there’s a role for small EA donors in campaign finance” in a post which makes essentially no argument for the intervention’s effectiveness.
Maybe there’s a role, but assuming there is seems like wishful thinking. GiveDirectly still has a plausible funding gap in the hundreds of billions or trillions of dollars, so one should at least make the case that it might be better than that.
Seems like a rather vague collection of barely connected anecdotes haphazardly strung together.
I am not particularly concerned as I don’t see this persuading anybody.
Despite your clarifications within the post saying that we should grow the pie, and that CWRs are still underfunded, I find the zero-sum tone of much of the post (i.e. saying that we should do less CWR work and more other stuff) off-putting and poorly supported.
It is not obvious to me that other areas such as those you mention can readily absorb that much extra funding that quickly, or that anyone is currently erring in their approach here by failing to find a particular intervention and funding CWRs instead.
I would guess that e.g. Open Phil are eager to find other good opportunities in these areas and are more constrained by lack of good opportunities than by having committed large amounts to CWRs and therefore not having the budget to give more. Do you think this is wrong?
This conceptually seems similar to the meat eater problem argument against global health interventions.
Can you give an example of a point or points in there you found compelling?
That article looks like the usual “utilitarianism is bad” stuff (an argument which predates EA by a long time and has seen little progress in recent times) combined with some strong mood affiliation and straightforward misunderstandings of economic thinking to me.
“Private foundations must give 5% of their endowment annually, meaning EA orgs are giving $1.25b annually”
This is not true. Open Phil/Good Ventures has recently donated approximately $250m annually, and I think the reason they are not subject to a “5% of Dustin’s wealth” rule is that he hasn’t actually donated most of his assets to the foundation yet.
Thanks for sharing this!
I don’t think that I would have included this statement though:
“I am sharing this in good faith that EAs who participate will donate whatever they earn beyond a reasonable value of their time to effective causes”
For many EAs for whom this might be a good use of their time, especially those trying to position themselves for direct work, I would think donating this money would be a worse decision than using it to support their own efforts to move into direct work.
“Nvidia’s implied volatility is about 60%, which means – even assuming efficient markets – it has about a 15% chance of falling more than 50% in a year.
And more speculatively, booms and busts seem more likely for stocks that have gone up a ton, and when new technologies are being introduced.”
Do you think the people trading the options setting that implied volatility are unaware of this?
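For concreteness, the quoted figure can be roughly reproduced under the standard lognormal model that options traders themselves use, which is part of the point: the probability is already baked into the implied volatility. The drift assumption below is mine, not something stated in the post, and the answer is somewhat sensitive to it.

```python
from math import erf, log, sqrt

def p_fall_below(frac: float, sigma: float, t: float = 1.0, mu: float = 0.0) -> float:
    """P(S_t < frac * S_0) for a lognormal price with annualised volatility
    sigma and expected arithmetic return mu (so log returns drift at mu - sigma^2/2)."""
    z = (log(frac) - (mu - 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

# Nvidia at ~60% implied vol, one-year horizon:
prob_zero_return = p_fall_below(0.5, 0.60)             # zero expected return
prob_zero_drift = p_fall_below(0.5, 0.60, mu=0.18)     # zero drift on log returns
print(prob_zero_return, prob_zero_drift)
```

With zero expected return this gives roughly 20%, and with zero drift on log returns roughly 12%; the quoted 15% sits between, so the exact number depends on the drift convention, but the order of magnitude falls straight out of the implied volatility the market is already quoting.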
I agree with the main point here, and I think it’s a good one, but the headline’s use of present tense is confusing, and implies to me that they are currently doing a good job in their capacity as a donor.