Controversiality need not be strongly correlated with outrage. In fact, outrage can be very uncontroversial (e.g., a school shooting), and controversy is often productive (e.g., a debate about X). My inclination is to trust the readership of this forum: promoting the visibility of controversial posts will help people discuss ideas they've neglected.
SaraAzubuike’s Quick takes
Is it super cost-effective to buy out interest groups?
One reaction I’ve seen in several places, mostly outside EA, is something like, “this was obviously a fraud from the start, look at all the red flags, how could EAs have been so credulous?” I think this is mostly wrong: the red flags they cite (size of FTX’s claimed profits, located in the Bahamas, involved in crypto, relatively young founders, etc.) are not actually strong indicators here. Cause for scrutiny, sure, but short of anything obviously wrong.
To make money, you not only have to be right, but be right at the right time. Imagine you predicted the COVID pandemic in 2018 and shorted the market starting that year. By 2020 you would have gone broke before the crash ever arrived.
On the other hand, EA is not trying to make money, so the EA community doesn't care about timing as much as a trader does. EA cares about preparation. If in 2018 we know that a pandemic is coming, we start preparing in 2018, and when it does happen, in 2020, we are prepared.
Thus, for the EA community, what was most salient were articles such as this piece by Paul Krugman:
stablecoins...resemble 19th-century banks,...when paper currency was issued by largely unregulated private institutions. Many of these banks failed, in some cases due to fraud but mostly due to bad investments.
[this is a repost from a comment elsewhere]
Thanks for taking the time to comment. The details of the interaction between Alameda and FTX were very hard to pinpoint, and the timing was such that it was very hard to profit off the collapse even if you were deeply skeptical of cryptocurrencies to begin with. Hence the misplaced discussion on the forum along the lines of, "Institutional investors, who have a profit motive, didn't foresee this. How could we have?" (For example, exchanges like Binance have not experienced similar meltdowns.)
But to make money, you not only have to be right, but be right at the right time. Imagine you predicted the COVID pandemic in 2018 and shorted the market starting that year. By 2020 you would have gone broke before the crash ever arrived.
On the other hand, EA is not trying to make money, so the EA community doesn't care about timing as much as a trader does. EA cares about preparation. If in 2018 we know that a pandemic is coming, we start preparing in 2018, and when it does happen, in 2020, we are prepared.
Thus, for the EA community, what was most salient to prediction was this observation by Paul Krugman:
stablecoins...resemble 19th-century banks,...when paper currency was issued by largely unregulated private institutions. Many of these banks failed, in some cases due to fraud but mostly due to bad investments.
The important thing is to design a system where it takes more work to a) post a lie or b) refute the truth, and, at the same time, one that creates an incentive to a) post the truth, b) refute a lie, and, importantly, c) read and spread the truth. Whether this is best achieved by citations or by a reputation-based voting system is beyond me, but it's something I've been mulling over for quite some time.
I like to think that the open exchange of ideas, if conducted properly, converges on the correct answer. Of course, the forum in which this exchange occurs is crucial, especially its systems and software. Compare the amount of truth you obtain from the BBC, Wikipedia, Stack Overflow, Kialo, Facebook, Twitter, Reddit, and the EA Forum. All of these have different methods of verifying truth, and the beauty of each of them, with the exception of the BBC, is that you can post whatever you want.
But the inconvenient truth will be penalized in different ways. On Wikipedia, it might get edited out in favor of something more tame, though often it is not. On Stack Overflow, it will be downvoted but still available, and likely read. On Kialo, it will get refuted, although if it is the truth, it will be promoted. On Facebook and Twitter, many might even reshare it, though into their own echo chambers. On Reddit, it will get downvoted and then reposted to r/unpopularopinion.
I think a post on past frauds would be very welcome, although a list of reading recommendations would be equally helpful and would require less work for you. EA has a lot to learn from more diverse voices that are more experienced in management within large organizations.
None that I know of.
But I'm concerned that they couldn't simply state why they believe AI is more important than climate change, rather than resorting to this over-complicated messaging scheme.
Agree
Disagree here because I don’t want to see an EA forum that values controversial posts.
Disagree. This is like saying, “Amazon shouldn’t sort by 1 star, because otherwise it will get a bad reputation for selling bad products.”
That's wrong. People would still have the option of sorting however they choose, but the forum should give more visibility to posts that break people out of their comfort zone, should they want that.
Yes, I now think anonymity of the sort that I proposed is the wrong way of going about this. Can you think of a better solution?
I strongly agree with the spirit of the reforms being suggested here (although I might have some different opinions on how to implement them).
How would you do things differently?
Sorry that the post came off as harsh and accusatory in tone. I mainly meant to express my exasperation with how quickly the situation unfolded. I'm worried about the coming months and how they will affect the community, both now and in the long term.
Clearly, revealing who is donating is good for transparency. However, if donations were anonymized from the perspective of the recipients, I think that would help mitigate conflicts of interest. There needs to be more dialogue about how we can mitigate conflicts of interest, regardless of whether we anonymize. (In fact, perhaps anonymizing is not the most feasible option.)
Regarding whether the crash is just normal financial chicanery: that's a bit like saying the housing bubble wasn't due to mortgage-backed securities per se, but just to financial engineering. Clearly there is much at play here, and some attributes are unique to crypto being such a new, unregulated area.
You're right about red-flagging. I meant general posts critiquing EA. Thanks for the correction.
OK, I'm not too clear on the legal perspective. My main purpose in this post was to start a dialogue, with some preliminary suggestions, about how we could have avoided such a situation.
Yes, I don't really care about getting credit for predicting this; I pointed out my previous post mainly to lend credence to my suggestions. Based on other people's comments, anonymous donations may not be the best, most feasible, or most practical way to do things. But given that EAs focus so heavily on catastrophic tail risks, we should not become overly reliant on single donors or on donations that generate such large conflicts of interest. I don't know what system would be best.
Apologies. Yes, thanks for reading and responding to my prior post. I haven't edited it since we last spoke in the comments section, though I did edit it earlier when you pointed out those issues.
Hi, thanks for replying! I've made this into an EA Forum post instead, because I'm afraid it'll get buried in the comments here. https://forum.effectivealtruism.org/posts/9YodZj6J6iv3xua4f/another-ftx-post-suggestions-for-change
Another FTX post: suggestions for change
I’ve made this into a post on the forum, because I’m afraid it’ll get buried in the comments here. Please comment on the forum post instead.
Around six months ago, I suggested that we would have trouble with FTX and its funding.
SBF has been giving lots of money to EA, and he admits crypto is a massively speculative bubble. A crypto crash hurts the most vulnerable, because poor, uneducated people put lots of money into it (Krugman). Crypto is currently small, but it should be regulated and has potential contagion effects (BIS). EA as a whole is getting loose with its money due to large crypto inflows (MacAskill). An inevitable crypto crash leads to either a) bad optics, leading to less interest in EA, or b) lots of dead projects.
It was quite obvious that this would happen, although the specific details involving Alameda were not. Stuart Buck was the only one who took me seriously at the time.
Below are some suggestions for change.
1. The new "support" button is great, but I think the EA Forum should have a way to *sort* by controversiality, and the forum algorithm should occasionally (some percentage of the time) punt controversial posts back up to the front page. If you're like me, you read the forum sorted by Magic (New and Upvoted), but this promotes herd mentality. The red-teaming and self-criticism are excellent, but if up-votes are the only way we aggregate how "good" red-teaming is, that is flawed. Perhaps the best way to know that criticism has touched a nerve is to compute a fraction: how many members of the community disagree versus how many agree. Even better, within an organization, use a weighted fraction that puts lower weight on the votes of people in positions of power (obviously difficult to implement in practice).
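The disagree/agree fraction above could be sketched as follows. This is a hypothetical illustration, not an actual forum feature; the scoring rule and vote counts are assumptions chosen for the example.

```python
# Hypothetical controversy score for ranking posts: highest when the
# community is evenly split between agreement and disagreement.
# The scoring rule and the vote counts below are illustrative assumptions.

def controversy_score(agree: int, disagree: int) -> float:
    """Return 0.0 for unanimous posts, approaching 1.0 for an even split."""
    if agree == 0 or disagree == 0:
        return 0.0
    return min(agree, disagree) / max(agree, disagree)

posts = {
    "post_a": (120, 5),   # broad agreement    -> low controversy
    "post_b": (60, 55),   # near-even split    -> high controversy
    "post_c": (3, 40),    # broad disagreement -> low controversy
}

# Sort most-controversial first, as a hypothetical front-page ordering.
ranked = sorted(posts, key=lambda p: controversy_score(*posts[p]), reverse=True)
```

The weighted variant would simply replace the raw counts with sums of per-voter weights, with lower weights assigned to voters in positions of power.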
2. More of you should consider anonymous posts. This is the EA Forum. I cannot believe that some of you delete your posts simply because they end up being downvoted. Especially if you're working higher up in an EA org, you ought to be actively voicing your dissent and helping to monitor EA.
For example, this is not good:
“Members of the mutinous cohort told me that the movement’s leaders were not to be taken at their word—that they would say anything in public to maximize impact. Some of the paranoia—rumor-mill references to secret Google docs and ruthless clandestine councils—seemed overstated, but there was a core cadre that exercised control over public messaging; its members debated, for example, how to formulate their position that climate change was probably not as important as runaway A.I. without sounding like denialists or jerks.” (New Yorker)
What makes EA, *EA*, what makes EA antifragile, is its ruthless transparency. If we are self-censoring because we have *already concluded something is super effective*, then there is no point in EA. Go do your own thing with your own money. Become Bill Gates. But don’t associate with EA.
3. Finances should be partially anonymized. If an EA org receives money above a certain threshold from an individual contribution, we should be transparent in saying that we will reject that money if it is not donated anonymously. You may protest that this would decrease the number of donations by rich billionaires. But look at it this way: if they donate to EA, it's because they believe that EA can spend the money better. Thus, they should be willing to donate anonymously, so as not to affect how EA spends it. If they don't donate to EA, they can establish a different philanthropic organization and hire EA-adjacent staff, making for more competition.
A life saved in a rich country is generally considered more valuable than one saved in a poor country because the value of a statistical life (VSL) rises with wealth. However, transferring a dollar to a rich country is less beneficial than transferring a dollar to a poor country because marginal utility decreases as wealth increases.
So, using [$ / lives saved] is the wrong approach. We should use [$ / (lives saved * VSL)] instead. This means GiveDirectly might be undervalued compared to other programs that save lives. Can someone confirm if this makes sense?
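As a toy illustration of the two metrics, here is a sketch with entirely made-up costs, lives saved, and VSL figures (deliberately chosen so the ranking flips between the two metrics; these are not real estimates for any program):

```python
# Toy comparison of [$ / lives saved] vs [$ / (lives saved * VSL)].
# All numbers below are made up for illustration; they are not real estimates.

programs = {
    # name: (cost in $, lives saved, local VSL in $)
    "rich_country": (1_000_000, 5, 10_000_000),
    "poor_country": (1_000_000, 100, 50_000),
}

def cost_per_life(cost, lives, vsl):
    return cost / lives  # standard metric; ignores VSL entirely

def cost_per_vsl_dollar(cost, lives, vsl):
    return cost / (lives * vsl)  # proposed metric; weights lives by VSL

for name, args in programs.items():
    print(name, cost_per_life(*args), cost_per_vsl_dollar(*args))
```

With these assumed numbers, the poor-country program wins on $/life ($10,000 vs $200,000), but the rich-country program wins on the VSL-weighted metric (0.02 vs 0.2), showing how the choice of metric can flip a ranking.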