Sorry that the post came off as harsh and accusatory in tone. I mainly meant to express my exasperation with how quickly the situation unfolded. I’m worried about how this will affect the community, both in the coming months and in the long term.
Clearly, revealing who is donating is good for transparency. However, if donations were anonymized from the perspective of the recipients, I think that would help mitigate conflicts of interest. I think there needs to be more dialogue about how we can mitigate conflicts of interest, regardless of whether we anonymize. (in fact, perhaps anonymizing is not the most feasible option)
Regarding whether the crash is just normal financial chicanery: that’s kind of like saying the housing bubble wasn’t due to mortgage-backed securities per se, but just financial engineering. Clearly there is much at play here, and some attributes are unique to crypto being such a new, unregulated area.
You’re right about red-flagging. I meant general posts critiquing EA. Thanks for the correction.
SaraAzubuike
I don’t like how all the comments basically reiterate that smart people have more impact. Of course smart people do. But one avenue for EA to actually make a difference—is to appeal to the masses. For policy to change, you have to appeal to the body politic. And for that, we need a diverse range of skillsets from people who are quite different from the average EA (for example, having more super-social salesperson types in EA would be a net positive).
I think this ignores the central thesis of this post.
Is it super cost-effective to buy out interest groups?
Yes, I don’t really care about getting credit for predicting this; I pointed out my previous post mainly to give credence to my suggestions. And based on other people’s comments, maybe anonymous donations are not the best, most feasible, or most practical way to do things. But given that EAs focus so much on catastrophic tail risks, we should not become overly reliant on single donations, or on donations that generate such large conflicts of interest. I don’t know what system would be best.
Thanks for taking the time to comment. The details of the interaction between Alameda and FTX were very hard to pinpoint, and the timing was such that it was very hard to profit from the collapse, even if you were very skeptical of cryptocurrencies to begin with; exchanges like Binance, for example, have not experienced similar meltdowns, so a blanket bet against crypto wouldn’t have paid off. Hence the whole misplaced discussion on the forum along the lines of, “Institutional investors, who have a profit motive, didn’t foresee this. How could we have?”
But to make money, you not only have to be right, you have to be right at the right time. Imagine you foresaw the COVID pandemic in 2018 and shorted the market starting then. By 2020, when you were finally proven right, you would already have been wiped out.
On the other hand, EA is not trying to make money, so the EA community doesn’t care about timing as much as a trader does. EA cares about preparation. If we had known in 2018 that the COVID pandemic was coming, we could have started preparing in 2018, and when it did happen, in 2020, we would have been prepared.
Thus, for the EA community, the more salient signal for prediction was this quotation from Paul Krugman:
stablecoins...resemble 19th-century banks,...when paper currency was issued by largely unregulated private institutions. Many of these banks failed, in some cases due to fraud but mostly due to bad investments.
sigh
The important thing is to design a system where it takes more work to a) post a lie or b) refute the truth, and where there is an incentive to a) post the truth, b) refute a lie, and, importantly, c) read and spread the truth. Whether this is best achieved through citations or a reputation-based voting system is beyond me, but it’s something I’ve been mulling over for quite some time.
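For what it’s worth, here is one toy sketch of what a reputation-based voting scheme could look like. The class name, the update rule, and all the constants are my own assumptions for illustration, not a worked-out design: the idea is simply that voters who turn out to be right gain weight, so posting or refuting the truth pays off over time.

```python
# Toy reputation-weighted voting sketch (all names/constants are assumptions).

class ReputationVoting:
    def __init__(self):
        self.reputation = {}  # voter -> weight; unseen voters default to 1.0

    def tally(self, votes):
        """votes: dict mapping voter -> +1 (endorse) or -1 (refute).
        Returns the reputation-weighted sum for the claim."""
        return sum(self.reputation.get(v, 1.0) * d for v, d in votes.items())

    def settle(self, votes, truth):
        """Once a claim is later verified (truth = +1 or -1), raise the
        reputation of voters who were right and lower that of voters who
        were wrong, flooring reputation at 0.1 so no one is silenced."""
        for voter, d in votes.items():
            r = self.reputation.get(voter, 1.0)
            self.reputation[voter] = max(0.1, r * (1.1 if d == truth else 0.9))
```

With this update rule, two voters who start out equal but disagree end up with different weights once the claim is settled, so the next tally on a similar split leans toward the historically accurate voter.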
Ok, I’m not too clear on the legal perspective. My main purpose in this post was to start a dialogue, with some preliminary suggestions, about how we could have avoided such a situation.
I don’t see where you get the 99.9% number, but yes, it does seem crypto is commonly used in scams.
Hey, thanks for answering my post. Means a lot, especially since you seem to be more familiar with philosophy than me.
“Total utilitarians care about intrinsic value of outcomes.”
- But a) death is painful, b) death is the loss of future life, and c) parents grieve over miscarriages just as people grieve over the loss of a friend.
“Embryos must have an interest in continued existence.”
- Hm, but I argue this is a temporary state. Say I give that mother nutrition and I wait 9 months. That embryo now has an interest in continued existence. In a similar vein, suicidal people have no interest in continued existence. But if I give that suicidal person therapy and wait some time, that person now has an interest in continued existence.
I like to think that open exchange of ideas, if conducted properly, converges on the correct answer. Of course, the forum in which this exchange occurs is crucial, especially the systems and software. Compare the amount of truth you obtain from the BBC, Wikipedia, Stack Overflow, Kialo, Facebook, Twitter, Reddit, and the EA Forum. All of these have different methods of verifying truth. The beauty of each of these is that, with the exception of the BBC, you can post whatever you want.
But the inconvenient truth will be penalized in different ways. On Wikipedia, it might get edited out in favor of something more tame, though often not. On Stack Overflow, it will be downvoted but still available, and likely read. On Kialo it will get refuted, although if it is the truth, it will be promoted. On Facebook and Twitter, many might even reshare it, though into their own echo chambers. On Reddit, it’ll get downvoted and then posted to r/unpopularopinion.
I think a post on past frauds would be very welcome, although a list of reading recommendations would be equally helpful and would require less work for you. EA has a lot to learn from more diverse voices that are more experienced in management within large organizations.
I’ve made this into a post on the forum, because I’m afraid it’ll get buried in the comments here. Please comment on the forum post instead.
https://forum.effectivealtruism.org/posts/9YodZj6J6iv3xua4f/another-ftx-post-suggestions-for-change
Around six months ago, I suggested that we would have trouble with FTX and funding.
It was quite obvious that this would happen—although the specific details with Alameda were not obvious. Stuart Buck was the only one who took me seriously at the time.
Below are some suggestions for change.
1. The new “support” button is great, but I think the EA Forum should also have a way to *sort* by controversiality, and the forum algorithm should occasionally (some ϵ% of the time) punt controversial posts back up to the front page. If you’re like me, you read the forum sorted by Magic (New and Upvoted), but this promotes herd mentality. The red-teaming and self-criticism are excellent; however, if the only way we judge how “good” a piece of red-teaming is is by upvotes, that is flawed. Perhaps the best way to know that criticism has touched a nerve is to compute a fraction: how many members of the community disagree versus how many agree. Even better, within an organization, use a weighted fraction that gives lower weight to the people who hold positions of power there (obviously difficult to implement in practice).
2. More of you should consider anonymous posts. This is the EA Forum; I cannot believe that some of you delete your posts simply because they end up being downvoted. Especially if you’re working higher up in an EA org, you ought to be actively voicing your dissent and helping to monitor EA.
For example, this is not good:
What makes EA, *EA*, what makes EA antifragile, is its ruthless transparency. If we are self-censoring because we have *already concluded something is super effective*, then there is no point in EA. Go do your own thing with your own money. Become Bill Gates. But don’t associate with EA.
3. Finances should be partially anonymized. If an EA org receives a contribution above a certain threshold from a single individual, we should be transparent in saying that we will reject the money unless it is donated anonymously. You may protest that this would decrease donations from rich billionaires. But look at it this way: if they donate to EA, it’s because they believe EA can spend the money better than they can. Thus, they should be willing to donate anonymously, so as not to affect how EA spends it. If they don’t donate to EA, they can establish a different philanthropic organization and hire EA-adjacent staff, making for more competition.
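As a toy illustration of the controversy metric and ϵ-resurfacing in suggestion 1: a minimal sketch, assuming a post is just a dict with `score` and `controversy` fields and that voter weights are supplied externally. Everything here (names, the 5% default, the dict shape) is my own assumption, not an actual forum feature.

```python
# Hypothetical sketch of the controversy fraction and epsilon-resurfacing
# described in suggestion 1. All names and numbers are illustrative.
import random

def controversy_fraction(agree_voters, disagree_voters, weights=None):
    """Fraction of (optionally weighted) votes that disagree.

    weights, if given, maps voter -> weight; e.g. people holding positions
    of power in the criticized org could be assigned a lower weight.
    """
    def w(voter):
        return 1.0 if weights is None else weights.get(voter, 1.0)
    agree = sum(w(v) for v in agree_voters)
    disagree = sum(w(v) for v in disagree_voters)
    total = agree + disagree
    return disagree / total if total else 0.0

def pick_front_page_post(posts, epsilon=0.05, rng=random):
    """With probability epsilon, surface the most controversial post;
    otherwise fall back to the usual ranking (highest score first)."""
    if rng.random() < epsilon:
        return max(posts, key=lambda p: p["controversy"])
    return max(posts, key=lambda p: p["score"])
```

The ϵ parameter is the knob: at ϵ = 0 this is exactly the current ranking, and raising it trades a little front-page quality for guaranteed periodic exposure of dissent.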