I would like to be involved in the version of EA where we look after each other’s basic wellness even if it’s bad for FTX or for other FTX depositors. I think people will find this version of EA more emotionally safe and inspiring.
To me there is just no normative difference between trying to suppress information and actively telling people they should go deposit on FTX once the distress began (without communicating any of the risks involved), knowing there was a good chance they’d get totally boned if they did so. Under your model this would be no net detriment, but it would also just be sociopathic.
Yes, the version of EA where people merely suppress this information, rather than actively promote deposits, is safer. But both are quite cruel, and neither is something I could earnestly suggest a friend devote their life to.
Hm, yeah, I guess my intuition is the opposite. To me, one of the central parts of effective altruism is that it’s impartial, meaning we shouldn’t put some people’s welfare over others’.
I think in this case it’s particularly important to be impartial, because EA is a group of people that benefited a lot from FTX, so it seems wrong for us to try to transfer the harms it is now causing onto other people.
(As an aside, it also seems quite unusual to apply this impartiality to the finances of EAs. If EAs were going to be financially impartial, it seems like we would not encourage earning money in competitive, financially zero-sum ways such as a quant finance career or crypto trading.)
Aspiring to be impartially altruistic doesn’t mean we should shank each other. The so-impartial-we-will-harvest-your-organs-and-steal-your-money version of EA has no future as a grassroots movement, or even room to grow, as far as I can tell.
This community norm strategy works if you decide that retaining socioeconomically normal people doesn’t actually matter and you just want to incubate billionaires, but then we have to hope the next billionaire is not so (allegedly) impartial towards their users’ welfare.
Seriously, imagine dedicating your life to EA and then finding out you lost your life savings because one group of EAs defrauded you, and the other top EAs decided you shouldn’t be alerted for as long as possible, specifically because alerting you might let you reach safety. Of course, none of the in-the-know people decided to put up their own money to defend against a bank run; they just decided it would be best if you kept doing so.
In that situation, I have to say, I would just walk away and never look back.