My current, extremely tentative, sense of the situation is not that individuals who were aware of some level of dishonesty and shadiness failed to be open enough about it. I think individuals acted in pretty reasonable ways, and I heard a good number of rumors.
I think the error likely happened at two other junctures:
Some part of EA leadership ended up endorsing SBF very publicly and very strongly despite having very likely heard about the concerns, and without following up on them (in my model of the world, Will fucked up really hard here).
We didn’t have any good system for aggregating rumors and related information, and we didn’t have anyone who was willing to just make a public post about the rumors (I think this would have been a scary and heroic thing to do; I am personally ashamed that I didn’t do it, but I don’t think it’s something we should expect the average person to do).
I think if we had some kind of EA newspaper, for example, where people actively investigate various things that seem concerning, this would have helped a bunch. This kind of thing could even be circulated privately, though a public version also seems good.
Separately, I also think that we should much more deeply embed the virtues of honesty and truth-seeking into the core idea of EA. I think it shouldn’t be possible to be seen as “an effective EA” without also being actually good at truth-seeking and at helping other people orient to the world.
When a billionaire shows up with billions of dollars, or an entrepreneur builds a great company, I think it should just be a strict requirement that they are also honest and good at truth-seeking in order to gain status and reputation within the community. In the same way, no matter how much money you make, people are not going to think you are a “good scientist” unless you have actually discovered new verifiable regularities in the natural world. You might be a “great supporter of science”, but that doesn’t usually mean you would get invited to all the scientific conferences, or get the Nobel Prize, or something, and I think people would have a healthy understanding of your relationship to the rest of the scientific ecosystem.
Agree with much of what you say here. (Though I don’t think we currently have strong enough evidence to single out specific EA leaders as being especially responsible for the recent tragic events; at least I don’t think I personally have that kind of information.)
As a substitute for, or complement to, an investigative EA newspaper, what do you think about an “EA rumours” prediction market?[1] Some attractive features of such a market:
It turns private information held by individual people with privileged access to sources into public information available to the entire EA community, increasing the likelihood that the information will reach those for whom it is most valuable and actionable.
It potentially reduces community drama by turning “hot” debates influenced by tribal allegiances and virtue signaling into “cold” assignments of probability and assessments of evidence.
It makes rumours more accurate, by incentivizing users to estimate their probability correctly (see the sketch after this list for how a standard market mechanism does this).
It makes false rumours less damaging to their targets, by explicitly associating them with a low probability.
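To make the incentive point concrete, here is a minimal sketch of Hanson’s logarithmic market scoring rule (LMSR), one standard automated-market-maker design that could underpin a binary rumour question. The liquidity parameter `b`, the question, and the example trade are all illustrative assumptions on my part, not a description of how any existing platform works.

```python
import math

# Minimal LMSR sketch for a binary "rumour" question,
# e.g. "Will claim X be substantiated by <date>?".
# The liquidity parameter b and all numbers below are illustrative.

B = 100.0  # higher b = deeper market, prices move more slowly per trade


def cost(q_yes: float, q_no: float, b: float = B) -> float:
    """LMSR cost function C(q) = b * ln(exp(q_yes/b) + exp(q_no/b))."""
    return b * math.log(math.exp(q_yes / b) + math.exp(q_no / b))


def price_yes(q_yes: float, q_no: float, b: float = B) -> float:
    """Instantaneous YES price, i.e. the market-implied probability."""
    e_yes, e_no = math.exp(q_yes / b), math.exp(q_no / b)
    return e_yes / (e_yes + e_no)


def buy_yes(q_yes: float, q_no: float, shares: float, b: float = B) -> float:
    """Amount paid to buy `shares` of YES, which pushes the price upward."""
    return cost(q_yes + shares, q_no, b) - cost(q_yes, q_no, b)


if __name__ == "__main__":
    q_yes, q_no = 0.0, 0.0             # fresh market: P(yes) starts at 0.50
    print(f"initial P(yes) = {price_yes(q_yes, q_no):.2f}")

    paid = buy_yes(q_yes, q_no, 50)    # an informed trader buys 50 YES shares
    q_yes += 50
    print(f"trader paid {paid:.2f} for 50 YES shares")
    print(f"new P(yes) = {price_yes(q_yes, q_no):.2f}")
    # Each YES share pays 1 if the rumour resolves true, so the trade is
    # profitable in expectation only if the trader's honest probability
    # exceeds the average price paid -- which is what ties the public
    # market probability to traders' private information.
```

Multi-outcome questions and play-money platforms use related mechanisms; the sketch is only meant to illustrate why the posted probability tends to track participants’ honest beliefs rather than their tribal allegiances.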
I think this market would need judicious moderation to function well and avoid being abused. But overall it seems to me like it might be an idea worth exploring further, and of the sort that could make future events in the same reference class as the FTX debacle less likely to happen.
[1] By ‘market’, I do not necessarily mean a real-money prediction market like Polymarket or PredictIt; it could also be a play-money market like Manifold Markets or a forecasting platform like Metaculus.
Yeah, I feel excited about something in this space. Generally I feel like prediction markets have a lot of good things going for them in situations like this, though I do worry that they will somehow just end up gamed when the stakes are high. Like, my guess is that Sam could likely have moved the probability of a market here a lot, either with money or by encouraging other people to move it.