But also there seems to be a total lack of upside. Reported sexual misconduct is not the same as actual sexual misconduct (generally it significantly undercounts), so there’s no real indication of whether an apparent improvement in the number reflects improved behaviour or a worsened reporting process. Pretty much all the incentives on either side of the bet are perverse (people with relevant knowledge can earn from suppressing information or breaching confidences, and people can earn money from there being lots of sexual harassment, or perhaps even buy looser safeguarding policy!), particularly as I can’t imagine it being a liquid market that lots of people uninvolved in sexual harassment cases want to bet on.
I think you’re right that it would be very bad reputationally if the community as a whole was widely perceived as doing this. But it’s also a bit easier to say that if you also believe that this is actually a bad, or at least useless, thing to do on its own merits. If you don’t think that, it seems a bit sleazy (even if perhaps correct) to reason ‘this would actually help improve how we deal with sexual misconduct, but we shouldn’t do it because it’ll make us look bad’. Of course, the fact that something looks bad is evidence it IS bad, even if it seems good to you, but not always definitive evidence. Personally I don’t think it would be useful, but I’m not sure how much someone making a Manifold market would actually cause outside people to perceive this as something “EAs” do, or even that they would notice the market at all.
I think it would be net negative, in the “What is your community doing to prevent sexual misconduct? - Oh, we make bets about it” kind of way.
This. It’s awful from a reputational perspective.
I certainly also think it’d be useless, like most prediction markets in EA.