epistemic and emotional status: had a brief look at your post and some comments, and got the impression that 400 comments didn't move the needle for you at all, which disappointed me.
I don't understand why you think you'd like to be a part of EA; the list of orthodoxies just seems like the movement's stated premises and goals (and yeah, it'd be a problem if AMF were firing people for not being transhumanist, but I'd roll to disbelieve that something like that is actually happening). So what I'd suggest, at about 65% confidence, is a kind of broad "your reasons for anonymity are deep down reasons to try making a living in another philanthropic ecosystem", or a harsher heuristic I'm only about 25% confident in, which is "the urge to be anonymous is a signal you don't belong here".
I'm glad the decentralization discourse is in the Overton window (I have a few sketches and unfinished projects in the parallel/distributed epistemics space, and I intermittently study a mechanism design textbook so I can take a crack at contributions there), but I haven't seen a good contribution from the pro-decentralization side that came from the folks who talk about fearing retribution for their bold views.
If ConcernedEAs posted with their real names, would you be less likely to hire them for an EA role? Even if not, would you agree that ConcernedEAs might reasonably draw that conclusion from your comment suggesting they might not belong here?
Sorry for whatever terseness this lacks; it may be out of scope or better placed in the original post's comments. Keep in mind I'm not someone whose opinion matters about this.
Plausibly, but who knows. Inclusivity failures are not an indictment. Sometimes knowing you disagree with an institution is a prediction that working together wouldn’t go super well.
As a baseline, recall that in normie capitalism, hiring discrimination on alignment happens all the time. You have to at least pretend to care. Small orgs hold this "pretending to care" to a higher standard than large orgs (cf. Gwern's Fermi estimate of the proportion of Amazon employees who "actually" care about same-day delivery). Some would even say that to pull off working at a small org you have to actually care, and some would say that most EA orgs have more in common with a startup than an enterprise. But ConcernedEAs do care. They care enough to write a big post. So it's looking good for them so far.
I probably converge with them on the idea that ideological purity and the accurate recitation of shibboleths are very bad screening tools for any org. The more movement-wide cohesion we have, the greater this threat becomes.
So like, individual orgs should use their judgment, applying standards analogous to a startup declining to hire someone who openly doesn't care about the customers or the product. That doesn't mean demanding a deep amount of conformity.
So, with the caveat that it depends a lot on the metric ton of variables that go into whether someone seems like a viable employee, in addition to their domain/object-level contributions/potential/expertise, and with deep and immense emphasis on all the reasons a hiring decision might not go through, I don't think they're disqualified from most projects. The point is that, due to the nitty gritty, there may be some projects they're disqualified from, and that's good and efficient. Or rather, it would be good and efficient if they weren't anonymous.