I do agree with you that silence can hurt community epistemics.
In the past I also thought that people who worried about missing out on job and grant opportunities if they voiced criticisms on the EA Forum were overestimating the risks. I am ashamed to say that I dismissed their worry as a mere product of social anxiety and as fairly irrational.
Then last year I applied to a central EA org that explicitly identifies as longtermist. They rejected me straight away on the grounds that I wasn’t bought into longtermism (as written up here, a piece which is now featured in the EA Handbook as the critical perspective on longtermism...).
This was perfectly fine by me: my interactions with the org were kind and professional, and I had applied on a whim anyway.
But only later did I realise what this meant: the people who say they are afraid to criticise longtermism, and potentially other parts of EA, because they are worried about losing out on opportunities were more correct than I previously thought.
I still think it’s harmful not to voice disagreements. But evidently there is more of a cost to individuals than I thought, especially to those who are financially reliant on EA funding or EA jobs, and I was unreasonably dismissive of this possibility.
I am a bit reluctant to write this. I very much appreciated being told the reason for the rejection, and I think it’s great that the org invested the time and effort to do so. I hope they’ll continue giving reasons in the future, even when insufficient buy-in to longtermism is the reason for rejection.