Speaking only for ConcernedEAs, we are likely to remain anonymous until costly signals are sent that making deep critiques in public will not damage one's career, funding, or social prospects within EA.
I'm willing to send a very costly signal to help you test your theory. Would the two possible outcomes of the following experiment update your opinion in opposite directions? Here is my idea:
1. I'll open a grant application from EV, OP, LTFF, etc., and write a proposal for a project I actually want to execute.
2. Instead of submitting it right away, I'll post the final draft on the Forum and ask several grantmakers to read it and say how much funding they'd be willing to give me.
3. I'll then make another post containing very harsh criticism of EA or of prominent EA organizations or leadership, such as the aforementioned funding orgs or their top executives. (You can suggest criticisms you want me to post; otherwise I'll aim to write up the most offensive-to-EA-orthodoxy thing I actually believe or can pass an ITT about.)
4. Finally, I'll submit the grant proposal to the org I just criticized and see how much money I get.
If the proposal gets as much funding as the reviewers estimated, that's at least weak evidence that public criticism of EA doesn't hurt your prospects. If it gets less, I'll admit that public criticism of EA can damage the person making it. Agreed?
Aside from the fact that you'd be publishing something you don't believe and wouldn't otherwise write, and that estimates made ahead of time might not be very predictive of how funding actually goes, announcing your plan on the Forum means this probably doesn't work at all. Someone at the grant-making organization sees your harshly critical post, thinks "that's a really surprising thing to come from Robi", someone else points out that you're doing it as an experiment and links your comment, …
Crap. I guess I should’ve posted the above comment from a burner account...
But anyway, serious reply: I already thought of all of those problems and have several solutions for them. (For example: have someone the grantmakers don't know is connected to me run the experiment instead of me.) ConcernedEAs, would you accept this experiment if I propose a satisfactory variation, or accept it in principle if it turns out not to be practically workable?