Exactly. EA is a political project, not a truth-seeking one. If EA is clear about that, it can better make the political alliances that are useful for its aims.
Open truthseeking: Rather than starting with a commitment to a certain cause, community or approach, it’s important to consider many different ways to help and seek to find the best ones. This means putting serious time into deliberation and reflection on one’s beliefs, being constantly open and curious for new evidence and arguments, and being ready to change one’s views quite radically.
I think any other source of EA principles you can find will say the same thing.
I don’t think EA should be a political project at all. The value in EA is to be an intellectual space where weird ideas about how to improve the world can be explored. That is where it has excelled in the past and has the potential to excel even more in the future. When it comes time to do politics, that should be entirely outside the EA brand/umbrella. That should be done under cause-specific brands and umbrellas that can incorporate both the relevant components of EA and non-EAs who share the relevant policy goals.
I don’t want EA to be a political project over a truth-seeking one. What helps us know what politics we should enact?

Even in this question you put the political action as an end goal and the truth-seeking as only an instrumental one. This means truth-seeking is (and, in my view, really should be) secondary, and should sometimes give way to other priorities.
Huh, it’s a bit surprising to me that people disagree so strongly with this comment, which seems to be (uncharitably but not totally inaccurately) paraphrasing the parent, which has much more agreement.
(Maybe most people are taking it literally, rather than interpreting it as a snipe?)
I don’t agree with @Barry Cotter’s comment or think that it’s an accurate interpretation of my comment (but didn’t downvote).
I think EA is both a truth-seeking project and a good-doing project. These goals could theoretically be in tension, and I can envision hard cases where EAs would have to choose between them. Importantly, I don’t think that’s going on here, for much the same reasons as were articulated by @Ben Millwood in his thoughtful comment. In general, I don’t think the rationalists have a monopoly on truth-seeking, nor do I think their recent practices are conducive to it.
More speculatively, my sense is that epistemic norms within EA may—at least in some ways—now be better than those within rationalism for the following reason: I worry that some rationalists have been so alienated by wokeness (which many see as anathema to the project of truth-seeking) that they have leaned pretty hard into being controversial/edgy, as evidenced by, e.g., their platforming of speakers who endorse scientific racism. Doing this has major epistemic downsides—for instance, a much broader swath of the population isn’t going to bother engaging with you if you do this—and I have seen limited evidence that rationalists take these downsides sufficiently seriously.
Your comment seems to be pretty straightforwardly advocating for optimizing for very traditional political considerations (appearance of respectability, relationships with particular interest groups, etc.) by very traditional political means (disassociating from unfavorables). The more central this is to how “EA” operates, the more fair it is to call it a political project.
I agree that many rationalists have been alienated by wokeness/etc. I disagree that much of what’s being discussed today is well-explained by a reactionary leaning-in to edginess, and think that the explanation offered—that various people were invited on the basis of their engagement with concepts central to Manifest, or for specific panels not related to their less popular views—is sufficient to explain their presence.
With that said, I think Austin is not enormously representative of the rationalist community, and it’s pretty off-target to chalk this up as an epistemic win for the EA cultural scene over the rationalist cultural scene. Observe that it is here, on the EA forum, that a substantial fraction of commenters are calling for conference organizers to avoid inviting people for reasons that explicitly trade off against truth-seeking considerations. Notably, there are people who I wouldn’t have invited, if I were running this kind of event, specifically because I think they either have very bad epistemics or are habitual liars, such that it would be an epistemic disservice to other attendees to give those people any additional prominence.
I think that if relevant swathes of the population avoid engaging with e.g. prediction markets on the basis of the people invited to Manifest, this will be substantially an own-goal, where people with 2nd-order concerns (such as anticipated reputational risk) signal boost this and cause the very problem they’re worried about. (This is a contingent, empirical prediction, though unfortunately one that’s hard to test.) Separately, if someone avoided attending Manifest because they anticipated unpleasantness stemming from the presence of these attendees, they either had wildly miscalibrated expectations about what Manifest would be like, or (frankly) they might benefit from asking themselves what is different about attending Manifest vs. attending any other similarly large social event (nearly all of which have invited people with similarly unpalatable views), and whether they endorse letting the mere physical presence of people they could choose to pretend don’t exist stop them from going.
Observe that it is here, on the EA forum, that a substantial fraction of commenters are calling for conference organizers to avoid inviting people for reasons that explicitly trade off against truth-seeking considerations.
I have mostly observed people who don’t see the controversial speakers as a problem claim that excluding them would go against truth-seeking principles. People who’d prefer not to have them platformed at an event somewhat connected to EA don’t seem to think this is a trade-off.
Separately, if someone avoided attending Manifest because they anticipated unpleasantness stemming from the presence of these attendees, they either had wildly miscalibrated expectations about what Manifest would be like, or (frankly) they might benefit from asking themselves what is different about attending Manifest vs. attending any other similarly large social event (nearly all of which have invited people with similarly unpalatable views)
Anecdotally, a major reason I created this post was that the number of very edgy people was significantly higher than the baseline for non-EA large events. I can’t think of another event I have attended where people would’ve felt comfortable saying the stuff that was being said. I didn’t particularly seek out these types of interactions either.
The fact is that we have multiple people who would have been a positive contribution to the event, multiple people who have had similar experiences, and at least one person who said they would not have come or volunteered if they had known that race science was a topic that would continue to come up (I myself was on the fence about whether I’d come again, but I probably would, especially if some actions are taken to make things more comfortable for everyone). To be fair, at least one person has said that they did not see anything like this happening during the events, so it is unclear how many people were actually left upset by these things (Austin’s feedback form suggests not many).
People who’d prefer not to have them platformed at an event somewhat connected to EA don’t seem to think this is a trade-off.
Optimizing for X means optimizing against not-X. (Well, at the Pareto frontier, which we aren’t at, but it’s usually true for humans anyway.) You will generate two different lists of people for two different values of X. Ergo, there is a trade-off.
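As a toy illustration of the claim above (a minimal sketch only; the candidates, the two objectives, and all scores are hypothetical and not drawn from anything about Manifest):

```python
# Ranking the same candidate pool by two different objectives generally
# yields two different invite lists, so a single list cannot maximize both.
# All names and scores below are made up for illustration.

candidates = {
    "A": {"truth_seeking": 0.9, "reputational_safety": 0.2},
    "B": {"truth_seeking": 0.7, "reputational_safety": 0.9},
    "C": {"truth_seeking": 0.5, "reputational_safety": 0.8},
    "D": {"truth_seeking": 0.8, "reputational_safety": 0.4},
}

def top_k(scores: dict, key: str, k: int = 2) -> list:
    """Return the k candidates scoring highest on a single objective."""
    return sorted(scores, key=lambda name: scores[name][key], reverse=True)[:k]

print(top_k(candidates, "truth_seeking"))        # ['A', 'D']
print(top_k(candidates, "reputational_safety"))  # ['B', 'C']
# The two lists differ: choosing the ordering for one objective trades off
# against the other.
```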
Anecdotally, a major reason I created this post was that the number of very edgy people was significantly higher than the baseline for non-EA large events. I can’t think of another event I have attended where people would’ve felt comfortable saying the stuff that was being said.
Note that these two sentences are saying very different things. The first one is about the percentage of attendees that have certain views, and I am pretty confident that it is false (except in a trivial sense, where people at non-EA events might have different “edgy” views). If you think the percentage of the general population that holds views at least as backwards as “typical racism” is less than whatever it was at Manifest (where I would bet very large amounts of money that the median attendee was much more egalitarian than average for their reference class)...
The second one is about what was said at the event, and so far I haven’t seen anyone describe an explicit instance of racism or bigotry by an attendee (invited speaker or not). There were no sessions about “race science”, so I am left at something of a loss to explain how that is a subject that could continue to come up, unless someone happened to accidentally wander into multiple ongoing conversations about it. Absent affirmative confirmation of such an event, my current belief is that much more innocuous things are being lumped in under a much more disparaging label.
>so alienated by wokeness (which many see as anathema to the project of truth-seeking)
Would you be willing to express any degree of agreement or disagreement?
Or, perhaps, a brief comment on whether certain epistemic approaches could be definitionally incompatible? That is, that what the “woke” call truth-seeking is so different from what “rationalists” call truth-seeking (not taking a position here on which one is more correct, mind you) as to constitute totally separate domains of thought, that EA’s version is somewhere in between, and that this tension/confusion is contributing to these issues.