Thanks Max. I agree that a lot of ground is covered here without being broken up into different dimensions, and that it would have been better if it had been. But I disagree that this entirely undermines the core proposition: (a) whether we like it or not, we are getting more attention; (b) it’s particularly important to think carefully about our “shop fronts” given that increased attention; and therefore (c) staying true to “EA as a question” instead of a particular set of conclusions is going to ultimately serve our goals better (this might be our biggest disagreement?).
I’d be very interested to hear you unpack why you think the opposite of “easier to get big with the ‘evidence and reasoning’ framing”. This seems to be a pretty important crux.
Ah, I think I was actually a bit confused about what the core proposition was, because of the different dimensions.
Here’s what I think of your claims:
a) 100% agree, this is a very important consideration.
b) Agree that this is important. I think it’s also very important to make sure that our shop fronts are accurate, and that we don’t meaningfully distort the real work that we’re doing (I expect you agree with this?).
c) I agree with this! Or at least, that’s what I’m focused on and want more of. (And I’m also excited about people doing more cause-specific or action-specific community building to complement that/reach different audiences.)
So maybe I agree with your core thesis!
How easy is it to get big with evidence and reasoning?
I want to distinguish a few different worlds:
1. We just do cause-specific community building, or action-specific community building.
2. We do community building focused on “EA as a question” with several different causes. Our epistemics are decent but not amazing.
3. We do community building focused on “EA as a question” with several different causes. We aim for the epistemics of core members to be world class (probably better than the average on this Forum; around the level I see at some core EA organizations).
I’m most excited about option 3. I think that the thing we’re trying to do is really hard and it would be easy for us to cause harm if we don’t think carefully enough.
And I think that we’re currently just about at the level I’d like to see for option 3. As we grow, I naturally expect regression to the mean, because we’re adding new people who have had less exposure to this type of thinking and may be less inclined to it, and also because I think that groups tend to reason less well as they get older and bigger. So I think you want to be really careful about growth, and you can’t grow that quickly with this approach.
I wonder if you mean something a bit more like option 2? I’m not excited about that, but I agree that we could grow it much more quickly.
I’m personally not doing option 1, but I’m excited about others trying it. I think that, at least for some causes, if you’re doing option 1 you can drop the epistemics/deep-understanding requirements and just have a lot of people coordinate around actions. E.g. I think that you could build a community of people who are earning to give, and deferring to GiveWell, Open Philanthropy, and GWWC about where they give. I think that this could grow at >200%/year. (This is the thing that I’m most excited about GWWC being.) Similarly, I think you could build a movement focused on ending global poverty based on evidence and reasoning that grows pretty quickly, e.g. around lobbying governments to spend more on aid and to spend aid money more effectively. (I think this approach basically doesn’t work for pre-paradigmatic fields like AI safety, wild animal welfare, etc., though.)
Had a bit of time to digest overnight and wanted to clarify this a bit further.
I’m very supportive of #3, including “epistemics of core members to be world class”. But I fear that trying to achieve #3 too narrowly (in demographics, worldviews, engagement levels, etc.) might ultimately undermine our goals: putting more people off, leaving the core group with less support, narrowing worldviews in a way that hurts our epistemics, and failing to create enough allies to get done the things we want to do.
I think that nurturing the experience at each level of engagement, from outsider to audience through to contributor and core, while remaining a “big tent” (diverse in worldviews and actions), will ultimately serve us better than focusing too much on just developing a world-class core. (I think remaining a “big tent” is a necessary precondition: the world-class core won’t exist without a diversity of ideas and approaches, and without the support network this core needs to succeed.)
Happy to chat more about this.
Thanks for clarifying! Not much to add right now, other than to say that I appreciate you going into detail about this.