Thanks for writing this up Luke! I think you’re pointing to some important issues. I also think you and the GWWC team are doing excellent work—I’m really excited to see more people introduced to effective giving!
[Edit to add: Despite my comment below, I still am taking in the datapoints and perspectives that Luke is sharing, and I agree with many of his recommendations. I don’t want to go into all of the sub-debates below because I’m focused on other priorities right now (including working on some of the issues Luke raises!).]
However, I worry that you’re conflating a few pretty different dimensions, so I downvoted this post.
Here are some things that I think you’re pointing to:
“Particular set of conclusions” vs. “commitment to using evidence and reasoning”
Size of the community, which we could in turn split into
Rate of growth of the community
Eventual size of the community
How welcoming we should be/how diverse
[I think you could split this up further.]
In what circumstances, and to what degree, there should be encouragement/pressure to take certain actions, versus just presenting people with options.
How much we should focus on clearly communicating EA to people who aren’t yet heavily involved.
This matters because you’re sometimes then conflating these dimensions in ways that seem wrong to me (e.g. you say that it’s easier to get big with the “evidence and reasoning” framing, but I think the opposite).
I also interpreted this comment as quite dismissive, but I think most of that comes from the fact that Max explicitly said he downvoted the post, rather than from the rest of the comment (which seems fine and reasonable).
I think I naturally interpret a downvote as meaning "I think this post/comment isn't helpful and I generally want to discourage posts/comments like it." That seems pretty harsh in this case, and at odds with the fact that Max seems to think the post actually points at some important things worth taking seriously. I also naturally feel a bit concerned about the CEO of CEA seeming to discourage posts which suggest EA should be doing things differently, especially where they are reasonable and constructive like this one.
This is a minor point in some ways, but I think explicitly stating "I downvoted this post" can say quite a lot (especially when coming from someone with a senior position in the community). I haven't spent a lot of time on this forum recently, so I'm wondering if other people think the norms around up/downvoting are different to my interpretation, and in particular whether, Max, you meant to use it differently?
[EDIT: I checked the norms on up/downvoting, which say to downvote if either “There’s an error”, or “The comment or post didn’t add to the conversation, and maybe actually distracted.” I personally think this post added something useful to the conversation about the scope and focus of EA, and it seems harsh to downvote it because it conflated a few different dimensions—and that’s why Max’s comment seemed a bit harsh/dismissive to me]
This is a minor point in some ways but I think explicitly stating “I downvoted this post” can say quite a lot (especially when coming from someone with a senior position in the community).
I ran the Forum for 3+ years (and, caveat, worked with Max). This is a complicated question.
Something I’ve seen many times: A post or comment is downvoted, and the author writes a comment asking why people downvoted (often seeming pretty confused/dispirited).
Some people really hate anonymous downvotes. I’ve heard multiple suggestions that we remove anonymity from votes, or require people to input a reason before downvoting (which is then presumably sent to the author), or just establish an informal culture where downvotes are expected to come with comments.
So I don’t think Max was necessarily being impolite here, especially since he and Luke are colleagues who know each other well. Instead, he was doing something that some people want a lot more of and other people don’t want at all. This seems like a matter of competing access needs (different people wanting different things from a shared resource).
In the end, I think it’s down to individual users to take their best guess at whether saying “I downvoted” or “I upvoted” would be helpful in a given case. And I’m still not sure whether having more such comments would be a net positive — probably depends on circumstance.
***
Max having a senior position in the community is also a complicated thing. On the one hand, there’s a risk that anything he says will be taken very seriously and lead to reactions he wouldn’t want. On the other hand, it seems good for leaders to share their honest opinions on public platforms (rather than doing everything via DM or deliberately softening their views).
There are still ways to write better or worse comments, but I thought Max’s was reasonable given the balancing act he’s trying to do (and the massive support Luke’s post had gotten already — I’d feel differently if Max had been joining a pile-on or something).
I think the problem isn’t with saying you downvoted a post and why (I personally share the view that people should aim to explain their downvotes).
The problem is the actual reason:
I think you’re pointing to some important issues… However, I worry that you’re conflating a few pretty different dimensions, so I downvoted this post.
The message that, for me, stands out from this is “If you have an important idea but can’t present it perfectly—it’s better not to write at all.” Which I think most of us would not endorse.
I didn’t get that message at all. If someone tells me they downvoted something I wrote, my default takeaway is “oh, I could have been more clear” or “huh, maybe I need to add something that was missing” — not “yikes, I shouldn’t have written this”. *
I read Max’s comment as “I thought this wasn’t written very clearly/got some things wrong”, not “I think you shouldn’t have written this at all”. The latter is, to me, almost the definition of a strong downvote.
If someone sees a post they think (a) points to important issues, and (b) gets important things wrong, any of upvote/downvote/decline-to-vote seems reasonable to me.
*This is partly because I’ve stopped feeling very nervous about Forum posts after years of experience. I know plenty of people who do have the “yikes” reaction. But that’s where the users’ identities and relationship comes into play — I’d feel somewhat differently had Max said the same thing to a new poster.
I don't share your view about what a downvote means. However, regardless of what I think, it doesn't actually have any fixed meaning beyond what people assign to it—so it'd be interesting to have some stats on how people on the forum interpret it.
But that’s where the users’ identities and relationship comes into play — I’d feel somewhat differently had Max said the same thing to a new poster.
Most(?) readers won’t know who either of them is, not to mention their relationship.
I don’t share your view about what a downvote means.
What does a downvote mean to you? If it means “you shouldn’t have written this”, what does a strong downvote mean to you? The same thing, but with more emphasis?
It’d be interesting to have some stats on how people on the forum interpret it.
Why not create a poll? I would, but I’m not sure exactly which question you’d want asked.
Most(?) readers won’t know who either of them is, not to mention their relationship.
Which brings up another question — to what extent should a comment be written for an author vs. the audience?
Max’s comment seemed very directed at Luke — it was mostly about the style of Luke’s writing and his way of drawing conclusions. Other comments feel more audience-directed.
Personally, I primarily downvote posts/comments where I generally think “reading this post/comment will on average make forum readers be worse at thinking about this problem than if they didn’t read this post/comment, assuming that the time spent reading this post/comment is free.”
I basically never strong downvote posts unless it’s obvious spam or otherwise an extremely bad offender in the “worsens thinking” direction.
It’s been over a week so I guess I should answer even if I don’t have time for a longer reply.
What does a downvote mean to you? If it means “you shouldn’t have written this”, what does a strong downvote mean to you? The same thing, but with more emphasis?
I think so, but I’m not very confident.
to what extent should a comment be written for an author vs. the audience?
I don’t think private conversations can exist on a public platform. If it’s not a DM, there’s always an audience, and in most contexts, I’d expect much of a comment’s impact to come from its effects on that audience.
Why not create a poll?
The polls in that specific group look like they have a very small and probably unrepresentative sample size. Though I don't think we'll be able to get a much larger one on such a question, I guess.
Nice to see you on the Forum again!
Thanks for sharing that perspective—that makes sense. Possibly I was holding this to too high a standard—I think that I held it to a higher standard partly because Luke is also an organization/community leader, and probably I shouldn’t have taken that into account. Still, overall my best guess is that this post distracted from the conversation, rather than adding to it (though others clearly disagree). Roughly, I think that the data points/perspectives were important but not particularly novel, and that the conflation of different questions could lead to people coming away more confused, or to making inaccurate inferences. But I agree that this is a pretty high standard, and maybe I should just comment in circumstances like this.
I also think I should have been more careful re seeming to discourage suggestions about EA. I wanted to signal “this particular set of suggestions seems muddled” not “suggestions are bad”, but I definitely see how my post above could make people feel more hesitant to share suggestions, and that seems like a mistake on my part. To be clear: I would love feedback and suggestions!
Thanks Max. I agree that there is a lot of ground covered here that isn't broken up into different dimensions, and that it could have been better if broken up as such. I disagree that this entirely undermines the core proposition that: (a) whether we like it or not, we are getting more attention; (b) it's particularly important to think carefully about our "shop fronts" with that increased attention; and therefore (c) staying true to "EA as a question" instead of a particular set of conclusions is going to ultimately serve our goals better (this might be our biggest disagreement?).
I'd be very interested to hear you unpack why you think the opposite of "easier to get big with the 'evidence and reasoning' framing". This seems to be a pretty important crux.
Ah, I think I was actually a bit confused what the core proposition was, because of the different dimensions.
Here’s what I think of your claims:
a) 100% agree, this is a very important consideration.
b) Agree that this is important. I think it’s also very important to make sure that our shop fronts are accurate, and that we don’t importantly distort the real work that we’re doing (I expect you agree with this?).
c) I agree with this! Or at least, that’s what I’m focused on and want more of. (And I’m also excited about people doing more cause-specific or community building to complement that/reach different audiences.)
So maybe I agree with your core thesis!
How easy is it to get big with evidence and reasoning?
I want to distinguish a few different worlds:
1. We just do cause-specific community building, or action-specific community building.
2. We do community building focused on "EA as a question" with several different causes. Our epistemics are decent but not amazing.
3. We do community building focused on "EA as a question" with several different causes. We are aiming for the epistemics of core members to be world class (probably better than the average on this Forum, around the level that I see at some core EA organizations).
I’m most excited about option 3. I think that the thing we’re trying to do is really hard and it would be easy for us to cause harm if we don’t think carefully enough.
And then I think that we’re kind of just about at the level I’d like to see for 3. As we grow, I naturally expect regression to the mean, because we’re adding new people who have had less exposure to this type of thinking and may be less inclined to it. And also because I think that groups tend to reason less well as they get older and bigger. So I think that you want to be really careful about growth, and you can’t grow that quickly with this approach.
I wonder if you mean something a bit more like 2? I’m not excited about that, but I agree that we could grow it much more quickly.
I'm personally not doing 1, but I'm excited about others trying it. I think that, at least for some causes, if you're doing 1 you can drop the epistemics/deep understanding requirements, and just have a lot of people coordinate around actions. E.g. I think that you could build a community of people who are earning to give for charities, and deferring to GiveWell, Open Philanthropy, and GWWC about where they give. I think that this thing could grow at >200%/year. (This is the thing that I'm most excited about GWWC being.) Similarly, I think you could make a movement focused on ending global poverty based on evidence and reasoning that grows pretty quickly—e.g. around lobbying governments to spend more on aid, and spend aid money more effectively. (I think that this approach basically doesn't work for pre-paradigmatic fields like AI safety, wild animal welfare, etc. though.)
Had a bit of time to digest overnight and wanted to clarify this a bit further.
I'm very supportive of #3, including "epistemics of core members to be world class". But I fear that trying to achieve #3 too narrowly (demographics, worldviews, engagement levels, etc.) might ultimately undermine our goals (putting more people off, leaving the core group without as much support, narrowing worldviews in ways that hurt our epistemics, and failing to create enough allies to get the things we want done).
I think that nurturing the experience at each level of engagement, from outsider to audience through to contributor and core, while remaining a "big tent" (diverse in worldviews and actions) will ultimately serve us better than focusing too much on just developing a world-class core. (I think remaining a "big tent" is a necessary precondition, because the world-class core won't exist without the diversity of ideas/approaches and the support network needed for this core to succeed.)
Happy to chat more about this.
Thanks for clarifying! Not much to add right at this moment other than to say that I appreciate you going into detail about this.
Hello Max,
In turn, I strongly downvoted your post.
Luke raised, you say, some "important issues". However, you didn't engage with the substance of those issues. Instead, you complained that he hadn't adequately separated them even though, for my money, they are substantially related. I wouldn't have minded that if you'd then gone on to offer your thoughts on how EA should operate on each of the dimensions you listed, but you did not.
Given this, your comment struck me as unacceptably dismissive, particularly given you are the CEO of CEA. The message it conveys is something like “I will only listen to your concerns if you present them exactly in the format I want” which, again for my money, is not a good message to send.
I’m sorry that it came off as dismissive. I’ll edit to make clearer that I appreciate and value the datapoints and perspectives. I am keen to get feedback and suggestions in any form. I take the datapoints and perspectives that Luke shared seriously, and I’ve discussed lots of these things with him before. Sounds like you might want to share your perspective too? I’ll send you a DM.
I viewed the splitting out of different threads as a substantive contribution to the debate, but I’m sorry you didn’t see it that way. :) I agree that it would have been better if I’d given my take on all of the dimensions, but I didn’t really want to get into all of those threads right now.
Would you have this same reaction if you saw Luke and Max or GWWC/CEA as equals and peers?
Maybe so! It seems like you saw this as the head of CEA talking down to the OP. Max and Luke seem to know each other though; I read Max’s comment as a quick flag between equals that there’s a disagreement here, but writing it on the forum instead of an email means the rest of us get to participate a bit more in the conversation too.
FWIW, I do think that I reacted to this a bit differently because it’s Luke (who I’ve worked with, and who I view as a peer). I think I would have been more positive/had lower standards for a random community member.