I also think, at least in the past, the attitude towards climate work has been vaguely dismissive.
As somewhat of an outsider, this has always been my impression. For example, I expect that if I choose to work in climate, some EAs will infer that I have inferior critical thinking ability.
There’s something about the “gateway to EA” argument that is a bit off-putting. It sounds like “those folks don’t yet understand that only x-risks are important, but eventually we can show them the error of their ways.” I understand that this viewpoint makes sense if you are convinced that your own views are correct, but it strikes me as a bit patronizing. I’m not trying to pick on you in particular, but I see this viewpoint advanced fairly frequently so I wanted to comment on it.
Thanks for sharing that. It’s good to know that that’s how the message comes across. I agree we should avoid that kind of bait-and-switch, which engages people under false pretences. Sam discusses this in a different context in the top comment on this post, so it’s an ongoing concern.
I’ll just speak on my own experience. I was focused on climate change throughout my undergrad and early career because I wanted to work on a really important problem, and it seemed obvious that this meant I should work on climate change. Learning about EA was eye-opening because I realized (1) there are other important problems on the same scale as climate change, (2) there are frameworks to help me think about how to prioritize work among them, and (3) it might be even more useful for me to work on some of these other problems.
I personally don’t see climate change as some separate thing that people engage with before they switch to “EA stuff.” Climate change is EA stuff. It’s a major global problem that concerns future generations and threatens civilization. However, it is unique among plausible x-risks in that it’s also a widely-known problem that gets lots of attention from funders, voters, politicians, activists, and smart people who want to do altruistic work. Climate change might be the only thing that’s both an x-risk and a Popular Social Cause.
It would be nice for our climate change message to do at least two things. First, help people like me, who are searching for the best thing to do with their life and have landed on climate because it’s a Popular Social Cause, discover the range of other important things to work on. Second, help people like you, who, I assume, care about future generations and want to help solve climate change, work in the most effective way possible. I think we can do both in the future, even if we haven’t in the past.
Yeah, I think many groups struggle with the exact boundary between “marketing” and “deception”. Though EAs are in general very truthful, different EAs will still differ both in where they put that boundary and their actual evaluation of climate change, so their final evaluation of the morality of devoting more attention to climate change for marketing purposes will differ quite a lot.
I was arguing elsewhere in this post for a stricter “say what you believe” policy, but out of curiosity, would you still have that reaction (to the gateway/PR argument) if the EA in question thought that climate change was, like, pretty good, not the top cause, but decent? To me that seems a lot more ethical and a lot less patronising.
Thanks for the question; it caused me to reflect. I think it is bad to intentionally misrepresent your views in order to appeal to a broader audience, with the express intention of changing their views once you have them listening to you and/or involved in your group. I don’t think this tactic necessarily becomes less bad based on the degree of misrepresentation involved. I would call this deceptive recruiting. It’s manipulative and violates trust. To be clear, I am not accusing anyone of actually doing this, but the idea seems to come up often when “outsiders” (for lack of a better term) are discussed.