Thanks for saying what you think and raising concerns that matter to you. I have a concern that this kind of post pushes on norms about what is and isn’t acceptable in EA in ways that are symmetric rather than asymmetric with respect to the truth (it would push the same way whether or not the underlying claims are true), and so doesn’t support truth-seeking.
I wish this post were more about disagreements with claims, or arguments about how longtermism does not support pronatalism (or perhaps this kind of pronatalism?), or digging into concerns that the community’s including pronatalist elements (or these kinds of elements) would be bad even if it made sense under longtermist priorities (or regardless of whether it did).
I appreciate people raising concerns and saying what they think, but I also want to push us to engage with many parts of strange or uncomfortable or bad-seeming (or indeed bad!) worldviews when we evaluate them. (I think Nathan does a good job here of saying what evidence he does and doesn’t see for concern around money in EA, and his feelings about it, regardless of whether they fit with the evidence. I think this post could benefit from having that kind of orientation.)
In particular, I don’t want it to be intra-EA politically unacceptable to explore questions raised by pronatalists or to find their arguments compelling; that seems like a really bad outcome to me.
The main reason for the post is not to start a discussion on whether or not the Collins’ brand of pronatalism is appropriate or a logical conclusion of longtermism. I already have a fairly settled view on this, and if it’s the case that we sit here and discuss the merits of this type of pronatalism and suggest that it is a natural conclusion of longtermism, I’m simply going to reject longtermism, or at least many attempts to implement it in practice.
The main reason for the post is to serve as a PSA: to bring attention to a faction that may be, at least opportunistically, looking to gain influence within the EA space for their own goals rather than for the truth-seeking goals that you deem important, and to let others decide whether they think this is what they want for EA, or whether this is the type of EA they want to be associated with. I’ll note that as a result of this post, someone kindly pointed me (and other readers) to that post’s existence. It has since been heavily downvoted, and a comment engaging with the object-level points was left. (This has additional benefits for those of us who don’t want anything to do with that brand of pronatalism, like having a place to point to next time some journalist associates these viewpoints with EA.)
If a flat-earther faction (especially one now successfully funded by the SFF) expressed a desire to become a dominant faction in the EA space to further their aims, I would make a similar post about this, and I don’t think I should be expected to engage with debunking flat-earth viewpoints before making this post. It sounds like you disagree?
I really appreciate your straightforwardness and honesty here. It would be very easy for you to pay lip service to Chana’s goals, but you said what you believe, and I respect that. … However, I very much disagree with your conclusion. Most issues are not like flat earth. Most of the time you will have a much better time debating the ideas you disagree with than writing PSAs about them.
This link explains some of my thinking in the area. Some of the ideas are applicable, but please don’t take the obvious analogy too directly. (Also: apologies for the length of the piece. In an ideal world I would write more of my own thoughts.)
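Seconding JP’s point that I appreciate you being clear about your goals. Not sure what organization on that list is a flat earth one?
Thanks!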
Pronatalist.org is Collins’ organization that received funding from the SFF. I can see how that was unclear, apologies.
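Interested in your view here!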
“…expressed a desire to become a dominant faction in the EA space to further their aims, I would make a similar post about this, and I don’t think I should be expected to engage with debunking flat-earth viewpoints before making this post. It sounds like you disagree?”
I think the crux here is that everyone involved thinks it’s obvious flat-earthers are wrong, and we’re all working from that shared (implicit) assumption.
I think that’s not the case here (I don’t even know what pro-natalism specifically claims or involves other than “more babies is good, pushing society in that direction is good”, and I assume there’s a huge range of thinking within that, some more libertarian, some more statist, some more racist, some more futurist, etc.), and so I don’t want to base a discussion on that as a shared assumption, and don’t want to port in that assumption without addressing it.
Maybe you think it is equally obvious and that we should all think so (and that if EA started debating flat earth you’d be, correctly, very concerned; some things are just obvious), but I’ve never figured out how to have a good conversation around “this is so obvious we shouldn’t have a conversation about it”, even though I think that’s sometimes a reasonable point to make.
On my read, this post is not about whether having children (literal ‘pro’ ‘natalism’) is correct or not. I think having a debate about that is great, and I’m inclined towards the ‘yes’ side.
It’s about pointing to signs suggesting the existence of a power-seeking faction within EA that (by their own admission) is attempting to co-opt the movement for their own aims.
(Why the hedging in the previous paragraph: stating that your faction is “now 100X more likely to become a real, dominant faction” is not quite stating your intention to make it dominate; it just suggests it.)
I think co-opting a broad altruism movement for particular people’s pet causes (or selfish aims) is bad. There is a very real possibility that EA will be co-opted by power-seekers and become another club for career-conscious people who want to get ahead. I support good attempts to prevent that.
This pointing is asymmetric with respect to the question of whether the purported ‘faction’ in question is in fact a faction, and whether it is in fact trying to co-opt the movement.
Sorry I missed this comment; I just got a notification on this post and realized.
“I don’t even know what pro-natalism specifically claims or involves other than ‘more babies is good, pushing society in that direction is good’, and I assume there’s a huge range of thinking within that, some more libertarian, some more statist, some more racist, some more futurist, etc., and so I don’t want to base a discussion on that as a shared assumption, and don’t want to port in that assumption without addressing it.”
I am specifically talking about the Collins’ brand of pronatalism here, as reported, as well as the possibility of a faction that is opportunistically seeking to co-opt the goals of the EA movement, rather than pronatalism as broad as you describe: “more babies is good, pushing society in that direction is good”.
In the link (as well as in the comments above), there is discussion of some of these views. Are you happy to defend these views as things that EAs should spend more time discussing and funding on the margin?
“fear that falling birth rates in certain developed countries like the United States and most of Europe will lead to the extinction of cultures, the breakdown of economies, and, ultimately, the collapse of civilization.”
“worry that the overlap between the types of people deciding not to have children with the part of the population that values things like gay rights, education for women, and climate activism — traits they believe are genetically coded — is so great that these values could ultimately disappear.”
“What is really happening is that individuals from those families with sociological profiles amenable to movements like effective altruism, progressivism, or broad Western Civilisational values are being selected out of the gene pool.”
“Current birth rate trends suggest traits on which the EA community relies, such as prosociality, are being differentially selected out of populations.”
Do you think focusing on birth rates in “Western Civilization” is a good way of creating “intergenerationally, durable cultures that will lead to our species being a diverse, thriving, innovative interplanetary empire one day that isn’t at risk from, you know, a single asteroid strike or a single huge disease?”
To be clear, I’m not likely to engage on the object level even if you are happy to defend these points; I’m just not sure it’s useful or relevant for me to spell out all the versions and parts of pronatalism I do support in order to make a post like this. I’m not even claiming that any pronatalism beyond what is reported is bad!
I’m just indicating that if there’s a faction of people in the EA community focused on genetic improvement and low birth rates in “Western Civilization”, I can see how longtermist rhetoric can easily be co-opted for this, and how this might implicate EA as a result. I stand by that, and I also believe it should be seen as a clear potential risk to the EA community’s ability to fulfill its potential for impact! And if you’re just a “more babies is good” pronatalist and don’t think these views represent your movement, then this concern applies to you too (or perhaps even more so).
If you do think individual EAs, funders, or EA orgs should be spending more time or funding, on the margin, on ways to ensure Western Civilizational values remain competitive in the gene pool, that’s entirely your prerogative! In that case, consider this post, as well as comments like these, an indication of the kinds of tradeoffs your movement should take into account when asking people to engage with arguments like this. (I’m reminded of similar conversations and disagreements around Bostrom’s letter and “truth-seeking” here.)
“I’ve never figured out how to have a good conversation around ‘this is so obvious we shouldn’t have a conversation about it’ even though I think that’s sometimes a reasonable point to make.”
Some things worth considering:
What kinds of harms could these kinds of discussions have? For example, should we dedicate more EA Forum discussions to Bostrom’s use of a racial slur and whether or not that’s appropriate in the pursuit of truth-seeking? How action-guiding are these discussions? Should we spend more time on racial differences in IQ and the implications of this for EA community building? Are discussions of these topics actually a good indicator of people who deeply value truth-seeking, or just of people who are edgy and want to outwardly signal this in a community where doing so is rewarded? Is this a strong signal, or is it noisy? Not everyone values outward signals of “truth-seeking” above all, especially if those signals can also be a cover-up for harmful ideas. Being open-minded doesn’t mean you give the same space to every idea. Which groups of people are harmed, and which groups might (wrongly) never join EA because of an unnecessary focus on these discussions?
I think that if you believe talking about a broad topic is likely to be action-guiding in ways that will benefit more than harm in expectation, then it’s worth talking about. Unfortunately, on priors, cause areas that sound like ensuring the survival of “Western Civilization”, combined with strong genetic determinism, do not have a strong track record here, and I’m happy to dismiss them by default unless there’s good reason to believe this is misguided (whereas talking about feeling anxious about a sharp increase in EA funding does not have the same issue).
I’ve stumbled here after getting more interested in the object-level debate around pronatalism. I am glad you posted this because, in the abstract, I think it’s worthwhile to point out where someone may not be engaging in good faith within our community.
Having said that, I wish you had framed the Collins’ actions with a little more good faith yourself. I do not consider that one quoted tweet to be evidence of an “opportunistic power grab”. I think it’s probably a bit unhealthy to see our movement in terms of competing factions, and to seek wins for one’s own faction through strategic means rather than through open debate.
But I’m not sure Malcolm Collins is quite there, on the evidence you’ve presented. It seems like he’s happy that (according to him) his own favored cause area will get more attention (in the months since this was posted, I don’t think his prediction has proven correct). I don’t think that’s the same as actively seeking a power grab; it might just be a slightly cynical, though realistic, view that even in a community that tries to promote healthy epistemics, sociological forces are going to have an influence on what we do.