Feel free to DM me anything you want to share but don’t want to (or can’t) under your own account(s), and I can share it on your behalf if I think it adds value to the discourse/community.
The corollary is that views shared on this account don’t necessarily reflect my own personal views, though they will usually be worded to sound like they are.
Sorry I missed this comment — I just got a notification on this post and only now saw it.
I am specifically talking about the Collins’ brand of pronatalism here as reported, as well as the possibility of a faction that is opportunistically seeking to co-opt the goals of the EA movement — rather than pronatalism as broad as you describe (“more babies is good, pushing society in that direction is good”).
In the link (as well as in the comments above), there is discussion of some of these views. Are you happy to defend these views as things that EAs should spend more time discussing and funding on the margin?
To be clear, I’m not likely to engage on the object level even if you are happy to defend these points. I’m just not sure it’s useful or relevant for me to spell out all the versions and parts of pronatalism I do support in order to make a post like this. I’m not even claiming that any part of pronatalism beyond what is reported is bad!
I’m just indicating that if there’s a faction of people in the EA community focused on genetic improvement and low birth rates in “Western Civilization”, I can see how longtermist rhetoric could easily be co-opted for this, and how this might implicate EA as a result. I stand by that, and I also believe it should be seen as a clear risk to the EA community’s ability to fulfill its potential for impact! And if you’re just a “more babies is good” pronatalist and don’t think these views represent your movement, then this concern applies to you too (perhaps even more so).
If you do think individual EAs, funders, or EA orgs should be spending more time thinking about ways to ensure Western Civilizational values remain competitive in the gene pool on the margin, that’s entirely your prerogative! In that case, consider this post, as well as comments like these, an indication of the kinds of tradeoffs your movement should take into account when asking people to engage with arguments like this. (I’m reminded of similar conversations and disagreements around Bostrom’s letter and “truth-seeking” here.)
Some things worth considering:
- What kinds of harms could these discussions have? For example, should we dedicate more EA Forum discussions to Bostrom’s use of a racial slur and whether that’s appropriate in the pursuit of truth-seeking?
- How action-guiding are these discussions? Should we spend more time on racial differences in IQ and their implications for EA community building?
- Are discussions of these topics actually a good indicator of people who deeply value truth-seeking, or of people who are edgy and want to outwardly signal this in a community where doing so is rewarded? Is this a strong signal, or a noisy one? Not everyone values outward signals of “truth-seeking” above all, especially if those signals can also be cover for harmful ideas. Being open-minded doesn’t mean giving the same space to every idea.
- Which groups of people are harmed, and which groups might (wrongly) never join EA, because of an unnecessary focus on these discussions?
I think if you believe talking about this broadly is likely to be action-guiding in ways that will benefit more than harm in expectation, then it’s worth talking about. Unfortunately, on priors, cause areas that sound like ensuring the survival of Western Civilization combined with strong genetic determinism do not have a strong track record here, and I’m happy to dismiss them by default unless there’s good reason to believe that’s misguided (whereas talking about feeling anxious about a sharp increase in EA funding does not have the same issue).