“Pronatalists” may look to co-opt effective altruism or longtermism

Link post

Archived version: https://archive.ph/2wG0k#selection-1743.0-1743.242

Edit: this is getting lots of downvotes. I don’t mind this, because the article isn’t great, and probably isn’t a good use of most people’s time. I just thought it was a relevant article to flag, given it explicitly mentions the possibility of an opportunistic power grab in the EA space.


The most relevant quote:

“This means our faction (more conservative, pronatalist, long-termist-civilization-building-focused, likely to self fund) is now 100X more likely to become a real, dominant faction in the EA space,” Simone wrote in a text message on November 12.

Today, as I was doomscrolling through my favorite good-faith EA criticism from Timnit Gebru, I saw this article shared. Of course, I took the bait. It’s a long article, and EA/longtermism are only mentioned a few times, but I have pasted a few relevant quotes below.

“I do not think humanity is in a great situation right now. And I think if somebody doesn’t fix the problem, we could be gone,” Malcolm half-shouted as he pushed his sniffling 18-month-old, Torsten, back and forth in a child-size Tonka truck.

Along with his 3-year-old brother, Octavian, and his newborn sister, Titan Invictus, Torsten has unwittingly joined an audacious experiment. According to his parents’ calculations, as long as each of their descendants can commit to having at least eight children for just 11 generations, the Collins bloodline will eventually outnumber the current human population.

If they succeed, Malcolm continued, “we could set the future of our species.”

Malcolm, 36, and his wife, Simone, 35, are “pronatalists,” part of a quiet but growing movement taking hold in wealthy tech and venture-capitalist circles. People like the Collinses fear that falling birth rates in certain developed countries like the United States and most of Europe will lead to the extinction of cultures, the breakdown of economies, and, ultimately, the collapse of civilization. It’s a theory that Elon Musk has championed on his Twitter feed, that Ross Douthat has defended in The New York Times’ opinion pages, and that Joe Rogan and the billionaire venture capitalist Marc Andreessen bantered about on “The Joe Rogan Experience.” It’s also, alarmingly, been used by some to justify white supremacy around the world, from the tiki-torch-carrying marchers in Charlottesville, Virginia, chanting “You will not replace us” to the mosque shooter in Christchurch, New Zealand, who opened his 2019 manifesto: “It’s the birthrates. It’s the birthrates. It’s the birthrates.”

While pronatalism is often associated with religious extremism, the version now trending in this community has more in common with dystopian sci-fi. The Collinses, who identify as secular Calvinists, are particularly drawn to the tenet of predestination, which suggests that certain people are chosen to be superior on earth and that free will is an illusion. They believe pronatalism is a natural extension of the philosophical movements sweeping tech hubs like the Silicon Hills of Austin, Texas. Our conversations frequently return to transhumanism (efforts to merge human and machine capabilities to create superior beings), longtermism (a philosophy that argues the true cost of human extinction wouldn’t be the death of billions today but the preemptive loss of trillions, or more, unborn future people), and effective altruism (or EA, a philanthropic system currently focused on preventing artificial intelligence from wiping out the human population).

In February, the PayPal cofounder Luke Nosek, a close Musk ally, hosted a gathering at his home on Austin’s Lake Travis to discuss “The End of Western Civilization,” another common catchphrase in the birth-rate discourse.

These worries tend to focus on one class of people in particular, which pronatalists use various euphemisms to express. In August, Elon’s father, Errol Musk, told me that he was worried about low birth rates in what he called “productive nations.” The Collinses call it “cosmopolitan society.” Elon Musk himself has tweeted about the movie “Idiocracy,” in which the intelligent elite stop procreating, allowing the unintelligent to populate the earth.

Musk was echoing an argument made by Nick Bostrom, one of the founding fathers of longtermism, who wrote that he worried declining fertility among “intellectually talented individuals” could lead to the demise of “advanced civilized society.” Émile P. Torres, a former longtermist philosopher who has become one of the movement’s most outspoken critics, put it more bluntly: “The longtermist view itself implies that really, people in rich countries matter more.”

A source who worked closely with Musk for several years described this thinking as core to the billionaire’s pronatalist ideology. “He’s very serious about the idea that your wealth is directly linked to your IQ,” he said. The source, who spoke on the condition of anonymity for this article, also said Musk urged “all the rich men he knew” to have as many children as possible.

Musk’s ties to the EA and longtermist communities have been gradually revealed in recent months. In September, text logs released as part of Musk’s legal battle with Twitter showed conversations between Musk and the prominent longtermist William MacAskill, who works at Oxford’s Future of Humanity Institute, where Musk is a major donor. In the messages, MacAskill offered to introduce Musk to Sam Bankman-Fried, a now-disgraced cryptocurrency entrepreneur who had donated millions of dollars to longtermist organizations.

MacAskill has never explicitly endorsed pronatalism, and he declined to be interviewed for this article. He did, however, devote a chapter of his best-selling book, “What We Owe the Future,” to his fear that dwindling birth rates would lead to “technological stagnation,” which would increase the likelihood of extinction or civilizational collapse. One solution he offered was cloning or genetically optimizing a small subset of the population to have “Einstein-level research abilities” to “compensate for having fewer people overall.”

Malcolm said he was glad to see Musk bring these issues to the forefront. “He’s not as afraid of being canceled as everyone else,” Malcolm told me. “Any smart person with a certain cultural aesthetics of their life is looking at this world and saying, ‘How do we create intergenerationally, durable cultures that will lead to our species being a diverse, thriving, innovative interplanetary empire one day that isn’t at risk from, you know, a single asteroid strike or a single huge disease?’”

The Collinses worry that the overlap between the types of people deciding not to have children and the part of the population that values things like gay rights, education for women, and climate activism — traits they believe are genetically coded — is so great that these values could ultimately disappear.

[Simone] also weighed in on the stunning implosion of Sam Bankman-Fried’s crypto exchange FTX, which represented one of the largest financial hubs for the effective-altruism movement. The Collinses, who never directly associated with the top Democratic donor Bankman-Fried, spied an opportunity in his demise.

“This means our faction (more conservative, pronatalist, long-termist-civilization-building-focused, likely to self fund) is now 100X more likely to become a real, dominant faction in the EA space,” Simone wrote in a text message on November 12.

I personally want nothing to do with a faction of people focused on genetic improvement and low birth rates in “Western Civilization”, and I can see how longtermist rhetoric could easily be co-opted for this, and how EA might be implicated as a result. If this is a group of well-funded people who see the recent FTX events as an opportunity, that is another way EA and longtermist priorities could end up being shifted.

Can people who are more involved in, or have insight into, the Austin and Bay Area EA communities give us a better sense of what this is about? Can people with a better understanding of longtermism clarify the extent of the overlap here? And can Will, or EA and longtermist organizations more broadly, clearly disassociate themselves from this “faction”, if its goal is to opportunistically use EA to further its own ends, or if there is significant disagreement in values and ideology?