Hi Sean,
I would be glad to connect with you to discuss EA and religion (especially Christianity!) Would you have time for a short call about this? https://calendly.com/jdbauman
Thank you for the post!
Are you aware of Catholic institutes/seminars that discuss themes of AXR? Have you considered starting one?
I could connect you to other Catholics in the EA community who are interested. There’s a vibrant community of Christians engaging with these themes.
I’m very excited to see this. EA for Christians is working with local EAs in Kenya and Nigeria to do grassroots outreach. We would love to see more EA community building in an African context, including outreach that builds networks to impactful careers that don’t require difficult-to-obtain western visas.
Thanks, we did!
#1
Popular myth says “teach a man to fish, feed him for a lifetime”.
But 200+ studies say “give a poor man cash, lift him out of poverty”.
Let’s stick to the research.
#2
Teach a man to fish, feed him for a year (maybe).
Give a poor man cash, lift him out of poverty.
“Give a man a fish, feed him for a day. Teach a man to fish, feed him for a lifetime.”
Forget fish. Just send cash.
This looks excellent! I’m keen to support this in any way I can.
Excellent! Looking forward to seeing you there.
Related to this thread, Rory Stewart is speaking on “Can EA convince governments to make international aid effective” at the EA for Christians 2023 annual conference.
I appreciate efforts to get Christians on board about AI risks, but respectfully, Antichrist memes aren’t generally taken very seriously. A fundamental issue seems to be that most people (Christians included) don’t take superhuman AI as a credible threat. How then could it be a candidate for the Antichrist?
Thanks for writing this! A couple of random thoughts:
I’m also surprised by the cost of these CEP retreats ($1,500+ per attendee). Assuming the organizer’s salary is already provided for, I expected the cost for the average attendee to be closer to $300-$750.
Also, I respect the establishment of a scoring system, but the weightings seem problematic. For instance, “someone reports starting a project in an EA-aligned cause area” receives a score of 50, and “someone meets someone who inspires them to take an opportunity or cause area more seriously” receives a score of 1.
That’s not intuitive to me. I would much prefer 50 people attending a retreat/EAGx and reporting they felt inspired to take a cause area or opportunity more seriously over 1 person reporting they started a new project. “Projects” are just too vague, but maybe you have something more specific in mind?
Scoring systems like this will affect how community builders design events. e.g. Say I’m an events organizer and I want funding from the CEA events team. I know the CEA events team prefers projects over updated cause prioritization at 50:1. Then I’m going to shape my event in a way that makes starting new projects an especially large (plausibly the largest) focus of my event. Is that your intent? Are there already guidelines on how community builders should think about this?
Thank you for this post.
I find the closing comment especially striking: “So, I want a more synodal Catholic Church because I find secular communities like nerdfighters, like Effective Altruism, like the Covid Tracking Project and yes, like some LGBT activists the church persecutes are running laps around the bishops on some of the most important issues of our time.”
As far as I can tell (and as much as it disappoints me as a fellow Christian), your conclusion is correct.
Plenty of Christians would love more neartermist career content (and would be unlikely to engage with 80k as it’s currently branded). So over the past year a group of Christian EAs created an advisory for this, under the direction of EA for Christians.
Thank you all very much for sharing!
Thanks so much for sharing!
This event has been moved to April 7!
Eric Sampson published a paper on this in Oxford Studies in Philosophy of Religion. See here.
Abstract: Longtermist Effective Altruists (EAs) aim to mitigate the risk of existential catastrophes. In this paper, I have three goals. First, I identify a catastrophic risk that has been completely ignored by EAs. I call it religious catastrophe: the threat that (as Christians and Muslims have warned for centuries) billions of people stand in danger of going to hell for all eternity. Second, I argue that, even by secular EA lights, religious catastrophe is at least as bad and at least as probable, and therefore at least as important as many of the standard EA catastrophic risks (e.g., catastrophic climate change, nuclear winter). Third, I present the following dilemma for secular EAs: either adopt religious catastrophe as an EA cause or ignore religious catastrophe but also ignore catastrophic risks whose mitigation has a similar, or lower, expected value (i.e., most, or all, of them). Business as usual—ignoring religious catastrophe while championing the usual EA causes—is not an option consistent with longtermist EA principles.
Not a popular topic among secular EAs, in my experience.
I find this really interesting for personal reasons. I grew up in a Calvinist church (and also, for a brief period of time, considered myself a Calvinist).
Now, looking back, I find it fascinating that the church was successful in motivating its members to still take evangelism very seriously.
It did so not on consequentialist grounds. No one ever said “evangelize because your effort actually might affect where someone spends eternity.”
Instead, people said things like “evangelize because you can share the Good News of the hope that is within you” (1 Peter 3:15) or “God wants to work through you to bring nonbelievers to knowledge of salvation—that’s how God works: through people like you and me” (Romans 10:14-15). And people seemed to find that quite inspiring and motivating.
They would have probably balked at language of “tractability” of evangelism.
Not sure if this is the place to post but I’ll share.
I took the pledge about 6 years ago but I hesitated for years. I think my reasons then were:
(1) Legalism
Pledges risk falling into “legalism” i.e. a habit of relying on specific commitments and stated duties at the expense of a broader, all-encompassing spirit of generosity.
(2) Low Anchor
Related to (1), 10% sounded great but not so radical. Why set a lower bar for myself than I could handle? Speaking for myself, I thought then (and still do now) that I ought to be giving more than 10%. Plus, devout evangelical Christians in the US (one social group I encounter very frequently) already have a weak expectation that people give 10%. All that considered, I think the pledge was communicated in a way that made it sound less radical and less exciting to 19-year-old me (I hadn’t heard of the Further Pledge).
(3) Religious Reasons Against Sharing
I was worried about “sounding the trumpet” and the possible social and spiritual negative effects of that. Quoting here a section of the Sermon on the Mount, one of Jesus’ most famous sermons: Matt 6:1-4 “Be careful not to do your ‘acts of righteousness’ before men, to be seen by them. If you do, you will have no reward from your Father in heaven. So when you give to the needy, do not announce it with trumpets, as the hypocrites do in the synagogues and on the streets, to be honored by men. I tell you the truth, they have received their reward in full. But when you give to the needy, do not let your left hand know what your right hand is doing, so that your giving may be in secret. Then your Father, who sees what is done in secret, will reward you.” Although this contrasts with Matt 5:16 (“Let your light shine before others”). On the whole, I think contemporary American Christian Protestants are less disposed to speak openly about their giving or make public pledges.
(4) Little Social Reinforcement/Encouragement/Support
I wasn’t sure what the benefit was of pledging on paper to an online community (unfair, but that’s how GWWC seemed to me 8 years ago. Fortunately, I’ve since met tons of GWWC people by way of 1-on-1 calls and EA conferences.)
Why I took the pledge:
For reference, I mostly give to GiveWell-recommended global health and poverty charities. I think my pledge saves (in expectation) several lives a year, or accomplishes some roughly equivalent amount of a good thing. Basically, I started to feel guilty that if I didn’t take the pledge or talk about giving, fewer people would give to GiveWell and fewer lives would be saved. That cost seemed far greater than my moral scruples about protecting my motivations.
I also just met a bunch more pledge-takers who were giving over 10%. It became more socially normal, and the low anchor point started to matter to me less.
Thanks for flagging this.
I’ve updated the language to “do the most good” to avoid any confusion.