Hi, folks! I’m Wahhab Baldwin, a 78-year-old retired software developer, manager, and minister. I have donated at least 10% of my income for decades, strongly favoring effectiveness. I ran into EA through the podcast interview with William MacAskill on “People I (Mostly) Admire.”
I strongly affirm much of EA, but I disagree with certain elements, and I hope to have some enlightening conversations here. Tomorrow I hope to write a post on longtermism. As a preview, I will argue that we must discount a future good compared to a present good: it is better to save a life this year than to save a life next year. If we discount at the conservative rate of 2% per year, then a life 1000 years from now should be valued at roughly 1/400,000,000 of a life today, meaning (imo) that we should really focus only on the next century. But before you argue, read my more detailed post! I look forward to our conversation. (Now at https://forum.effectivealtruism.org/posts/xvsmRLS998PpHffHE/concerns-with-longtermism).
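The arithmetic behind that figure is ordinary compound discounting, which a few lines of Python can sketch (only the 2% rate and 1000-year horizon come from the argument above; the rest is standard):

```python
# Compound discounting: a good n years in the future is weighted
# by 1 / (1 + r)**n relative to the same good today.
r = 0.02        # annual discount rate (2%)
years = 1000    # horizon

discount_factor = (1 + r) ** years
print(f"A life {years} years out is worth 1/{discount_factor:,.0f} of a life today")
# At 2% per year the factor works out to roughly 400 million.
```

The same formula shows why the argument points at the next century: at 100 years the factor is only about 7, so near-future goods are discounted far more gently.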
As a newcomer to EA, and a person with a fair amount of experience with cults and cult-like groups (and I’m 78 years old), I would like to report my experience.
I am very attracted to the ideas expressed in Doing Good Better. Having a science background, the idea of analyzing how effective my philanthropy may be is something I have pursued for many years, leading to many of the same conclusions.
On the other hand, many of the ideas of longtermism (trying to save human beings thousands of years in the future, concern about spreading to other planets, seeing malevolent AGI as among the most critical issues to address) strike me as similar to those of cults like Scientology and other groups whose visions and ideas seem contrary to common sense (if not downright wacky) but which seem to be common currency among “members,” if not required of them.
In What We Owe the Future, MacAskill often expresses reservations about his ideas, points out alternatives or potential flaws, and in general shows somewhat more humility than I encounter on this Forum, for example. I certainly disagree with some of his conclusions and approaches, as I have begun to express in my few posts here to date, but I do respect his and others’ efforts to think long-term when they are accompanied by express recognition of our limitations in trying to impact the future more than a couple of decades out (except in set-in-stone certainties). Without those ongoing acknowledgments of our limitations, our uncertainty, and the weirdness of our perspectives (from a “normal” viewpoint), we are bound to come across as potentially cult-like.