I have often been fascinated watching young children express great confidence, even though, from my adult point of view, they have no basis for it other than their internal sense (i.e. they don’t understand the domain they are speaking about in any adult way).
It is also my experience and belief that adults carry this same unwarranted sense of confidence in their opinions. Just to pick one example, 73% of Americans (including 80% of American men) believe they are better than average drivers.
Our culture selects for confidence, especially in men. This leads to overconfidence, especially among successful men. Successful people have often made at least one successful prediction (which may have led to their success), which may have simply been luck, but which reinforces their self-confidence.
I therefore strongly agree that longtermist predictions carry huge uncertainty despite expressions of confidence by those promoting them. I argue that in evaluating effective action, we should lower our expected value of any intervention based on how far in the future we are predicting, with a discount rate of 3.87% annually [/joke].
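Joke rate aside, the discounting idea itself is standard arithmetic: divide an intervention's expected value by a growth factor compounded over the prediction horizon. A minimal sketch, where the function name is mine and the 3.87% default simply echoes the figure above:

```python
def discounted_value(expected_value: float, years_out: float,
                     annual_rate: float = 0.0387) -> float:
    """Shrink an intervention's expected value the further out its
    predicted payoff lies, using standard exponential discounting.
    The default rate echoes the tongue-in-cheek 3.87% in the text."""
    return expected_value / (1.0 + annual_rate) ** years_out

# A payoff today is untouched; one predicted 100 years out shrinks
# to a small fraction of its nominal value.
today = discounted_value(1000.0, 0)
century = discounted_value(1000.0, 100)
```

Even a modest annual rate compounds dramatically: at 3.87%, a benefit 100 years out is worth only a few percent of its nominal value, which is why the choice of rate dominates any long-horizon expected-value calculation.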
As a newcomer to EA, and a person with a fair amount of experience of cults and cult-like groups (and I’m 78 years old), I would like to report my experience.
I am very attracted to the ideas expressed in Doing Good Better. Having a science background, the idea of analyzing how effective my philanthropy may be is something I have pursued for many years, leading to many of the same conclusions.
On the other hand, many of the ideas of longtermism (trying to save human beings thousands of years in the future, being concerned about spreading to other planets, seeing malevolent AGI as among the most critical issues to address) strike me as similar to those of Scientology and other cult-like groups: visions and ideas that seem contrary to common sense (if not downright wacky) yet appear to be common currency among “members,” if not required of them.
In What We Owe the Future, MacAskill often expresses reservations about his ideas, points out alternatives or potential flaws, and in general shows somewhat more humility than I encounter on this Forum, for example. I certainly disagree with some of his conclusions and approaches, which I have begun to attempt to express in my few posts here to date, but I do respect his and others’ efforts to think long-term when they are accompanied by express recognition of our limitations in trying to impact the future more than a couple of decades out (set-in-stone certainties excepted). Without those ongoing acknowledgments of our limitations, our uncertainty, and the weirdness of our perspectives (from a “normal” viewpoint), we are bound to come across as potentially cult-like.