Thanks for this lovely post. I have SO many reasons to love effective altruism, here's one (maybe I'll write more later):
Truthseekingness
I've been seeking out truthseeking communities all my life, but they all fell short of my goals until I found EA. Some examples:
I studied particle physics. What could be more truthseeky than trying to find the fundamental nature of the universe? Back then, a bunch of particle physicists claimed to believe in a class of theories called "supersymmetry", and I never understood why: there was no evidence for it, and I never really grokked why people thought the theoretical arguments were so compelling. At the time I just thought I wasn't intelligent or knowledgeable enough to get it, but I might have undersold myself. The Large Hadron Collider has since ruled out all the then-most-popular versions of supersymmetry, and it isn't cool any more. I think there might have been some sort of shared delusion because people wanted it to be true, partly because it was testable by the particle accelerator that was under construction.
When I was a science teacher it used to drive me BONKERS that students were taught (and required to regurgitate in national exams) incorrect force diagrams. (E.g. for a car accelerating on a flat, level road, students were required to draw the "friction" arrow pointing backwards, whereas in reality friction is forward, and without it the wheels would just spin and the car wouldn't go anywhere.) I get the need for simplification as much as the next guy, but you can't tell me that "point the arrow in the exact opposite direction" counts as simplification. I talked to a bunch of teachers, the national qualifications authority, and the ministry of education, and created draft alternative (and equally simple) resources for them to review. But no one else seemed to care at all about whether what we were teaching was accurate, or at least didn't care enough to do anything about it.
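To spell out the physics (my own minimal sketch of the standard free-body argument, not taken from any exam materials): treat the car as a single body of mass $m$ accelerating forwards on a level road and neglect air resistance. The only horizontal external force is the friction $f$ from the road on the driven tyres, so

$$ f = m a, \qquad f \le \mu_s m g, $$

and since the acceleration $a$ points forwards, so must $f$. The engine makes the tyres push backwards on the road, and by Newton's third law the road pushes the car forwards. An arrow drawn pointing backwards would give a net force opposing the motion, and the car could never speed up.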
But EA seems to be very different: this community seems to be unusually good at seeking the truth, even (or especially?) when it is inconvenient, scary, or even shameful. One of the first EA talks I went to blew my mind by questioning whether we are currently wasting our donations by doing and then undoing good. Then I read GiveWell's noodling on whether or not some of their (then) top charities are likely to have no impact, and discussions on whether becoming vegetarian increases animal suffering. More recently I've seen commentary about whether our community has accelerated dangerous AI capabilities, or whether our community contributed to an environment that led to multi-billion dollar fraud. We should take these possible negatives very seriously. The fact that we do take these negatives seriously, and that we continue to try to get better at truthseeking, gives me a whole bunch of hope.