I’m pretty skeptical about arguments from optics, unless you’re doing marketing for a big organization or something like that. I just think it’s really valuable to have a norm of telling people your true beliefs rather than some different version of your beliefs designed to appeal to whoever you’re speaking to. That way people get a more accurate idea of what a typical EA person thinks when they talk to one, and you’re likely better able to defend your own beliefs than the optics-based ones if challenged. (The argument that there’s so much funding in longtermism that the best opportunities are already funded is, I think, pretty separate from the optics one, and I don’t have strong opinions there.)
If it were me, I’d donate wherever you think the EV is highest, and if that turns out to be longtermism, think about a clear, non-jargony way to explain it to non-EA people, e.g. something like ‘I’m concerned about existential risks from things like nuclear war, future pandemics, and emerging technologies like AI, so I donate some money to a fund trying to reduce those risks’ (rather than talking about the 10^100 humans who will be living across many galaxies, etc.). A nice side effect of having to explain your beliefs might be convincing a few more people to go check out this ‘longtermism’ stuff!