I think many people following effective altruism principles are focusing on politics, but don’t write about it in places like the EA Forum, because the EA brand is toxic in many circles, and/or has a significant chance of becoming toxic in the future.
See e.g. “NIST staffers revolt against expected appointment of ‘effective altruist’ AI researcher to US AI Safety Institute”.
Yes, in many circles the EA brand is toxic.
But sometimes we stick our heads in the sand as if that were something we couldn’t control.
Or maybe some EAs kind of like this feeling of being outsiders and being in the minority. I don’t know.
Every other group I’ve ever worked with accepts that PR is part of the world. Companies know that they will get good press and bad press, and that it won’t always reflect reality, but they hire people to make it as positive as possible. Politicians are the same. They run focus groups and figure out what words to use to share their message with the public, to maximise support.
Too often we act like we’re above all that. We’re right and that’s enough. If people can’t accept that, that’s their loss.
But it’s not their loss. It’s our loss. It’s the world’s loss.
Public perception of EAs outside the EA community is often “a bunch of ‘rationalist’ tech guys who like to argue about abstract concepts and believe that AI should have rights,” or something along those lines. This is totally at odds with the vast majority of EAs, who are among the most generous, caring people in the world, and who want to help people and animals who are suffering.
A world run by EAs, or on EA principles, would be so wonderful. This should be our vision if we’re truly sincere. But if we want to make this happen, we need to be willing to get our hands dirty, do the PR, challenge the newspaper articles that mischaracterize us, and learn to communicate in 15-second tweets as well as long-form essays, so that more people can be exposed to EA ideas rather than stereotypes.
If you ask anyone outside the EA community to name an EA, they have probably only heard of SBF. If you push them, they might wonder if Elon Musk is also an EA. It’s no wonder they don’t trust EAs. But it’s up to us to proactively change that perception.
It’s true that some EAs work in government, but I think this piece lays out pretty well what that actually looks like, and it doesn’t typically involve politics; it’s more civil-service-type work. I’m pretty sure I know most of the EAs who work on directly political work (e.g. elections), and it’s quite a small number.
That said, yeah, it’s good that there are some people working in government, and that does help the broader EA community understand the political situation a little better.