I’m in favour of direct AI safety movement building too, but the point still remains that the EA community is a vital talent pipeline for cause areas that are more talent-dependent. And given the increasing prominence of these cause areas, it seems like it would be a mistake to optimise for the other cause, at least when it looks highly plausible that the community will shift even further in the longtermist/x-risk direction over the next few years.
To me, the shift towards longtermism/x-risk seems to have been an intentional one, but your comment makes it sound otherwise?
I don’t know what you mean by intentional or not.
But my guess is that the community will shift further towards longtermism after more people have had time to digest What We Owe the Future.
The community is shifting towards longtermism because of intentional decisions that were made. There’s no reason these shifts have to be locked in if a good reason to move away from them emerges (not that I’m suggesting there is one!). If the shift turns out to be a mistake, we should be happy to move away from it rather than say “oh, but the community may shift towards it in the future”, especially when that shift was caused by intentional decisions by EA leadership.
I guess this is why I asked what you meant.
Publishing What We Owe the Future was an intentional decision, but there’s also a sense in which people read what’s written and make up their own minds.
On “oh, but the community may shift towards it in the future”: I guess some of these shifts are pretty predictable in advance, but that’s less important than the point I was making about maintaining option value, especially for options that look increasingly high-value.