Iām obviously not Matthew, but the OED defines them like so:
sell-out: āa betrayal of oneās principles for reasons of expedienceā
traitor: āa person who betrays [is gravely disloyal to] someone or something, such as a friend, cause, or principleā
Unless he is lying about what he believesāwhich seems unlikelyāMatthew is not a sell-out, because on his worldview Mechanize is good, or at minimum not bad, for the world. Hence, he is not betraying his own principles.
As for being a traitor, I guess the first question is: a traitor to what? To EA principles? To the AI safety cause? To the EA or AI safety community? In order:
I donāt think Matthew is gravely disloyal to EA principles, as he explicitly says he endorses them and has explained how his decisions make sense on his worldview
I donāt think Matthew is gravely disloyal to the AI safety cause, as heās been openly critical of many common AI doom arguments for some time, and you canāt be disloyal to a cause you never really bought into in the first place
Whether Matthew is gravely disloyal to the EA or AI safety communities feels less obvious to me. Iām guessing a bunch of people saw Epoch as an AI safety organisation, and by extension its employees as members of the AI safety community (even if the org and its employees did not necessarily see themselves that way), and felt betrayed for that reason. But it still feels off to me to call Matthew a traitor to the EA or AI safety communities, especially given that heās been critical of common AI doom arguments. This feels more like a difference over empirical beliefs than a difference over fundamental values, and it seems wrong to me to call someone gravely disloyal to a community for drawing unorthodox but reasonable empirical conclusions and acting on them, while broadly holding similar values. Like, I think people should be allowed to draw conclusions (or even change their minds) based on evidence, and act on those conclusions, without it being betrayal, assuming they broadly share the core EA values and are being thoughtful about it.
(Of course, itās still possible that Mechanize is a net-negative for the world, even if Matthew personally is not a sell-out or a traitor or any other such thing.)
Yes, I understand the arguments against it applying here. My question is whether the threshold is being set so high that it basically never applies to anyone; hence my looking for examples that would qualify.
Sell-out (in the context of Epoch) would apply to someone who, for example, concealed data or refrained from publishing a report in exchange for a job offer from an existing AI company.
As for traitor, I think the only group here that can be betrayed is humanity as a whole, so as long as one believes theyāre doing something good for humanity I donāt think itād ever apply.
Hmm, that seems off to me? Unless you mean āsevere disloyalty to some group isnāt Ultimately Bad, even though it can be instrumentally badā. But to me it seems useful to have a concept of group betrayal, and to consider doing so to be generally bad, since I think group loyalty is often a useful norm thatās good for humanity as a whole.
Specifically, I think group-specific trust networks are instrumentally useful for cooperating to increase human welfare. For example, scientific research canāt be carried out effectively without some amount of trust among researchers, and between researchers and the public, etc. And you need some boundary for these groups thatās much smaller than all humanity to enable repeated interaction, mutual monitoring, and norm enforcement. When someone is severely disloyal to one of those groups they belong to, they undermine the mutual trust that enables future cooperation, which Iād guess is ultimately often bad for the world, since humanity as a whole depends for its welfare on countless such specialised (and overlapping) communities cooperating internally.
Itās not that Iām ignoring group loyalty, just that the word ātraitorā seems so strong to me that I donāt think thereās any smaller group here thatās owed that much trust. I could imagine a close friend calling me that, but not a colleague. I could imagine a researcher saying I ābetrayedā them if I stole their results and published them as my own after they had consulted me, but thatās a much weaker word.
[Context: I come from a country where people with my anti-war political views are labeled traitors, and I donāt feel such usage of this word has done much good for society here...]