When is it better to risk being too naive, or too cynical?
That reminds me of what I read about game theory in Give and Take by Adam Grant (iirc). The conclusion was that the highest-scoring strategy ("generous tit-for-tat") is to cooperate by default and, when the other side is uncooperative, retaliate only about two times out of three, forgiving the third. The reasoning was that if you never cooperate, the "selfish" won't either. But if you "forgive" and try to cooperate again after they were uncooperative, you may sway them into cooperating too. You don't cooperate unconditionally, at the risk of being too naive and taken advantage of, but you lean toward cooperating more often than not.
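For anyone curious, the dynamic can be sketched as a tiny iterated prisoner's dilemma simulation. The payoff numbers (3/3 for mutual cooperation, 1/1 for mutual defection, 5/0 for exploiting/being exploited) and the one-in-three forgiveness rate are textbook defaults I'm assuming here, not something taken from the book:

```python
import random

# Standard prisoner's dilemma payoffs, keyed by (my_move, their_move).
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def generous_tft(opponent_history, rng, forgiveness=1 / 3):
    """Cooperate by default; after a defection, forgive about one time in three."""
    if not opponent_history or opponent_history[-1] == "C":
        return "C"
    return "C" if rng.random() < forgiveness else "D"

def always_defect(opponent_history, rng):
    """A purely 'selfish' opponent for comparison."""
    return "D"

def match(strategy_a, strategy_b, rounds=50, seed=0):
    """Play two strategies against each other and return their total scores."""
    rng = random.Random(seed)
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b, rng)  # each side sees the other's past moves
        move_b = strategy_b(hist_a, rng)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Two forgiving cooperators settle into steady mutual cooperation,
# while against a pure defector the forgiveness has a small cost.
print(match(generous_tft, generous_tft))
print(match(generous_tft, always_defect))
```

The point the simulation illustrates is the trade-off in the paragraph above: forgiveness loses a few points against a committed defector, but it's what lets two conditional cooperators recover (and stay at) full cooperation instead of spiraling into mutual retaliation.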
If you are unsure how harsh the world is, lean toward acting like you’re living in a less harsh world—there is more value for EA to take there.
I’d be interested in reading more about this. I think a less cynical view would elicit more cooperation and goodwill due to likeability. I’m not sure that’s the direction you’re going, which is why I’m curious about it.