I suspect that, unfortunately, both the initial writing of such things and the finding of them would generate more conflict rather than less. I think it will be hard to get EAs to reach consensus on what our biases are, and I'd guess that adversarial people will use that kind of material as fodder. Maybe some people will appreciate learning it and being able to understand EA's role in the intellectual ecosystem, but I don't foresee it doing much to reduce friction.
Having more projects in common would serve this goal better, I'd guess, but that's of course complicated in lots of ways.