Hi Nathan! If a field includes an EA-relevant concept which could benefit from an explanation in EA language, then I don’t see why we shouldn’t just include an entry for that particular concept.
For concepts which are less directly EA-relevant, the marginal value of including entries for them in the wiki (when they’re already searchable on Wikipedia) is less clear to me. On the contrary, it could plausibly promote the perception that there’s an “authoritative EA interpretation/opinion” of an unrelated field, which could cause needless controversy or division.
Writing good Wiki articles is hard, and translating between worldviews is even harder. If someone wants to do it, that’s cool and I would respect them, but funding people to do it seems odd: “explain X to the ~10k EAs in the world”. Surely those fields have texts that can explain themselves?
I didn’t vote, but I would assume the feminism part is an issue for some. I think it’s a good idea, but entries on controversial issues might look like unanimous endorsement, or might be wrong on certain matters. Very relevant is the current controversy about psychometric testing, race, etc.
The wiki should aim to contain distillations of useful knowledge from other fields in EA language: feminism, psychology, etc.
Curious to hear from people who disagree with this
I don’t think the wiki adequately covers EA topics yet, so I wouldn’t expand the scope until we’ve covered these topics well.