"It's harder to maintain good epistemics and strong reasoning + reasoning transparency in large coalitions of groups who have different worldviews/goals. ('We shouldn't say X because our allies in AI ethics will think it's weird.') I don't think 'X is bad for epistemics' means 'we definitely shouldn't consider X', but I think it's a pretty high cost that often goes underappreciated/underacknowledged"
This is probably a real epistemic cost in my view. But identifying a cost isn't enough to establish that forming a coalition with people who have different goals/beliefs is epistemically costly overall, since doing so also has positive epistemic effects, like bringing in knowledge we lack because no single group knows everything.