’It’s harder to maintain good epistemics and strong reasoning + reasoning transparency in large coalitions of groups who have different worldviews/goals. (“We shouldn’t say X because our allies in AI ethics will think it’s weird.”) I don’t think “X is bad for epistemics” means “we definitely shouldn’t consider X”, but I think it’s a pretty high cost that often goes underappreciated/underacknowledged’
This is probably a real epistemic cost in my view, but identifying a cost is not enough to establish that forming a coalition with people who have different goals and beliefs is epistemically costly on net. Coalitions also bring epistemic benefits, such as access to knowledge we would otherwise lack, since no single group knows everything.