I’ve become much more familiar with EA. Historically I considered the two communities to be similarly rational, and I thought their beliefs were generally a lot more similar than I do now.
So when I learn of a difference of opinion, I update my outside view and the extent to which I consider people the relevant experts. E.g., when I learn that Eliezer thinks pigs aren’t morally relevant because they’re not self-aware, I lose a bit of confidence in my belief that pigs are morally relevant and I become a bit less trustful that any alignment ‘solutions’ coming from the rationalist community would capture the bulk of what I care about.
I’m interested to hear why you’re asking this question. How would this affect your confidence in certain beliefs and the way you defer?