I think that if these trends are real, they are extremely important to predict and understand: they are major existential risk factors, and they directly impede our community's ability to figure out what we need to do to help the world and to coordinate on doing it.
This seems like an interesting line of reasoning, and I’d maybe be excited to see more strategic thinking around this.
Might eventually turn out to be pointless and/or futile, of course.
I agree! I’d love to see more research into this stuff. In my relevant pre-AGI possibilities doc I call this “Deterioration of collective epistemology.” I intend to write a blog post about a related thing (Persuasion Tools) soon.