The biggest disagreement between the average worldview of people I met with at EAG and my own is something like “cluster thinking vs sequence thinking.” People at EAG are like, “But even if we get this specific policy/technical win, doesn’t it not matter unless you also get this other, harder thing?” And I’m more like, “Well, very possibly we won’t get that other, harder thing, but it still seems really useful to get the specific policy/technical win; here’s a story where we totally fail on the harder thing and the win turns out to matter a ton!”
Cluster thinking vs sequence thinking (a distinction from Holden Karnofsky’s 2014 essay) remains unbeaten as a way to typecast EA disagreements. It’s been a while since I saw it discussed on the forum. Maybe lots of newer EAs don’t even know about it!