You make some great points here. I'll admit my arguments weren't as charitable as they should have been, and were motivated more by heat than light.
I hope to find time to explore this in more detail and with more charity!
Your point about genuine truth seeking is certainly something I love about EA, and don’t want to see go away. It’s definitely a risk if we can’t figure out how to screen for that sort of thing.
Do you have any recommendations for screening based on epistemics?