I came to EA from the rationalsphere, and I find the whole first part both untrue for me and worrying. It's an important part of my life not to be a person like that. And indeed, I had an instinctive (or maybe social-instinctive) pro-socialism opinion as a child, and I changed it when I encountered evidence.
In a similar way, I just fail to imagine how you can believe your faction is the weakest in Starcraft. Like, you already said it's not! It's obvious! I can understand aliefing it, but not believing it. And there is a difference, a big one.
And that's not even what happens in EA. What I see happening is people looking at the evidence and changing their minds. (Where there are factions of EA that behave in a mindkilled way, I find it deeply concerning.)
So the way I observe EA working in practice, and the way I expect it to work in this toy example, is the same: EA will start with the belief that its favorite political idea is the most effective, then go and search for evidence, fail to find good enough evidence, and go for global health, not torturing animals, not destroying all of humanity, and maybe AI. (That last one actually looks like a historical accident to me, but even that is contestable. It's not an accident that the same sort of people who are interested in AI are interested in EA; there is a thought generator that generates both.)
I see this post as giving up on some really basic rationality skill, with the implicit claim that it's impossible to do. When people in real life have this skill and use it all the time!
So while I support tugging sideways, I find this post worrying. EA is based on having better judgment, not on giving up or claiming it's impossible to have better judgment, especially in a world where the possibility has been proven again and again. Be more ambitious! What you implicitly claim is impossible looks like a pretty basic skill to me, and one that is really worth acquiring. The ES are much worse than EA as it exists today.