Fwiw I’d also say that most of “the most engaged EAs” would not feel betrayed or lied to (for the same reasons), though I would be more uncertain about that. Mostly I’m predicting that there’s pretty strong selection bias in the people you’re thinking of, and you’d have to pin them down really precisely (e.g. maybe something like “rationalist-adjacent highly engaged EAs who have spent a long time thinking about meta-honesty and glomarization”) before it would become true that a majority of them would feel betrayed or lied to.
That’s plausible, though I think I would take a bet here if we could somehow operationalize it. I do have to adjust for a bunch of selection effects in my thinking, so I’m not super confident here, but still a bit above 50%.