I’m currently based at FOCAL at Carnegie Mellon University. My research interests include decision theory, self-locating beliefs/anthropics, and cooperation and conflict between AI systems. From 2020-23, I worked at the Center on Long-term Risk.
I didn’t downvote your comment, but I did feel a bit like it wasn’t really addressing the points Chi was making, so if I had to guess, I’d say that might be why.
This is a really interesting point! I think I’m also sometimes guilty of using the norms of signalling epistemic uncertainty in order to mask what is actually anxious social signalling on my part, which I hadn’t thought about so explicitly until now.
One thing that occurred to me while reading this—I’d be curious as to whether you have any thoughts on how this might interact with gender diversity in EA, if at all?
Over time, I’ve become less convinced of the value of thinking explicitly about weirdness points for most individuals, and I’m concerned that for many people the concept can actually be pretty harmful. To a large extent, I’m referring less to this actual post, and more to weirdness points as a meme, which I think is somewhat less nuanced than the original post. So I might not be maximally charitable in my criticisms, since what I am criticising is the concept as it is often expressed, rather than as it was originally expressed.
My concerns are a combination of:
1. I think the model is somewhat flawed, especially in domains like hobbies and physical appearance (as opposed to things like policies and opinions), and that likeability is messier and more complicated than the model suggests.
2. Even if that weren’t true, I worry that, in practice, people’s attempts to make themselves less weird may in fact make them more weird.
3. Finally, regardless of points 1 or 2, I think the psychological costs of worrying about weirdness points can be pretty high relative to the potential benefits, at least for a significant subset of EAs. Additionally, to the extent that one puts credence in the model, I think these costs can be hard to avoid once one is familiar with the idea of weirdness points.
I’m going to focus primarily on point 3, since I think other people have already made points 1 and 2.
I’m going to talk about my personal experience for a bit, because I think it’s illustrative of one of the ways this can go wrong.
I encountered the concept of weirdness points as a fairly new EA. To give some context, I have always been a fairly weird person, to some extent for reasons largely outside of my control. I’ve also, for a long time, been afraid of being disliked and ostracised for this, and to some extent internalised the notion of weirdness as Bad. At the point I joined EA, I was still pretty insecure about this, but derived reassurance from being able to say that my actual goals had nothing to do with being liked by people. Then suddenly BAM, this reassurance didn’t really work anymore. Me being weird and people disliking me no longer just affected me, but, to some extent, the EA community at large. So suddenly every weird thing I did actively made the world worse! I don’t really endorse this line of reasoning, and I’m certainly not trying to suggest other people endorse it. But I did find it pretty hard to shake, because there might well be *some* grain of truth to it, and it was in some sense supported by some interpretations of weirdness points. In any case, I don’t think thinking along these lines was helpful to either myself or the world, since it mostly just worsened my confidence and wellbeing, and made me more afraid to do anything, which probably, if anything, made me less likeable.
While some aspects of this are likely particular to me, I do think the meme of weirdness points and related concepts may have similar detrimental effects on other EAs, especially those with a history of poor mental health and confidence issues, which seem to be disproportionately prevalent within the community. The idea that you have a *set* number of weirdness points also seems to me potentially particularly harmful (regardless of whether it is true), because it seems to imply that those who already have high baseline weirdness due to factors entirely outside of their control, such as neurodivergence or atypical appearance, have far fewer weirdness points left over before they do anything at all. I think this can lead to people feeling they have to put even more effort than others into curating their image, and have even less freedom to do weird things (when this curation may be more mentally taxing for precisely this group of people). Or worse, feeling that their very presence and visibility is *by itself* harming the EA community.
The concept of weirdness points clearly has some merit, especially for individuals going into policy or other very public-facing work. The law of equal and opposite advice applies, and it probably is net helpful for a bunch of people. However, most people already worry about being perceived as weird for normal human reasons, and I think adding the further worry that being perceived as weird may cause actual moral harm can be psychologically damaging for a proportion of people, and hamper their efforts to do good.
On net, I’m not sure whether it is wise as a community to spread the meme of weirdness points to the extent that we have.
I just want to flag up that The Better Angels of Our Nature, whilst a great book, contains quite a few graphic descriptions of torture, which even as an adult I found somewhat disturbing. I don’t necessarily think teenage-me would have been affected any worse, but you might still not want to put it in a school library.
I expect this isn’t what you’re actually implying, but I’m a bit worried this could be misread as saying that most people who are sufficiently talented in the relevant sense to work at an EA org are capable of earning $1m/year elsewhere, and that if you can’t, then you probably aren’t capable of working at an EA org or doing direct work. I just wanted to flag that I think the kinds of talent required for doing direct work are often not all that correlated with the kinds of talent that are highly financially rewarded outside of EA, and that people shouldn’t rule themselves out for the former because they wouldn’t be capable of earning a ton of money.
(Edit: People (or person?) who downvoted—I’d love to know why! Is it because you think smountjoy is obviously not saying the thing I thought they might be misread as saying, and so you think this is a pointless comment, or because you disagree with it, or something else? I’m fairly new to actually commenting on the forum, so maybe I’ve not understood the etiquette properly.)