Thank you for this comment. You’ve made some things explicit that I’ve been thinking about for a long time. It feels analogous to saying the emperor has no clothes.
I am growing increasingly concerned that the people supposedly working to protect us from unaligned AI have such weak ethics. I wonder whether a case can be made that it is better to have a small group of high-integrity people working on AI safety than a group twice as large in which half the members have low integrity. I wouldn’t want a bank robber safeguarding democracy, for example.
The idea of having fewer AI alignment researchers, but those researchers having more intensive ethical training, is compelling.
Actually, some of my best mentors around sexuality have been my female friends. I really recommend that men foster deep, meaningful friendships with heterosexual women. When they tell you about their dating experiences, you will very quickly understand how to behave around women you are interested in sexually.
There is currently a huge vacuum in mentorship for men about how to interact with women (hence the previously burgeoning market of red-pill content, dating coaches, Jordan Peterson, etc.). More thought leadership by men who have healthy relationships with women would be a service to civilization. Maybe you should write some blog posts :). As for the rest of your comment, I responded below to Rebecca.
Thanks for taking the time to respond so thoughtfully, Lucrectia! I am considering many things to improve the situation, especially in EA:
-Make a reading group on allyship happen, with a focus on EA (to anyone reading this: please let me know if you are a self-identified man and want to be part of this!)
-Try to find a way to talk to and understand the men who have conflicted feelings about gender equality etc. (to anyone who might read this: please let me know if you would like to talk—I understand trust can be an issue but I think we can work through that)
-Write posts, e.g. here on the EAF, but I am unsure of the balance between taking a stance and taking up too much space
-Do my share of reproductive work at home with a wife who is a successful and ambitious academic and with two small kids—I feel like this should be my first priority!
This is a great list! I think this one is extremely valuable and something that men may be better equipped to do than I would be:
Try to find a way to talk to and understand the men who have conflicted feelings about gender equality etc. (to anyone who might read this: please let me know if you would like to talk—I understand trust can be an issue but I think we can work through that)
I’d love to write another post about this too, targeted at men who have conflicted feelings about gender equality, sexual violence, etc. The problem with this current post is that it may be preaching to the choir :) Someone (probably me) needs to shill these ideas on AI Twitter, rebranded for the average mid-twenties male AI researcher. “Fighting bad actors in AI” has been one message I’ve been playing with.