This is what happens when you centralize power so much. I’m so sorry for what happened. So many people remaining silent and covering for abusers.
(it shouldn’t matter but for the record I have multiple partners)
sapphire
Nothing serious can change until the whole 'all important decisions are made by about ten people who don't see any need to get community buy-in' issue is solved.
I’ve honestly developed some pretty serious mental health issues. It’s just miserable to worry about everyone dying or worse.
Really brave of you to do this. It’s important.
My friend? Yes. CEA? No.
I don't actually think there exists a genuinely good reason to exclude them.
I'm sure, unless I was just being lied to? I was talking to the filmmaker directly in the screenshot, and we also talked about this at length in person. I find it pretty hard to understand why CEA would want to exclude Zvi or Zeynep, but I'm pretty sure this happened. I'm aware of a lot of hard-to-explain behavior, but I gave this example because I'm pretty sure I correctly understand what happened and can substantiate the claims.
CEA didn't want Zvi Mowshowitz, Zeynep Tufekci, or Helen Chu in the Covid documentary they commissioned?
Empirically, it feels hard to get much credit or egoist value from helping people? Maybe your experience has just been different, but I don't find helping people very useful for improving my status.
This was posted in 2019 by a group of EAs from underrepresented groups. Wish it had been taken seriously.
This is great. Wish it had been taken seriously in 2019. Deserves to be back on the front page!
I think the connection is weaker, but there are still a lot of really wacky social dynamics.
“In one case I mentioned to someone that I was surprised the SSC subreddit had people openly posting the white supremacist '14 words', with others approving. This person then spent a year-plus worried that I was infiltrating their group as a woke SJW so I could get people canceled. They never mentioned it to me but were just very jumpy and kind of unfriendly. Like two years later they were like 'oh ok, thanks for not doing that' and I was like ???”
There are also still quite a few rationalists discussing and promoting legitimately far-right sources. Multiple people I met at the NYC rationalist meetup were literally into QAnon. Almost all the rationalists I know who got involved with the far right started with HBD.
(reposting from the ‘does it matter’ thread)
It is commonly theorized that having friends who hold a viewpoint should make one more charitable toward that viewpoint. This has not been the case for me with HBD. I have a close friend of around a decade who has gotten increasingly obsessed with HBD. In general they are a smart and friendly person, but the things they have started espousing have become really shocking.
An example of their views: if Black people are not heavily under-represented in a 'cognitively demanding' organization, that is strong evidence the organization is racist against White and Asian individuals!
Obviously these points of view are completely at odds with any sort of fair and inclusive community or organization. They have also moved further and further to the right. This caused a lot of personal problems when I came out as trans. They don't 'just' have some abstract objections; they were quite toxic to an old and supportive friend when she was having a hard time. They explicitly admit that a huge driving force behind their rightward shift in general is belief in HBD. The logic for why is not hard to see: if you believe in HBD, you can start to feel 'persecuted' by people on the left or center-left, and it's easy to start sympathizing with the right and far right.
I've been in the rationalist community for over a decade and the EA community for a somewhat shorter period. I have seen tons of seemingly kind and reasonable friends become increasingly far-right after they got into HBD. I'm honestly not surprised FLI was considering funding an explicitly far-right, Nazi-adjacent group. The sympathies run deep. Neo-reaction has been close to the EA and rationalist communities for a very large fraction of our history.

It would be extremely hypocritical for me to hold people to views they no longer support. I endorsed HBD in 2015 and 2016. Like many rationalists, I was introduced to HBD by reading Slatestarcodex. Promoting HBD in any way, including privately exposing people to the ideas, is one of the biggest regrets of my life. It is a seriously harmful philosophy, and I'm very, very sorry for any negative impact my actions may have caused.

For obvious reasons I have sympathy for people who have gotten into the racist pipeline. I honestly only got out because the right is so shitty to trans people and is pretty anti-vegan. Like many eggs, I had a lot of trans friends, and independently I was quite convinced veganism was a positive lifestyle. The right being so shitty on slam-dunk issues like trans rights made me start rethinking other parts of the ideology I had begun adopting. HBD is a very harmful pseudo-science, and it is totally unacceptable that people with power in Effective Altruism believe in it.
I truly hope the EA movement can move toward a better future free from this toxicity.
A pretty large fraction of engaged EAs believe in HBD. It's quite common the deeper you go into the community.
Scientific racism is definitely considered acceptable in the EA community, and a lot of the leadership support it. I would bet Bostrom still supports HBD; he doesn't even deny this in his apology!
If Microsoft takes the lead among big tech companies, then the market cap doubling in five years would be reasonable, though it's unclear they will pull that off. If timelines are fast and Microsoft stays in the lead, $10T isn't crazy. It's also worth noting that even if the AI thesis doesn't play out, Microsoft is a ~25 P/E blue chip with very capable management, so the downside here is pretty low as far as buying tech stocks goes. Buying call options, by contrast, requires getting the timing right.
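(As a quick back-of-the-envelope check on what "doubling in five years" actually demands, assuming the doubling claim above and nothing about Microsoft specifically:)

```python
# Sanity check: what annualized return does a market cap doubling over five years imply?
target_multiple = 2.0  # market cap doubles
years = 5

# Compound annual growth rate: multiple = (1 + cagr) ** years
cagr = target_multiple ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 15% per year
```

That is, the thesis only needs ~15% annualized appreciation, which is ambitious but not outlandish for a large-cap tech stock; the $10T scenario would require roughly a 4x from a ~$2.5T starting point.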
If OpenAI is working closely with Microsoft, then MSFT becomes an extremely attractive investment. Microsoft is in talks to invest $10B. If this goes through, I would strongly advise investing in MSFT. It seems like one of the best ways to get exposure.
I did say some of the best investments are private. But there are good public investments (MSFT, TSM, SMSN, ASML). Nothing in investing is guaranteed, but trying to invest in AI companies seems like a much better bet than shorting interest rates. Also, many rationalists are rich enough that they can try to invest in various private companies.
I read the appendix and it doesn't seem very convincing. For example, they bring up OpenAI, but you can buy MSFT stock. MSFT already owns a chunk of OpenAI and is in talks to own a much larger share.
I do not think Appendix 1 is likely to convince people that shorting interest rates is the best way to express an AI thesis.
The alleged perpetrator seems to be at least tolerated by some influential people. About two years ago, Anna Salamon wrote:
One year ago she wrote: