People have been arguing about religion for hundreds if not thousands of years. Maybe there has been progress and maybe there hasn’t, but I’m not sure why you would think EA is particularly well positioned to make any progress on either truth-finding or on convincing anyone of the truth. The sort of “fair trial” you propose sounds extremely alienating to religious people. Many religious people do not believe, for example, that religion should be subjected to rational debate and scientific inquiry. To them, it would be a little bit like a parent making a pro and con list about whether their particular baby is worthy of love based on the baby’s particular characteristics. It wouldn’t come across as giving the baby a “fair chance”; it would just come across as gross. Most adults have given the matter significant thought and come to a conclusion that works for them. I’m not religious myself, but I’m glad that EA is working on building common ground with people across different religions (EA for Christians/Jews/Muslims/etc.). This seems like it would burn those bridges to no good end.
What probability would you estimate for a large-scale nuclear war conditional on the U.S. bombing a Chinese data center? And what percentage of the risk from AGI do you think this strategy reduces?
I hope they do wake up to the danger, and I am all for trying to negotiate treaties!
It’s possible I am misinterpreting what EY means by “rogue data centers.” To clarify, the specific thing I am calling insane is the idea that the U.S. or NATO should under (almost) any circumstance bomb data centers inside other nuclear powers.
I appreciate Eliezer’s honesty and consistency in what he is calling for. This approach makes sense if you believe, as Eliezer does, that p(doom | business as usual) > 99%. Then it is worth massively increasing the risk of a nuclear war. If you believe, as I do and as most AI experts do, that p(doom | business as usual) < 20%, this plan is absolutely insane.
This line of thinking is becoming more and more common in EA. It is going to get us all killed if it gains any traction. No, the U.S. should not be willing to bomb Chinese data centers and risk a global nuclear war. No, repeatedly bombing China for pursuing something that is a central goal of the CCP, with dangers that are completely illegible to 90% of the population, is not a small, incremental risk of nuclear war on the scale of aiding Ukraine, as some other commenters are suggesting. This is insane.
By all means, I support efforts for international treaties. Bombing Chinese data centers is suicidal and we all know it.
I say this all as someone who is genuinely frightened of AGI. It might well kill us, but not as quickly or surely as implementing this strategy will.
Edited to reflect that upon further thought, I probably do not support bombing the data centers of less powerful countries either.
Thank you, Peter. These are the things that initially attracted me to effective altruism, and I appreciate you articulating them so effectively. I will also say that these are ideas I admire you for fostering so visibly, both through Rethink Priorities and your forecasting work.
Unfortunately, it seems to me that the first and third ideas are far less prominent a feature of EA than they used to be.
The first idea seems to me to be less prominent as a result of so many people believing in extremely high short-term catastrophic AI risk. This has encouraged an attitude that animal welfare is trivial by comparison and that the welfare of humans in the far future is irrelevant (because if we don’t solve AI, humans will go extinct within decades). Attitudes about animal welfare seem, in my opinion, to be compounded by the increasing influence of Eliezer, who does not believe that non-human animals (with the possible exception of chimps) are sentient.
The third idea also seems to be declining as a result of hard feelings related to internal culture warring. In my view, bickering about the integrity of various prominent figures, about the appropriate reaction to SBF, about whose fault SBF was, about how prevalent sexual assault is in EA, about how to respond to sexual assault in EA, about whether those responses are cultish or at least bigoted, etc., has just made the general epistemics a lot worse. I see these internal culture wars bleeding into cause areas and other ostensibly unrelated topics. People are frustrated with the community, and regardless of which side of these culture wars they are on, they are annoyed about the existence of the other side and frustrated that these seemingly fundamental issues of common decency are even up for discussion. It puts them in no mood to discuss malaria vaccines with curiosity.
I personally deactivated my real-name forum account, stopped participating in the in-person community, and stopped talking to people about EA. I still really, really value these three ideas and see pockets of the community that still embody them. I hope the community once again embodies them the way I think it used to.
Yes, but I still think the vast majority have been in the Bay Area.
That probably has something to do with it, but lots of cities have very active in-person EA scenes, and I have not heard anywhere near as many complaints about anywhere else as I have about the Bay Area.
I’m not convinced that any sort of “culture” is to blame for at least some of the high-profile events. Putting too much attention on culture shifts blame away from individual responsibility. Shouldn’t we place the blame squarely on perpetrators, and not on those who had nothing to do with an incident but supposedly “failed to think of prevention measures”? I’m not even clear what those measures would be.
Personally, I would not do this to my marriage.
Additionally, I want to draw your attention to one thing. I have a strong belief (correct me if I’m wrong) that the vast majority (if not all) of the sexual misconduct cases described over the last couple of days in the articles or here on the forum come from either the US or the UK. The EA crowd is definitely not limited to those countries.
I think this is actually understating the problem. A huge percentage of recent forum posts explicitly relate to a relatively small group of people living in the Bay Area. It is really weird how common it is to generalize the idiosyncrasies of that particular social group to “EA culture”.
Yeah, I basically agree with this, although it seems difficult to make an intentional effort to expand the circumstances under which individuals are banned, because that is only possible when there is widespread knowledge of and agreement about the accusation. However, I’m all for making accusations public after some threshold of evidence is met (although I am not sure exactly what that threshold should be, and some care would be needed with the phrasing to avoid libel lawsuits).
I’m 100 percent in favor of kicking people out to the extent that we can, but we should also recognize that it’s not really possible for a community as decentralized as EA. So much EA activity goes on at events hosted by someone other than Effective Ventures (and this goes for parties, events, conferences, etc.) that I don’t really understand the mechanics of what it would mean to kick someone out of EA.
Really depressing how many disagree-votes this is getting. I’m calling BS on all the hand-wringing about sexism in EA. People are so worried about sexism and gender inclusion, yet think that blatant double standards are just fine? How phony.
I would strongly push back against the idea that norms are about “politely discussing” appropriate behavior. Norms are about social pressure and getting into other peoples business. It is a contradiction in terms to say “doing or not doing x is a private decision that should be left up to the individual with no external social pressure to do x” and “it should be a norm to do x”.
Regarding the grant example, I have said and continue to believe that it is totally appropriate for organizations to impose conflict of interest policies including limiting romantic relationships between grantees and decision makers. But if the organization has no such policy then that is the issue.
I think the idea that other people simply being in a relationship has anything like the effect on you that playing loud music at 3 a.m. does is both wrong and unhealthy. If a grant-making organization doesn’t have a robust conflict of interest policy, isn’t that the organization’s fault? And if it does have one, why do you need these norms on top of it?
Policing the differentials in the “soft power” of other people’s relationships is precisely the sort of busybodying that I find so toxic and intrusive. Are rich, well-connected, influential people only allowed to date other rich, well-connected, influential people? That’s the logical conclusion of this line of reasoning and it strikes me as ugly. Besides, soft power is complicated and sometimes the person who looks more powerful from the outside is less powerful in the relationship. But more importantly, if you are not in the relationship, it is not your business.
Of course if a person is pushy or unwilling to take a no, then it’s not a matter of consenting adults and it’s a different story.
Enforcing some sort of social sanction against other people’s private relationships on the basis of power differentials seems just incredibly outside the realm of common decency or healthy boundaries.
For a norm to be a norm, it needs to be enforced via social pressure (most commonly using shame, but other forms too) or else it is not really a norm. What would it mean to have a norm against drinking with reports if someone repeatedly and publicly got drunk with reports and proceeded to receive no social sanction of any kind? If by “norm” you strictly mean “advocacy for organizational policy” and not “exerting social pressure” then I am with you. However, that is not how most people use the term. Personally, I would react much more positively to a post that said “orgs should have policies against drinking with reports” vs. “avoid drinking with reports” (and for the record I fully support orgs having policies of not drinking with reports).
I have no problem advocating influencing other organization’s policies around behavior. I have a problem with trying to directly influence individuals’ behavior through community norms. As I said above, organizational policies are a) explicit, b) governed by consent, and c) structurally limited in scope whereas “community norms” that attempt to directly alter behavior through social sanction, public shaming, etc. are a) inherently murky, b) not governed by consent, and c) limitless in scope. For these reasons, organizational policy is much less likely to create a community full of intrusive and toxic behavior than the encouragement of community norms.
It’s one thing to say that it is my business that the grantmaking organization has a bad policy or bad enforcement of existing policy. In fact I do endorse organizations having robust and well-enforced conflict of interest policies.
What I do not endorse is the notion that, supposing I am in the situation you describe, I have standing to directly take issue with the people involved or their behavior. I don’t. If they are violating a policy and their employer doesn’t know about it, I have standing to notify the employer. If the employer has a bad policy, I have standing to publicly not support the employer. What I object to is the idea that “community norms” are an acceptable way of handling this situation (through direct shaming/socially sanctioning/publicly judging of the individual behavior rather than the organizational policy).
Organizational policies are explicit and governed by consent. If I don’t like the policy, I don’t have to work there. If I am unclear on what the policy is, I can read the employee manual. “Community norms”, in contrast, are murky and non-explicit, and no one ever signs a form agreeing to enter the community under those rules. For this reason, they are vulnerable to expanding arbitrarily and being used in service of signaling conformity and allegiance rather than serving their alleged purpose. Similarly, organizations only have jurisdiction over people with whom they actually have reason to care about these matters; they do not have jurisdiction over people they randomly decide to moralize about. “Community norms”, in contrast, invite everyone to be a busybody about everyone else. Count me out.
I used to think it was really unfair how people characterized EA as cultish. But seeing the recent discussion around dating in EA and polyamory, and the public shaming of individuals who have made some missteps, really reminds me of religions, including the one I grew up in, where people inexplicably feel justified in inserting themselves into other people’s private lives and then using public shaming to enforce their norms.
It’s one thing for an institution (grant maker, employer, university, etc.) or relationship (e.g. monogamous) to have a policy on what kind of relationships are forbidden on the grounds of conflicts of interest or unit cohesion or concerns about power dynamics because that is their business. I still fail to see where the rest of us have standing to “establish norms” about this matter.
I don’t like cheating or sleeping with your grantees or employees any more than the next person, but these things are not my business, and they aren’t yours either.
I say this all as someone who has been off the dating market for over a decade, so none of it directly applies to me. But as someone raised in an oppressive religion, this is looking like my cue to exit.
This norm against dating funders/grantees is a little strange to me as phrased, although I certainly strongly agree in the cases most people are imagining.
As phrased, it sounds like there is a problem with (for example) paying a girlfriend or boyfriend with your own funds to do an extended project. That is sort of weird and unusual, but what exactly is the problem with it? I think what this is getting at is that you shouldn’t date a grantee whom you are deciding to pay with someone else’s money or on behalf of a larger organization. Correct?