Ben_West
Still, it’s hard to see how tweaking EA can lead to a product that we and others would be excited about growing. Especially considering that we have the excellent option of just talking directly about the issues that matter to us, and doing field-building around those ideas… This would be a relatively clean slate, allowing us to do more (as outlined in 11), to discourage RB, and to stop bad actors.
Do you remember how animal rights was pre-EA? The first Animal Rights National Conference I went to, Ingrid Newkirk dedicated her keynote address to criticizing scope sensitivity, and arguing that animal rights activists should not focus on tactics which help more animals. And my understanding is that EA deserves a lot of the credit for removing and preventing bad actors in the animal rights space (e.g. by making funding conditional on organizations following certain HR practices).
It’s useful to identify ways to improve EA, but we have to be honest that imaginary alternatives largely seem better because they are imaginary, and actual realistic alternatives also have lots of flaws.
(Of course, it’s possible that those flawed alternatives are still better than EA, but figuring this out requires actually comparing EA to those alternatives. Some people have started to do this e.g. here, and I find that work valuable.)
Super sorry to see you go Max. It’s honestly kind of hard to believe how different CEA is today from when I joined, and a lot of that is due to your leadership. CEA has a bunch of projects going on, and the fact that you can step down without these projects being jeopardized is a strong endorsement of the team you’ve built here.
I look forward to continuing to work with you in an advisory role!
EA has reached a size and level of visibility now that is sure to keep it continuously embroiled in various controversies and scandals from now on. We can’t just mourn and hang our heads in shame for the rest of our lives.
One animal welfare advocate told me something like: “You EAs are such babies. There are entire organizations devoted to making animal advocacy look bad, sending ‘undercover investigators’ into organizations to destroy trust, filing frivolous claims and lawsuits to waste time, placing stories in the media which paint us in the worst light possible, etc. Yet EA has a couple of bad months in the press and you all want to give up?”
I found that a helpful reframe.
The earning to give company I started got acquired.
I still don’t understand why they can’t give a clear promise of when they will talk; the lack of one makes me trust them less.
fwiw I will probably post something in the next ~week (though I’m not sure if I’m one of the people you are waiting to hear from).
Animal Justice Appreciation Note
Animal Justice et al. v. A.G. of Ontario, 2024 was recently decided and struck down large portions of Ontario’s ag-gag law. A blog post is here. The suit was partially funded by ACE, which presumably means that many of the people reading this deserve partial credit for donating to support it.
Thanks to Animal Justice (Andrea Gonsalves, Fredrick Schumann, Kaitlyn Mitchell, Scott Tinney), co-applicants Jessica Scott-Reid and Louise Jorgensen, and everyone who supported this work!
I also donated $5,800. Thanks Andrew for making this post – this seems like a somewhat rare opportunity for <$10k donations to be unusually impactful.
This forum has taken off over the past year. Thanks to all the post authors who have dedicated so much time to writing content for us to read!
Marcus Daniell appreciation note
@Marcus Daniell, cofounder of High Impact Athletes, came back from knee surgery and is donating half of his prize money this year. He projects raising $100,000. Through a partnership with Momentum, people can pledge to donate for each point he gets; he has raised $28,000 through this so far. It’s cool to see this, and I’m wishing him luck for his final year of professional play!
First in-ovo sexing in the US
Egg Innovations announced that they are “on track to adopt the technology in early 2025.” Approximately 300 million male chicks are ground up alive in the US each year (since only female chicks are valuable) and in-ovo sexing would prevent this.
UEP originally promised to eliminate male chick culling by 2020; needless to say, they didn’t keep that commitment. But better late than never!
Congrats to everyone working on this, including @Robert—Innovate Animal Ag, who founded an organization devoted to pushing this technology.[1]
[1] Egg Innovations says they can’t disclose details about who they are working with for NDA reasons; if anyone has more information about who deserves credit for this, please comment!
I feel like there’s some implicit claim that only a subset of people (socially awkward men?) aren’t romantically perceptive, but my understanding is that basically everyone is bad at this and if you are going to flirt with someone you should expect that you are probably unable to tell whether they want it.[1]
An example paper largely chosen at random says:
> Based on a community sample of real-life speed daters we were able to show that actual mate choices are not reciprocal, although people strongly expect their choices to be reciprocated and dating behaviour (flirting) is indeed strongly reciprocal.

I.e., people reciprocate flirting essentially independently of whether they are actually attracted to the other person, and the other person is essentially unable to distinguish “real” from “fake” flirting.
Furthermore, that paper had two “independent, trained raters” who watched recordings and marked whether the person involved was flirting. These raters had an interrater reliability that isn’t terrible, but isn’t amazing either.[2]
tl;dr: my guess is that most people should (1) not assume that they can reliably identify flirting, and (2) even if they can, not assume that they can reliably predict whether that flirting is indicative of romantic interest.
Of course, this also cuts the other way: people who you don’t think are attracted to you are sometimes attracted to you. But whatever risk/reward calculation you are running should include the fact that you are probably going to make mistakes here.
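As an aside on that reliability statistic: a common way to quantify interrater agreement is Cohen’s kappa, which corrects raw agreement for chance (the paper may have used a different measure; this is just for intuition). A minimal sketch, with toy ratings made up for illustration rather than taken from the paper:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement: what two raters with these base rates
    # would agree on if they labeled independently.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[label] / n) * (counts_b[label] / n)
              for label in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical raters labeling ten clips as flirting (1) or not (0):
a = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
print(round(cohens_kappa(a, b), 2))  # → 0.4
```

In this toy example the raters agree on 70% of clips, but kappa is only 0.4 because raters guessing at these base rates would already agree half the time; that gap is why “not terrible, not amazing” reliability still leaves a lot of noise in who counts as flirting.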
[1] Obviously it’s possible to get reliable signals, e.g. if someone explicitly says “I don’t like you” then you can probably accurately guess that they don’t like you. This comment is referring to “normal” flirting signals like eye contact, touch, etc.

[2] I assume these “trained raters” were grad students who had thought about the problem for a couple of days or something, and I bet that if you actually genuinely studied this you could get good at it, but probably very few people are in that reference class.
Startups aren’t good for learning
I fairly frequently have conversations with people who are excited about starting their own project and, within a few minutes, convince them that they would learn less starting a project than they would working for someone else. I think this is basically the only opinion I have where I can regularly convince EAs to change their mind in a few minutes of discussion and, since there is now renewed interest in starting EA projects, it seems worth writing down.
It’s generally accepted that optimal learning environments have a few properties:

1. Doing something that is just slightly too hard for you. In startups, you do whatever needs to get done. This will often be things that are way too easy (answering a huge number of support requests) or way too hard (pitching a large company’s CEO on your product when you’ve never even sold bubblegum before). Established companies, by contrast, put substantial effort into slotting people into roles that approximately match their skill level (though you still usually need to put in proactive effort to learn things at an established company).
2. Repeatedly practicing a skill in “chunks”. Similar to the last point: established companies have a “rhythm” where, e.g., everyone spends one month per year writing up reflections on how the sales cycle is going, commenting on each other’s writeups, and updating their own. Startups do things by the seat of their pants, which means employees are usually rapidly switching between tasks.
3. Feedback from experts/mentorship. Startup accelerators like Y Combinator partially address this, but a defining characteristic of starting your own project is still that you are doing the work without guidance or oversight.
Moreover, even supposing you learn more at a startup, it’s worth thinking about what it actually is you learn. I know way more about the laws regarding healthcare insurance than I did before starting a company, but that knowledge isn’t super useful to me outside the startup context.
This isn’t a 100% universal knockdown argument – some established companies suck for professional development, and some startups are really great. But by default, I would expect startups to be worse for learning.
Sam Bankman-Fried’s trial is scheduled to start October 3, 2023, and Michael Lewis’s book about FTX comes out the same day. My hope and expectation is that neither will be focused on EA,[1] but several people have recently asked me whether they should prepare anything, so I wanted to quickly record my thoughts.
The Forum feels like it’s in a better place to me than when FTX declared bankruptcy: the moderation team at the time was Lizka, Lorenzo, and myself, but it is now six people, and they’ve put in a number of processes to make it easier to deal with a sudden growth in the number of heated discussions. We have also made a number of design changes, notably to the community section.
CEA has also improved our communications and legal processes so we can be more responsive to news, if we need to (though some of the constraints mentioned here are still applicable).
Nonetheless, I think there’s a decent chance that viewing the Forum, Twitter, or news media could become stressful for some people, and you may want to preemptively create a plan for engaging with that in a healthy way.
[1] This market is thinly traded but is currently predicting that Lewis’s book will not explicitly assert that Sam misused customer funds because of “ends justify the means” reasoning.
The Forum moderation team has been made aware that Kerry Vaughn published a tweet thread that, among other things, accuses a Forum user of doing things that violate our norms. Most importantly:
> Where he crossed the line was his decision to dox people who worked at Leverage or affiliated organizations by researching the people who worked there and posting their names to the EA forum
The user in question said this information came from searching LinkedIn for people who had listed themselves as having worked at Leverage and related organizations.
This is not “doxing” and it’s unclear to us why Kerry would use this term: for example, there was no attempt to connect anonymous and real names, which seems to be a key part of the definition of “doxing”. In any case, we do not consider this to be a violation of our norms.
At one point Forum moderators got a report that some of the information about these people was inaccurate. We tried to get in touch with the then-anonymous user, and when we were unable to, we redacted the names from the comment. Later, the user noticed the change and replaced the names. One of CEA’s staff asked the user to encode the names to allow those people more privacy, and the user did so.
Kerry says that a former Leverage staff member “requests that people not include her last name or the names of other people at Leverage” and indicates the user broke this request. However, the post in question requests that the author’s last name not be used in reference to that post, rather than in general. The comment in question doesn’t refer to the former staff member’s post at all, and was originally written more than a year before the post. So we do not view this comment as disregarding someone’s request for privacy.
Kerry makes several other accusations, and we similarly do not believe them to be violations of this Forum’s norms. We have shared our analysis of these accusations with Leverage; they are, of course, entitled to disagree with us (and publicly state their disagreement), but the moderation team wants to make clear that we take enforcement of our norms seriously.
We would also like to take this opportunity to remind everyone that CEA’s Community Health team serves as a point of contact for the EA community, and if you believe harassment or other issues are occurring we encourage you to reach out to them.
I do wish we could be having this discussion in a more productive and conciliatory way, which has less of a chance of ending in an acrimonious split.
At the risk of stating the obvious: emailing organizations (anonymously, if you want) is a pretty good way of raising concerns with them.
I’ve emailed a number of EA organizations (including ACE) with question/concerns, and generally find they are responsive.
And I’ve been on the receiving side of such emails as well, and I’m usually grateful; I often didn’t even consider that there could be some confusion or misinterpretation of what I said, and I appreciate people who point it out.
EA seems reliant on nerdy millennial technology, namely long plaintext social media posts.
I’m interested in communicating in Gen Z ways, which I think roughly means “short amateur videos”. I’ve had moderate success on TikTok (35,000 followers as of this writing), and I would encourage more people to try it out.
There’s a nice self-selection where your content is only displayed to 16-year-olds who spend their free time watching math videos (or whatever niche you target), which I expect to be one of the best easily-available audiences of young people.
I’m sad to see EA Giving Tuesday go, it was really cool to have a community holiday like this, but I think you’re right that it’s not worth it anymore. Kudos on shutting down the project when it no longer made sense to run.
Possible Vote Brigading
We have received an influx of people creating accounts to cast votes and comments over the past week, and we are aware that people who feel strongly about human biodiversity sometimes vote brigade on sites where the topic is being discussed. Please be aware that voting and discussion about some topics may not be representative of the normal EA Forum user base.
In case anyone else is wondering, I think this lawsuit is claiming:

1. Karen Dawn had an affair with Peter Singer.
2. This caused her to break up with the person she cheated on.[1]
3. Because they broke up, she had to move out of the $4 million house that the person she cheated on owned.
4. Therefore, she is suing Singer for $4 million in damages.

I’m not 100% sure about this though; corrections appreciated. I’m referencing paragraphs 79-82 here.
[1] Dawn describes it as an “affair”, which I assume means that her partner viewed it as cheating?
Hey everyone, on an admin note I want to announce that I’m stepping in as “Transition Coordinator.” Basically, Max wanted to step down immediately, and choosing an ED even on an interim basis might take a bit, so I will be doing the minimal set of ED-like tasks to keep CEA running and start an ED search.
If things go well you shouldn’t even notice that I’m here, but you can reach me at ben.west@centreforeffectivealtruism.org if you would like to contact me personally.