For what it's worth, I upvoted and disagree-voted, because I think you're wrong and because you clearly put thought and effort into your writing, and produced the sort of content I think we should generally have more of, even though I'm annoyed locally that "don't do either" is a much easier comment to write than "here's the analysis you asked for", leading to the only serious comments on the post being people stating your view.
keller_scholl
This is a good discussion, but I think that you're missing the strongest argument, in this context, against donation-splitting: we're not in a one-shot. Within the context of a one-shot, it makes much more sense to donation-split. But there's a charity that I donate to in part because I am convinced that they are very financially constrained (iteration impact). Furthermore, donors can respond to each other within a year. If most EAs give around December/January, OpenPhil and GWWC can distribute in February, after hearing from relevant orgs how much funding they received. If a charity is credible in saying that they're under their expected funding, again, people can donate in response. So in practice I don't expect donation splitting to have that positive an effect on charity financing uncertainty, particularly compared to something like multi-year commitments.
Nonlinear, thank you. Edited.
Looking at this comment after Nonlinear, I think it holds up. There exists a point at which an org loses the (moral, not legal) right to see questions / a writeup in advance, and Nonlinear was past it. Legal threats, contacting the people you spoke with, and contacting your employer are classic examples of this. I am also sympathetic to journalists covering industries that are known to react strongly, such as oil and tobacco. But the items in the list you provide do not come close to the bar of the org being untrustworthy, and that is the bar I think must be cleared.
I think that distinguishing between 1-8 hours (preferably paid), up to 40 hours, and 1-6 months, is very important here. I am happiest about the shortest ones, particularly for people who have to leave a job (part of why I think that OP is talking about the latter sort).
Correct: I'm vaguely aware of Kat Woods posting on FB, but haven't investigated Nonlinear in any depth before: having an explicit definition of "what information I'm working with" seemed useful.
Yes, Nonlinear is smaller than expected.
I outlined a bad org with problems, even after adjusting for a hostile reporter and a vengeful ex-employee. I think that the evidence is somewhat weaker than what I expected, not counting that I trust you personally, and the allegations are stronger/worse. Overall, it was a negative update about Nonlinear.
I think part of the disconnect, from my perspective, is that I have experience with small scrappy conventions that deliver good talks, an enjoyable time, and a large central room where people can mingle. The scrappier science-fiction conventions seem to charge in the range of $60-$120, usually on the lower side, and, while relying very heavily on volunteer labor and physical assets, about break even. The fancier ones might charge $250/person/weekend. That's not the true price, since it excludes what dealers pay for access, advertising, etc. But my sense of con budgets is that it is at least half of the true price.
Obviously a large chunk of that is the $240 on food that you're spending and they're not. Another chunk of the cost is location: said cons tend to be out in the boonies of their relevant cities, passing along to attendees costs of travel or increased hotel prices.
The context that non-profit conventions tend to be $400+ is helpful: thank you. I really appreciate the transparency.
I don't think that this is a good state of affairs. I think that the points I raise range from "this should be completely unacceptable" (4, 6) to "if this is the worst credible information that can be shared, the org is probably doing very well" (3, 5). This is not a description of an org that I would support! But if a friend told me they were doing good work there and they felt the problems were blown out of proportion or context by a hostile critic and a vengeful ex-employee with an axe to grind, I would take them seriously and not say "you have to leave immediately. I can cover two months' salary for you while you find another job, but I believe that strongly that you should not work here."
As always, context is important: "the head of the org is a serial harasser with no effective checks" and "we fired someone when their subordinate came forward with a sexual harassment allegation that, after a one-week investigation, we validated and found credible: the victim is happily employed by us today" are very different states of affairs. If someone is sharing the worst credible information, then the difference between "we were slow to update on X" and "they knew X was false from the report by A, but didn't change their marketing materials for another six months" can be hard to distinguish.
Running an org is complicated and hard, and I think many people underestimate how much negative spin a third party with access to full information can include. I am deliberately not modelling "Ben Pace, who I have known for almost a decade" and instead framing "hostile journalist looking for clicks", which I think is the appropriate frame of reference.
Worst credible information about a charity that I would expect based on the following description (pulled from Google's generative AI summary: may or may not be accurate, but seemed like the best balance to me of engaging with some information quickly):
Nonlinear is an organization that funds and researches AI safety interventions. They also offer an incubation program that provides seed funding and mentorship. The Nonlinear Library is a podcast that uses text-to-speech software to convert the best writing from the rationalist and EA communities into audio.
The Nonlinear Fund is an organization that aims to research, fund, and seed AI safety interventions. Their incubation program provides seed funding and mentorship. The seed funding is for a year's salary, but you can also use it for other things, such as hiring other people.
The Nonlinear Library is a podcast that uses text-to-speech software to convert the best writing from the rationalist and EA communities into audio. You can listen to the podcast on Apple Podcasts and Spotify.
I am not describing a charity with ideal management practices, but envisioning one with 25 employees, active for 5 years, and which has poor but not shockingly or offensively bad governance by the standards of EA orgs. Someplace where I wouldnât be worried if a friend worked there, but I would sympathetically listen to their complaints and consider them not the best use of my marginal dollar.
Credible accusations of sexual harassment by at least one current or former employee
One or more incidents of notable financial mismanagement
Promised use of donor funds that did not materialize into a finished project (less than 10% of one year's annual budget in scope)
Credible evidence of evading employment or tax law in some way that, when framed by a hostile observer, looks "pretty bad": I do not expect sweatshops, but encouraging employees to violate the terms of visas or preferentially hiring donors in a way that can be made to sound scary.
Multiple stories of funding going to friends and personal contacts rather than "objectively better" candidates who did not have personal contacts.
Credible evidence that a moderately important claim they fundraised on continued to be propagated after it stopped being true or the evidence for it was much weaker than previously thought.
Maybe I am excessively cynical about what bad things happen at small charities, but this feels like a reasonable list to me. There may be other events of similar badness.
From what I can tell, Harris has impressively low name recognition and is fairly unpopular with voters. That doesn't mean that party elites won't object to an outside group sponsoring a candidate who doesn't have their blessing.
A few points.
There were (expensive, time-consuming, costly) efforts to get political allies, elect friendly candidates, etc. Then FTX collapsed. That would need to be rebuilt, first.
Presidential candidates are Big Deals. You get ones who are single-issue on climate, or maybe trade in years where that is particularly salient. There might be a Republican challenger this year who's notably pro-choice relative to current Republican policy positions.
On the Democratic side, challenging Biden is a way to make yourself Very Unpopular with party elites. Challenging Harris, if she is his chosen successor, would be That But Worse. It might be worth it, but there are serious costs.
On the Republican side, you need a candidate who can compete with DeSantis and Trump. A single issue that most people don't care about won't cut it. Generalized anti-tech sentiment, maybe?
With the same resources, it's probably easier and more effective to try to persuade candidates who are more successful.
You're massively underestimating your ROI, probably by an order of magnitude. $10 billion in charitable contributions per year, even with a very steep discount rate of 20%, would be an ROI of, not 18-fold, but closer to 90-fold (with a net present value of $50 billion). With a more reasonable discount rate of 10% (would have said 5%, but then the Fed happened), you're talking about 180-fold returns.
Of course, this falls apart under sufficiently short timelines.
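The arithmetic here is the standard perpetuity formula, NPV = B / r for an annual benefit B discounted at rate r. A minimal sketch, where the ~$555M campaign cost is a hypothetical back-solved from the 90-fold figure rather than a number from the original comment:

```python
def perpetuity_npv(annual_benefit: float, rate: float) -> float:
    """Net present value of a constant annual benefit continuing forever."""
    return annual_benefit / rate

benefit = 10e9    # $10B/year in charitable contributions
cost = 0.555e9    # hypothetical campaign cost implied by the 90-fold claim

for rate in (0.20, 0.10):
    npv = perpetuity_npv(benefit, rate)
    print(f"r={rate:.0%}: NPV=${npv / 1e9:.0f}B, roughly {npv / cost:.0f}-fold")
```

At 20% this gives a $50B NPV and about 90-fold returns; at 10%, $100B and about 180-fold, matching the figures above. Short timelines break the perpetuity assumption, which is the caveat in the comment.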
I don't think that any of those justify not sending either your questions or a writeup of the post to the org in advance. They have a public email address. It's at the bottom of their home page. I don't think it's a particularly excessive burden to send a copy once you're done and give them a week. Perhaps two if they apologize and ask for a bit more time. I understand why people might be suspicious at the moment, but forcing people to scramble while on vacation is not a good norm. As you say, this post clearly wasn't that time-sensitive. I don't think that the Forum should have taken your post down, but that's a much higher bar.
For comparison, when I posted a piece that was somewhat critical of CEA's admissions and transparency policies, it was after I had asked in a more private Slack channel and gotten an answer I was not satisfied with. You can see that they clarified that they did inform people, and that others chimed in to thank me for informing them with the post.
I am not speaking for the DoD, the US government, or any of my employers.
I think that your claim about technological inevitability is premised on the desire of states to regulate key technologies, sometimes mediated by public pressure. All of the examples listed were blocked for decades by regulation, sometimes supplemented with public fear, soft regulation, etc. That's fine so long as, say, governments don't consider advancements in the field a core national interest. The US and China do, and often in an explicitly securitized form.
Quoting CNAS:
China's leadership, including President Xi Jinping, believes that being at the forefront in AI technology is critical to the future of global military and economic power competition.
English-language coverage of the US tends to avoid such sweeping statements, because readers have more local context, because political disagreement is more public, and because readers expect it.
But the DoD in the most recent National Defense Strategy identified AI as a secondary priority. Trump and Biden identified it as an area to maintain and advance national leadership in. And, of course, with the US at the head they donât need to do as much in the way of directing people, since the existing system is delivering adequate results.
Convincing the two global superpowers not to develop militarily useful technology while tensions are rising would be a first in history.
That's not to say that we can't slow it down. But AI very much is inevitable if it is useful, and it seems like it will be very useful.
Someone might be out about being bi at an after-party with friends, but not want to see that detail being confirmed by a fact-checker for a national paper. This doesn't seem particularly unusual.
This isn't the only thing that could go wrong, but it's a straightforward example. Perhaps they don't want their full name blatantly linked to their online account. There are lots of reasons that people might want privacy. Unless your life is at risk, I would not assume that you have privacy from a journalist who isn't a personal friend unless they have an explicit commitment. I trust journalists who are also community members to not take harmful advantage of access.
Something that is sometimes not obvious to people not used to dealing with journalists is that off-the-record sometimes means "I can't officially tell you this, so please find another source who can corroborate it". It's not remotely the same thing as an expectation of privacy and good sense that one would have with a friend.
Without getting too far into the specifics, I think that this is a good attitude to have across a wide range of policy concerns, and that similar issues apply to other policy areas EAs are interested in.
Bay Area 2023. Will edit.
Some post-EAG thoughts on journalists
For context, CEA accepted at EAG Bay Area 2023 a journalist who has at times written critically of EA and individual EAs, and who is very much not a community member. I am deliberately not naming the journalist, because they haven't done anything wrong and I'm still trying to work out my own thoughts.
On one hand, "journalists who write nice things get to go to the events, journalists who write mean things get excluded" is at best ethically problematic. It's very, very, very normal: political campaigns do it, industry events do it, individuals do it. "Access journalism" is the norm more than it is the exception. But that doesn't mean that we should. One solution is to be very, very careful about maintaining the differentiation between "community member" and "critical or not". Dylan Matthews is straightforwardly an EA and has reported critically on a past EAG: if he were excluded for this I would be deeply concerned.
On the other hand, I think that, when hosting an EA event, an EA organization has certain obligations to the people at that event. One of them is protecting their safety and privacy. EAs who are journalists can, I think, generally be relied upon to be fair and to respect the privacy of individuals. That is not a trust I extend to journalists who are not community members: the linked example is particularly egregious, but tabloid reporting happens.
EAG is a gathering of community members. People go to advance their goals: see friends, network, be networked at, give advice, get advice, learn interesting things, and more. In a healthy movement, I think that EAGs should be a professional obligation, good for the individual, or fun for the individual. It doesn't have to be all of them, but it shouldn't harm them on any axis.
Someone might be out about being bi at an after-party with friends, but not want to see that detail being confirmed by a fact-checker for a national paper. This doesn't seem particularly unusual. They would be right to trust community members, but might not realize that there could be journalists at the after-party. Non-community journalists will not necessarily share norms about privacy or have particularly strong incentives to follow any norms that do exist.
On the gripping hand, it feels more than a little hypocritical to complain about the low quality of criticism of EA and also complain when a journalist wants to attend an EA event to get to know the movement better.
One thing I'm confident of is that I wish that this had been more clearly disclosed. "This year we are excited to welcome X, who will be providing a critical view on EA" is good enough to at least warn people that someone whose bio says that they are interested in
how the wealthiest people in society spend their money or live their lives
(emphasis mine)
is attending.
I'm still trying to sort out the rest of my views here. Happy to take feedback. It's very possible that I'm missing some information about this.
P.S.
I have been told by someone at CEA that all attending journalists have agreed that everything at EAG is off the record by default. I don't consider this to be an adequate mitigating factor for accepting non-community journalists and not mentioning this to attendees or speakers.
P.P.S.
And no, I'm not using a pseudonym for this. I think that that is a bad and damaging trend on the Forum, and I don't, actually, believe that anyone at CEA will retaliate against me for posting this.
That seems almost aggressively misleading. "Some of this category of debt may have been held by these descendants, therefore it should have been invalidated", as you seem to be implying, proves far too much.
One point that occurs to me is that firms run by senior employees are reasonably common in white-collar professions: certainly not all of them, but many doctors function under this system, it's practically normative for lawyers, it operates in theory for university professors, and I believe it applies to a lesser extent to accountants and financiers. There is likely to be a managing partner, but that person serves with the consent of the senior partners.
A democracy to which new members must be voted in, socialized for a number of years, and buy in their own stake seems to have substantial advantages over one where everyone gets a vote the moment that they join. I also suspect that not understanding what they're engaged in as a political experiment is helpful for reducing certain types of distractions.
With that in mind, expanding coops among the white-collar elite seems relatively practical, and elite persuasion is always a powerful tool.