Project lead of LessWrong 2.0, often helping the EA Forum with various issues with the forum. If something is broken on the site, there’s a good chance it’s my fault (sorry!).
Habryka
Sure, I am not saying that EV should have gone back on some kind of promise.
But I generally expect that when I trade with leadership in this ecosystem, they will be held to a standard of cost-effectiveness and not a standard of looking good on optics grounds. And I think it continues to be good to judge EV and OP for making decisions on grounds that seem inconsistent with EA principles to me.
It shouldn’t cost them like an infinite amount of social capital, but I do think it makes the people a bit less suited to being in long-term EA leadership positions (i.e. if we had an election for EA leadership, the degree to which people optimize for optics instead of cost-effectiveness would hopefully be one of the central virtues on which to evaluate our leadership).
That reputational effects may have overruled a cost-effectiveness analysis that disregarded those effects in this particular case does not update me on the probability that EA is at risk of vanity and signaling “dominating,” or even playing a major role, in EA funding decisions writ large.
I think Open Philanthropy staff will transparently tell you that they have recently substantially shifted towards considering optics and reputation as a core component of their grants.
I agree that Wytham itself is only one datapoint so should not update you much, though I think if you are curious about this, it wouldn’t be too hard to confirm that there is a broader shift going on (to be clear, I wouldn’t consider it vanity in that case, though broader signaling concerns seem quite substantial).
and I see no basis for demanding that they continue to associate themselves with the “castle” if they do not wish to do so
I agree that the donors should feel free to disassociate themselves from whatever they want, though in this case how the castle is being handled is a decision by EV, the most central EA organization. Also, of course, if a donor chooses to disassociate like that, it’s within the rights of EA community members to think less of them (they might still think positively on-net, but highlighting how someone’s grantmaking ignores cost-effectiveness concerns in favor of personal reputation management is clearly a valid criticism, and should make you less excited about someone’s giving and also concerned about the secondary effects of their giving)
Finally, I’m more willing to weight optics on meta stuff than on object-level concerns
This seems like a mistake.
I agree that it would seem more legitimate to do things for optics-reasons, but the detrimental effects on incentives and ability to think that come from optics-focused decision-making are just as real in meta work as in object-level work. The reason to not do things for optics-reasons is of course not that people will see you as more legitimate if you don’t; that’s just another optics-concern. The reason is that it affects the incentives on people to do good work, sets up an adversarial epistemic environment, and generally makes decision-making predictably worse. I don’t see why we should make a different tradeoff on those axes for meta work, where figuring out how to have a positive impact is usually substantially harder and messier than in more clear-cut global health and development cases.
Sales for this kind of property would almost always have many months of notice, so I don’t think scheduling things 6 months in advance would deter potential buyers.
It would be pretty normal and standard for a buyer to have to wait 6 months before they can take possession of the property, so I don’t think this would matter that much. And my guess is 6 months is plenty of time for Wytham to provide most of its value.
To be clear, what I am criticizing here is not operating the venue while the sale is going on, or setting some kind of target for the operators in terms of quality-adjusted-events or estimates of counterfactual events caused, that would allow them to continue operating the venue.
I totally agree that observing someone spending money on a “vanity project” would be evidence that they are poorly run or corrupt, but Wytham would not be a vanity project if it made economic sense for EV or the EA community at large to operate. So whether a project is a vanity project depends on a cost-effectiveness analysis (which I don’t think has really occurred in this case).
Yes, totally possible. I am just specifically claiming that given that the cost of capital is one of the major expenses for this project, it would be surprising to me if it wasn’t worth the marginal cost of operating it on financial grounds, at least until some kind of buyer was found.
I am trying to make a pretty concrete claim about how I expect a benefit calculation to come out if done well, and definitely could be wrong (the thing that I have higher confidence in is that this decision wasn’t very sensitive to such a cost-benefit calculation and seems more driven by other factors).
Yep, I think that would be a reasonable calculation.
I mean it in the sense that they will have to sell substantially below market value if they want to sell it quickly.
This kind of property tends to have huge bid-ask spreads, and the usual thing to do is to continue operating the property while looking for a buyer (my guess is they would eventually succeed at selling it at market value, but it would take a while).
It’s plausible that it was an error in the initial reasoning for buying it, but CEA will additionally likely have to take a huge loss on selling it, and I think it’s unlikely that that makes sense from a cost-effectiveness standpoint.
My vague sense, partially from the Open Philanthropy update is that reputation management was the primary consideration here.
My understanding (based on talking to people involved in Wytham and knowing the economics of renting and buying large venues in a lot of detail) is that the sale of Wytham does not actually make any economic sense for EV in terms of its mission to do as much good as possible. It is plausible that the initial purchase was a mistake, but my understanding is that it will likely take many years for EV to sell, during which the venue will be basically completely empty, or the venue will have to be sold at a pretty huge loss. This means at this point, it’s likely worth it to keep it running.
Also, based on talking to some of the people close to these decisions, and trying to puzzle together how this decision was made, it seems very likely to me that the reason why Wytham is being sold is not based on a cost-effectiveness analysis, but is the result of a PR-management strategy, which seems antithetical to the principles of Effective Altruism to me.
EV (and Open Phil) are supposed to use its assets and funds to help the most people and cause the most good for the world, not to protect their own reputation. Making donations and major financial decisions primarily driven by reputation-concerns is the primary pathology of most of the world’s charity landscape, where vanity projects and complicated signaling games dominate where donations go, and going down this path seems to me a very worrying development for the future of EA.
My sense is that with this move, EV and Open Philanthropy have opened up a huge number of organizations within EA, even ones potentially producing enormous value, to attacks by any sufficiently large online mob, given that they have demonstrated they are willing to force the leadership of the EA community to give up projects with little concern for their cost-effectiveness if they do not align with the signaling aims of Open Phil and EV.
It is possible that someone made a cost-effectiveness analysis here that turned out negative, and if so I would love to see it, since it has large relevance to my work. But I would be extremely surprised if a positive cost-effectiveness analysis here would cause EV to reverse the sale of the property, and in conversations on this topic with people involved, curiosity and appetite for understanding the actual cost-effectiveness of this project seemed very low compared to its PR-implications.
(To be clear, I am not saying that we should fully blind ourselves to considerations of reputation and public relations. However, I think this kind of reputational optimization is perilous and is one of the domains where naive consequentialist-type reasoning most often goes awry.
I think our reputational strategy should primarily be oriented around acting with integrity and honesty. And on that dimension the central tenet of how the EA community has presented itself is that we make decisions on the basis of what we think will help the most people, and are very much not making decisions on the basis of what will look good to other people, or will put us personally in the most powerful positions.
Imagine GiveWell releasing their recommended charities saying “well, there was one charity that easily defeated AMF in terms of the cost-effectiveness of its program activities, but it was dealing with sanitation issues that are really gross and that nobody wants to donate to, and we expected that if we recommended it this would overall reduce the donations going through GiveWell. We thought this effect was big enough to cause us to decide to not recommend this charity as our top charity”. I think this would be crazy and clearly violate the principles that GiveWell set out according to which it compiles its recommendations. While weaker, I think something similar is going on in how this decision seems to have been made)
That is definitely relevant data! Looking at the recent dates (and hovering over the exact data at the link where the graphs are from), it looks like it’s around 60% logged-out, 40% logged-in.
I do notice I am surprised by this and kind of want confirmation from the EA Forum team that they are not doing some kind of filtering on traffic here. When I compare these numbers naively to the Google Analytics data I have access to for those dates, they seem about 20%-30% too low, which makes me think there is some filtering going on (though my guess is that 80%-90% logged-out traffic still would not seem representative)
This post captures some of my feelings for why I don’t think we should make exceptions for US elections:
https://www.benlandautaylor.com/p/the-four-year-locusts
See also:
https://www.lesswrong.com/posts/9weLK2AJ9JEt2Tt8f/politics-is-the-mind-killer
Hmm, my guess is by the time a system might succeed at takeover (i.e. has more than like a 5% chance of actually disempowering all of humanity permanently), I expect its behavior and thinking to be quite rational. I agree that there will probably be AIs taking reckless action earlier than that, but in as much as an AI is actually posing a risk of takeover, I do expect it to behave pretty rationally overall.
The only reason I didn’t (cross-)post this on LessWrong is that I’m not often on LessWrong and feel less able to judge what they’d welcome. Happy to take recommendations there too.
FWIW, the post would definitely be welcome on LW/the AI Alignment Forum.
Or put another way, would people engage differently if the forum was run on stock software by a single sysadmin and some regular posters granted volunteer mod privileges?
Well, I mean it isn’t a perfect comparison, but we know roughly what that world looks like because we have both the LessWrong and OG EA Forum datapoints, and both point towards “the Forum gets on the order of 1/5th the usage” and in the case of LessWrong to “the Forum dies completely”.
I do think it goes better if you have at least one well-paid sysadmin, though I definitely wouldn’t remotely be able to do the job on my own.
Yep, totally, it’s a pretty bad proxy. I think the obvious analogy at least for the EA Forum would be that the organizations who are hiring people from the EA Forum are in a comparable position to advertisers, but it’s not amazing.
I am not sure what you mean by the first. Facebook makes almost all of its revenue with ads. It also does some stuff to do better ad-targeting, for which it uses cookies and does some cross-site tracking, which I do think drives up profit, though my guess is that isn’t responsible for a large fraction of the revenue (though I might be wrong here).
But that doesn’t feel super relevant here. The primary reason why I brought up FB is to establish a rough order-of-magnitude reference class for what normal costs and revenue numbers are associated with internet platforms for a primarily western educated audience.
My best guess is the EA Forum could probably also finance itself with subscriptions, ads and other monetization strategies at its current burn rate, based on these numbers, though I would be very surprised if that’s a good idea.
Yeah, that seems like the right comparison? Revenue is a proxy for value produced, so if you are arguing about whether something is worth funding philanthropically, revenue seems like the better comparison than costs. Though you can also look at costs, which I expect to not be more than a factor of 2 off.
$500/monthly user is actually pretty reasonable. As an example, Facebook revenue in the US is around $200/user/year, which is roughly in the same ballpark (and my guess is the value produced by the EA Forum for a user is higher than for the average Facebook user, though it’s messy since Facebook has such strong network effects).
Also, 4000 users is an underestimate since the majority of people benefit from the EA Forum while logged out (on LW about 10-20% of our traffic comes from logged-in users, my guess is the EA Forum is similar, but not confident), and even daily users are usually not logged in. So it’s more like $50-$100/user, which honestly seems quite reasonable to me.
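A quick sketch of the arithmetic above. The ~$2M annual budget is inferred from the stated $500/user figure and 4,000 logged-in users, and the 10-20% logged-in fraction is the LessWrong estimate applied to the EA Forum as an assumption, not a measured number:

```python
# Inferred annual budget: $500/user/year * 4,000 logged-in users.
annual_budget = 500 * 4000  # $2,000,000

logged_in_users = 4000

# Assumption (from the LessWrong analogy): logged-in users account for
# only 10-20% of traffic, so the total user base is roughly 5-10x larger.
for logged_in_fraction in (0.10, 0.20):
    total_users = logged_in_users / logged_in_fraction
    cost_per_user = annual_budget / total_users
    print(f"{logged_in_fraction:.0%} logged in -> ${cost_per_user:.0f}/user/year")
# -> 10% logged in -> $50/user/year
# -> 20% logged in -> $100/user/year
```

This is where the $50-$100/user range comes from: the two endpoints of the assumed logged-in fraction.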
No subreddit is free. If there is a great subreddit somewhere, it is probably the primary responsibility of at least one person. You can run things on volunteer labor but that doesn’t make them free. I would recommend against running a crucial piece of infrastructure for a professional community of 10,000+ people on volunteer labor.
Just as a piece of context, the EA Forum now has about 8x more active users than it had at the beginning of those few years. I think it’s uncertain how good growth of this type is, but it’s clear that the forum development had a large effect in (probably) the intended direction of the people who run the forum, and it seems weird to do an analysis of the costs and benefits of the EA Forum without acknowledging this very central fact.
(Data: https://data.centreforeffectivealtruism.org/)
I don’t have data readily available for the pre-CEA EA Forum days, but my guess is it had a very different growth curve (due to reaching the natural limit of the previous forum platform and not getting very much attention), similar to what LessWrong 1.0 was at before I started working on it.
The Washington Post article rings quite hollow. It claims that CEA and other EA organizations have taken FTX’s downfall as an opportunity for “reflection and institutional reform”, and cites the legal investigation CEA sponsored as evidence of this.
However, as far as I can tell, the primary goal of that legal investigation was mostly PR-related: trying to clear CEA’s name and prove to the outside world that no one knew the full extent of FTX’s fraud. It was not aimed at facilitating an internal reflection process (or at the very least, people I’ve talked to at CEA described it to me as something they did not at all expect to be helpful as part of a reflection process, and multiple described the constraints imposed by it as harmful).
If anything, the legal investigation seems to have substantially interfered with CEA’s reflection processes, with Will MacAskill himself telling me that the EV board prevented him from publishing his reflections on FTX due to legal and PR concerns, and due to it maybe making the legal investigation seem less legitimate.
Overall, the piece reads like a puff piece and not like something that displays real reflection. I mean, it’s fine to write puff pieces from time to time, but it rings hollow in a world where actual institutional reform as a result of FTX has been almost completely absent.