As CEO, I [Alex] work more closely with Cari Tuna and Dustin Moskovitz [@Dustin Moskovitz], our primary funders and founding board members, than I had in the past. Dustin and especially Cari were very involved at the founding of Open Philanthropy — our grant approval process in the very early days was an email to Cari. But their level of day-to-day involvement has ebbed and flowed over time. Cari, in particular, has recently had more appetite to engage, which I’m excited about because I find her to be a wise and thoughtful board president and compelling champion for Open Philanthropy and our work. Dustin has also been thinking more about philanthropy and moral uncertainty recently, as reflected in this essay he posted last month.
It’s worth noting that their higher level of engagement means that some decisions that would have been made autonomously by our staff in the recent past (but not in the early days of the organization) will now reflect input from Cari and Dustin. Fundamentally, it has always been the case that Open Philanthropy recommends grants; we’re not a foundation and do not ultimately control the distribution of Cari and Dustin’s personal resources, though of course they are deeply attentive to our advice and we all expect that to continue to be the case. All things considered, I think Cari and Dustin have both managed to be involved while also offering an appropriate — and very welcome — level of deference to staff, and I expect that to continue.
Could you elaborate on the influence of Cari and Dustin on your grantmaking (see what I have highlighted above), ideally by giving concrete examples?
(Mostly agreeing)
I feel like:
1. I haven’t seen any writing about how disagreements between Dustin/Cari, and other OP execs, have changed priorities. (Or how other “political” considerations changed priorities)
2. I’m sure that there have been disagreements between them, that have changed priorities.
3. I would naively expect many organizations to sweep "changes made just because some other exec wanted them" under the rug of some other reason, like, "Well, we have a lot of uncertainty on this topic." Likewise, I don't trust the reasons that many organizations give for many of their non-obvious high-level decisions.
Therefore, I think it's pretty natural to conclude that there's probably something funky going on, as I'd similarly expect for other institutions: that some or many of the reasons for high-level decisions are political instead of epistemic.
I'd similarly assume that many high-level people at OP would like to signal these differences, but it would be difficult for them to do so (as is usually the case), so they wouldn't mind EAs drawing conclusions like this.
That said—if it is the case that there are important political reasons for things, I think it would be really useful if people at OP could signal that more, in some fashion.
Like, "Again, we want to remind people that many of our high-level assessments are made in ways specific to the opinions of Dustin, Cari, and OP execs, often opinions we expect other EAs would disagree with. Many of these opinions are private. So please don't assume that the conclusions we reach should mirror the ones that others would reach on their own."
I've heard from a few people who have taken some of OP's high-level prioritization far too seriously as a conclusive epistemic take, in my opinion. Like, "OP has split its top-level budget this way, so I assume that I'd reach the same conclusion for my own spending or time."
I wrote at length about my views on epistemic confidence here: https://medium.com/@moskov/works-in-progress-the-long-journey-to-doing-good-better-9dfb68e50868
Kudos for commenting here, and in the rest of this thread!
Just FYI, I find your comments in threads like these a lot more informative than blog posts like the one you linked to.
I think that blog post is reasonable, but it is fairly high-level, and I find that the devil is typically in the details. I feel like I've seen other people, both good and bad, post high-level epistemic takes that seemed good to me, so I'm just not sure how much I can take away from posts like that specifically.
But comments responding to questions and explaining specific decisions are something I find quite useful!
I’m not detailing specific decisions for the same reason I want to invest in fewer focus areas: additional information is used as additional attack surface area. The attitude in EA communities is “give an inch, fight a mile”. So I’ll choose to be less legible instead.
As a datapoint (which you can completely ignore), I feel like in the circles I travel in, I've heard a lot more criticisms of OP that look like "shady non-transparent group that makes huge decisions/mistakes without consulting anyone except a few Trusted People who all share the same opinions."
There are certainly some cases in which the attack surface is increased when you’re fully open/transparent about reasoning.
But I do think it can be easy to underestimate the amount of reputational damage that OP (and you, by extension) take from being less legible/transparent. I think there's a serious risk that many subgroups in EA will continue to feel more critical of OP as it becomes clearer that OP is not interested in explaining its reasoning to the broader community, becomes more insular, etc. I also suspect this will have a meaningful effect on how OP is perceived in non-EA circles. I don't mean e/accs being like "OP are evil doomers who want to give our future to China"; I mean neutral third parties who dispassionately try to form an impression of OP. When they encounter arguments like "well, OP is just another shady billionaire-funded thing that is beholden to a very small group of people who end up deciding things in non-transparent and illegible ways, and those decisions sometimes produce pretty large-scale failures," I expect that they will find these concerns pretty credible.
Caveating that not all of these concerns would go away with more transparency and that I do generally buy that more transparency will (in some cases) lead to a net increase on the attack surface. The tradeoffs here seem quite difficult.
But my own opinion is that OP has shifted too far in the “worry a lot about PR in the conventional sense” direction in ways that have not only led to less funding for important projects but also led to a corresponding reduction in reputation/status/prestige, both within and outside of EA circles.
Thanks for explaining your position here!
(Again, feel free to stop this at any time)
> The attitude in EA communities is “give an inch, fight a mile”. So I’ll choose to be less legible instead.
I assume you meant that EAs are the ones fighting OP, with things like poor comments?[1]
If you feel that way, that seems intensely bad, for both you and the rest of the community.
I’m obviously saddened that reactions from this community seem to have been so frustrating. It also generally seems unhealthy to have an epistemic environment like this.
I’m really curious about ways to make these communications go better. My impression is that:
1. Communication is very important.
2. Most (though not all) people in OP+EA are very reasonable and altruistic.
3. It is often the case that posting important content on the EA Forum is unpleasant. Sometimes the most upset people might respond, for instance, and upvotes/downvotes can be scary. I know a lot of people who very much avoid this.
This doesn't seem like it should be an impossible knot to me. Even if it cost $10M+ to hire some professional business coaches or something, that seems very likely worth it to me. We could move key conversations to whatever platforms might be smoother (conversations, podcasts, AMAs, private or public, etc.).
I’m not sure if I could be useful here, but definitely happy to help if there are ideas. If there are suggestions you have for me or community members to make things go better, I’d be eager.
I personally strongly care about communication between these parties feeling open and not adversarial—I like and value both these groups (OP + EA) a lot, and it really seems like a pity (a huge amount of EV lost) if communication is seriously strained.
[1] (It could also mean anti-EA external journalists doing the fighting, with blog posts.)
Nice points, Ozzie! For reference, Alex wrote:
I asked Alex about the above 3 months ago:
There was no answer.