Your second statement is basically right, though my personal view is they impose costs on the movement/EA brand and not just us personally. … I hope to see everything funded by a more diverse group of actors, so that their dollar and non-dollar costs are more distributed.
Do you think that these “PR” costs would be mitigated if there were more large (perhaps more obscure) donors? Also, do you think that “weird” stuff like artificial sentience should be funded at all, or just not by Good Ventures?
Yes, I’m explicitly pro-funding by others. Framing the costs as “PR” limits the way people think about mitigating costs. It’s not just “lower risk” but more shared responsibility and energy to engage with decision making, persuading, defending, etc.
@Dustin Moskovitz I think some of the confusion is resulting from this:
Your second statement is basically right, though my personal view is they impose costs on the movement/EA brand and not just us personally. Digital minds work, for example, primes the idea that our AI safety concerns are focused on consciousness-driven catalysts (“Terminator scenarios”), when in reality that is just one of a wide variety of ways AI can result in catastrophe.
In my reading of the thread, you first said “yeah, basically I think a lot of these funding changes are based on reputational risk to me and to the broader EA movement.”
Then, people started challenging things like “how much should reputational risk to the EA movement matter, and what really are the second-order effects of things like digital minds research?”
Then, I was expecting you to just say something like “yeah, we probably disagree on the importance of reputation and second-order effects.”
But instead, it feels (to me) like you kind of backtracked and said “no, actually, it’s not really about reputation. It’s more about limited capacity: we have finite energy, attention, stress, etc. Also shared responsibility.”
It’s plausible that I’m misunderstanding something, but it felt (at least to me) like your earlier message made it seem like PR/reputation was the central factor, and your later messages made it seem like it’s more about limited capacity/energy. These feel like two pretty different rationales, so it might be helpful for you to clarify which one is more influential (or present a clearer synthesis of the two).
(Also, I don’t think you necessarily owe the EAF an explanation; it’s your money, etc.)
>> In my reading of the thread, you first said “yeah, basically I think a lot of these funding changes are based on reputational risk to me and to the broader EA movement.”
I agree people are paraphrasing me like this. Let’s go back to the quote I affirmed: “Separately, my guess is one of the key dimensions on which Dustin/Cari have strong opinions here are things that affect Dustin and Cari’s public reputation in an adverse way, or are generally ‘weird’ in a way that might impose more costs on Dustin and Cari.”
I read the part after “or” as extending the frame beyond reputation risks, and I was pleased to see that and chose to engage with it. The example in my comment is not about reputation. Later comments from Oliver seem to imply he really did mean just PR risk, so I was wrong to affirm this.
If you look at my comments here and in my post, I’ve elaborated on other issues quite a few times, and people keep ignoring those comments and projecting “PR risk” onto everything. I feel incapable of being heard correctly at this point, so I guess it was a mistake to speak up at all, and I’m going to stop now. [Sorry I got frustrated; everyone is trying their best to do the most good here.] I would appreciate it if people did not paraphrase me from these comments and instead used actual quotes.
I want to echo the other replies here, and thank you for how much you’ve already engaged on this post, although I can see why you want to stop now.
I did in fact round off what you were saying as being about PR risk yesterday, and I commented as such, and you replied to correct that, and I found that really helpful; I’m guessing a lot of others did too. I suppose if I had already understood, I wouldn’t have commented.
I’m not detailing specific decisions for the same reason I want to invest in fewer focus areas: additional information is used as additional attack surface area. The attitude in EA communities is “give an inch, fight a mile”. So I’ll choose to be less legible instead.
At the risk of overstepping or stating the obvious:
It seems to me like there’s been less legibility lately, and I think that means that a lot more confusion brews under the surface. So more stuff boils up when there is actually an outlet.
That’s definitely not your responsibility, and it’s particularly awkward if you end up taking the brunt of it by actually stepping forward to engage. But from my perspective, your engaging here has been good in most regards, with the notable exception that it might have left you more wary of engaging in future.
I read the part after “or” as extending the frame beyond reputation risks, and I was pleased to see that and chose to engage with it.
Ah, gotcha. This makes sense; thanks for the clarification.
If you look at my comments here and in my post, I’ve elaborated on other issues quite a few times and people keep ignoring those comments and projecting “PR risk” onto everything.
I’ve looked over the comments here a few times, and I suspect you might think you’re coming off more clearly than you actually are. It’s plausible to me that, since you have all the context of your decision-making, you don’t see when you’re saying things that would genuinely confuse others.
For example, even in the statement you affirmed, I can see how, if one is paying attention to the “or”, one could read you as technically only/primarily endorsing the non-PR part of the phrase.
But in general, I think it’s pretty reasonable and expected that people ended up focusing on the PR part.
More broadly, I think some of your statements have been kind of short and open to many interpretations. E.g., I don’t get a clear sense of what you mean by this:
It’s not just “lower risk” but more shared responsibility and energy to engage with decision making, persuading, defending, etc.
I think it’s reasonable for you to stop engaging here. Communication is hard and costly, misinterpretations are common and drain energy, etc. Just noting that, from my POV, this is less of a case of “people were interpreting you uncharitably” and more of a case of “it was/is genuinely kind of hard to tell what you believe, and I suspect people are mostly engaging in good faith here.”
I feel incapable of being heard correctly at this point, so I guess it was a mistake to speak up at all, and I’m going to stop now.
Sorry to hear that. Several people I’ve spoken to about this offline also feel that you are being open and agreeable, and the discussion reads from the outside as fairly civil. So, except perhaps for the potential heat of this exchange with Ollie, I’d say most people get it and are happy you participated, particularly given that you didn’t need to. For myself, the bulk of my concern is with how I perceive OP to have handled this, given their place in the EA community, rather than my personal (and irrelevant) partial disagreement with your personal funding decisions.
[edited to add “partial” in the last sentence]
I feel incapable of being heard correctly at this point, so I guess it was a mistake to speak up at all, and I’m going to stop now.
Noooo, sorry you feel that way. T_T I think you sharing your thinking here is really helpful for the broader EA and good-doer field, and I think it’s an unfortunate pattern that online communication quickly feels (or even is) somewhat exhausting and combative.
Just an idea: maybe you would have a much better time doing an interview with, e.g., Spencer Greenberg on his Clearer Thinking podcast, or Robert Wiblin on the 80,000 Hours podcast? I feel like they are pretty good interviewers who ask good questions that make for accurate and informative interviews.
To be clear, I definitely didn’t just mean PR risks! (Or rather, I meant them in a broad way that includes lots of the other things you talk about.) I tried to be quite mindful of that in, for example, my latest comment.
Can you give an example of a non-PR risk that you had in mind?