$28,000 to print hardback copies of fanfiction. $20,000 to someone who was feeling “burnt out” so they can learn to ride a bike (an actual measure of success from an adult in a grant application!) and be unemployed. $30,000 to someone’s friend on the basis they are good at “Facilitating conversations”. $39,000 to make unsuccessful youtube videos. And these are “a strong set of grants” according to the top upvoted post. Wow.
I think this comment, while quite rude, does get at something valuable. There’s an argument that goes “hmm, the outside view says this is absurd, we should be really sure of our inside view before proceeding” and I think that’s sometimes a bit of a neglected perspective in rationalist/EA spaces.
I happen to know that the inside view on HPMoR bringing people into the community is very strong, and that the inside view on Eli Tyre doing good and important work is also very strong. I’m less familiar with the details behind the other grants that anoneaagain highlighted, but I do think that being aware of and recognizing the… unorthodoxy of these proposals is important, even if the inside view does end up overriding that.
I think there is something going on in this comment that I wouldn’t put in the category of “outside view”. Instead I would put it in the category of “perceiving something as intuitively weird, and reacting to it”.
I think weirdness is overall a pretty bad predictor of impact, in both the positive and the negative direction. I think it’s a good emotion to pay attention to, because often you can learn valuable things from it, but I think it only sometimes gives rise to real arguments for or against an idea.
It is also very susceptible to framing effects. The comment above says “$39,000 to make unsuccessful youtube videos”. That sure sounds naive and weird, but the whole argument relies on the word “unsuccessful” which is a pure framing device and fully unsubstantiated.
And, even though I think weirdness is only a mediocre predictor of impact, I am quite confident that the degree to which a grant or a grantee is perceived as intuitively weird by broad societal standards is still by far the biggest predictor of whether your project can receive a grant from any major EA granting body (I don’t think this is necessarily the fault of the granting bodies, but is instead a result of a variety of complicated social incentives that force their hand most of the time).
I think this has an incredibly negative effect on the ability of the Effective Altruism community to make progress on any of the big problems we care about, and I really don’t think we want to push further in that direction.
I think you want to pay attention to whether you perceive something as weird, but I don’t think that feeling should be among your top considerations when evaluating an idea or project, and I think right now it is usually the single biggest consideration in most discourse.
After chatting with you about this via PMs, I think you aren’t necessarily making that mistake, since I think you do emphasize that there are many arguments that could convince you that something weird is still a good idea.
I think in particular it is important for “something being perceived as weird is definitely not sufficient reason to dismiss it as an effective intervention” to be common knowledge and part of public discourse. The same goes for “if someone is doing something that looks weird to me, and I haven’t thought much about it or asked them much about their reasons for doing it, then that isn’t much evidence that what they are doing is a bad idea”.
I think there are two things going on here.
The first is that weirdness and the outside view are often deeply correlated, although they are not the same thing. In many ways the feeling of weirdness is a Schelling fence. It protects people from sociopaths, from joining cults, and from other things that are bad ideas even when they can’t quite articulate in words WHY they’re bad ideas.
I think you’re right that the best interventions will often be weird, so in this case it’s a Schelling fence that you have to ignore if you want to make any progress from an inside view… but it’s still worth noting that the weirdness is there and is good data.
The second thing going on is that it seems like many EA institutions have adopted the neoliberal strategy of gaining high status, infiltrating academia, and using that to advance EA values. From this perspective, it’s very important to avoid an aura of weirdness for the movement as a whole, even if any given individual weird intervention might have high impact. This is hard to talk about because being too loud about the strategy makes it less effective, which means that sometimes people have to say things like “outside view” when what they really mean is “you’re threatening our long-term strategy but we can’t talk about it.” Although obviously in this particular case the positive impact on this strategy outweighs the potential negative impact of the weirdness aura.
I feel comfortable stating this because it’s a random EA forum post and I’m not in a position of power at an EA org, but were I in that position, I’d feel much less comfortable posting this.
Downvoted for an unnecessarily unkind tone.
The main thing that pinged me about anoneaagain’s comment was that it’s saying things that aren’t true, and saying them in ways that aren’t epistemically cooperative, more so than that it’s merely unkind. If you’re going to assert ‘this person’s youtube videos are unsuccessful’, you should say what you mean by that and why you think it. If the thing you’re responding to is a long, skimmable 75-page post, you should make sure your readers didn’t miss the fact that the person you’re alluding to is a Computerphile contributor whose videos there tend to get hundreds of thousands of views, and you should say something about why that’s not relevant to your success metric (or to the broader goals LTFF should be focusing on).
Wink-and-nudge, connotation-based argument makes it hard to figure out what argument’s being made, which makes it hard to have a back-and-forth. If we strip away the connotation, it’s harder to see what’s laughable about ideas like “it can be useful to send people books” or “it can be useful to send people books that aren’t textbooks, essay collections, or works of original fiction”. Likewise, it doesn’t seem silly to me for people with disabilities to work on EA projects, or to suggest that disability treatment could be relevant to some EA projects. But I have no idea where to go from there in responding to anoneaagain, because the comment’s argument structure is hidden.
If you don’t mind me asking, what goal did you intend to achieve with this comment?