I found a lot of this post disconcerting because of how often you linked to LessWrong posts, even when doing so didn’t add anything. I think it would be better if you didn’t rely on LW concepts so much and just say what you want to say without making outside references.
[I]magine that you are at some point on a long road, truly in the middle of nowhere, and you see a man whose car has a flat tire. You know that someone else may not drive by for hours, and you don’t know how well-prepared the man is for that eventuality. You consider stopping your car to help; you have a spare, you know how to change tires, and you’ve seen it work before. And if you don’t do it right the first time for some weird reason, you can always try again.
But suddenly, you notice that there is a person lying motionless on the ground, some ways down the road; far, but visible. There’s no cellphone service, it would take an ambulance hours to get here unless they happened to be driving by, and you have no medical training or experience.
I don’t know about you, but even if I’m having an extremely hard time thinking of things to do about a guy dying on my watch in the middle of nowhere, the last thing I do is say, “I have no idea what to do if I try to save that guy, but I know exactly how to change a tire, so why don’t I just change the tire instead.” Because even if I don’t know what to do, saving a life is so much more important than changing a tire that I don’t care about the uncertainty.

I really like this bit.

Thank you.
I found a lot of this post disconcerting because of how often you linked to LessWrong posts, even when doing so didn’t add anything. I think it would be better if you didn’t rely on LW concepts so much and just say what you want to say without making outside references.
I mulled over this article for quite a while before posting it, and this included the pruning of many hyperlinks deemed unnecessary. Of course, the links that remain are meant to produce a more concise article, not a more opaque one, so what you say is unfortunate to read. I would be interested in some specific examples of links or idiosyncratic language that either don’t add value to or subtract value from the article.
It sure isn’t good if I’m coming off as a crank though. I consider the points within this article very important.
Specific examples:

Linking to the Wikipedia pages for effective altruism, existential risk, etc. is unnecessary because almost all of your audience will be familiar with these terms.
For lots of your links, I had no problem understanding what you meant without reading the associated LW post.
You used a lot of LW jargon where you could have phrased things differently to avoid it: “dissolve the question”, “disguised queries”, “taboo”, “confidence levels outside of an argument”.
Lots of your links were tangential or just didn’t add anything to what you already said: “a wise outsider”, your three links for “save the world”, “the commonly used definition”, “you can arrive at true beliefs...”, “but they took the risk of riding...”, “useless sentiment”, “and it’s okay”.
I believe the following links were fine and you could leave them in: “mind-killed”, “eschatology”, “a common interest of many causes”, “you can see malaria evaporating”, “Against Malaria Foundation” (although I’d link to the website rather than the Wikipedia page), “Existential Strategy Research”. I’d remove all the others. Although you might want to remove some of these too—each of the links to LessWrong posts on this list is fine on its own, but you probably don’t want to have more than one or two links to the same website/author in an article of this length. Hope that helps.
You can replace LW jargon with what the jargon represents (in LW jargon, “replace the symbol with the substance”):
For one example, instead of saying:
I’m not that familiar with the EA community, but I predict that debates about cause prioritization, especially when existential risk mitigation is among the causes being discussed, can become mind-killed extremely quickly. And I don’t mean to convey that in the tone of a wise outsider. It makes sense, considering the stakes at hand and the eschatological undertones of existential risk. (That is to say that the phrase ‘save the world’ can be sobering or gross, depending on the individual.) So, as is always implicit, but is sometimes worth making explicit, I’m criticizing some arguments as I understand them, not any person. I write this precisely because rationality is a common interest of many causes. I’ll be focusing on the part about existential risk, as well as the parts that it is dependent upon. Lastly, I’d be interested in knowing if anyone else has criticized this speech in writing or come to conclusions similar to mine. Without further ado:
Say:
I’m not that familiar with the EA community, but I predict that debates about cause prioritization, especially when existential risk mitigation is among the causes being discussed, can become the kinds of conversations where biases make it too hard to have a discussion based just on the facts. And I don’t mean to convey that in the tone of someone outside the EA movement trying to appear smart. It makes sense, considering the stakes at hand and the connections between existential risk and weird beliefs of “life after death”. (That is to say that the phrase ‘save the world’ can be sobering or gross, depending on the individual.) So, as is always implicit, but is sometimes worth making explicit, I’m criticizing some arguments as I understand them, not any person. I write this precisely because having more rationality is important for advancing every EA cause. I’ll be focusing on the part about existential risk, as well as the parts that it is dependent upon. Lastly, I’d be interested in knowing if anyone else has criticized this speech in writing or come to conclusions similar to mine. Without further ado:
Seconded. Maybe it’s normal on LW, but rather than being merely disconcerting, it’s sort of worrying when people rely on an entire edifice of concepts derived from a rather controversial website.