I found a lot of this post disconcerting because of how often you linked to LessWrong posts, even when doing so didn’t add anything. I think it would be better if you didn’t rely on LW concepts so much and just say what you want to say without making outside references.
I mulled over this article for quite a while before posting it, and that included pruning many hyperlinks I deemed unnecessary. Of course, the links that remain are meant to make the article more concise, not more opaque, so what you say is unfortunate to read. I would be interested in some specific examples of links or idiosyncratic language that don’t add value to, or subtract value from, the article.
It sure isn’t good if I’m coming off as a crank, though. I consider the points in this article very important.
Specific examples:
Linking to the Wikipedia pages for effective altruism, existential risk, etc. is unnecessary because almost all of your audience will be familiar with these terms.
For lots of your links, I had no problem understanding what you meant without reading the associated LW post.
You used a lot of LW jargon where you could have phrased things differently to avoid it: “dissolve the question”, “disguised queries”, “taboo”, “confidence levels outside of an argument”.
Lots of your links were tangential or just didn’t add anything to what you already said: “a wise outsider”, your three links for “save the world”, “the commonly used definition”, “you can arrive at true beliefs...”, “but they took the risk of riding...”, “useless sentiment”, “and it’s okay”.
I believe the following links were fine and you could leave them in: “mind-killed”, “eschatology”, “a common interest of many causes”, “you can see malaria evaporating”, “Against Malaria Foundation” (although I’d link to the website rather than the Wikipedia page), and “Existential Strategy Research”. I’d remove all the others. You might want to remove some of these too: each of the links to LessWrong posts on this list is fine on its own, but you probably don’t want more than one or two links to the same website/author in an article of this length. Hope that helps.
You can replace LW jargon with what the jargon represents (in LW jargon, “replace the symbol with the substance”):
For one example, instead of saying:
I’m not that familiar with the EA community, but I predict that debates about cause prioritization, especially when existential risk mitigation is among the causes being discussed, can become mind-killed extremely quickly. And I don’t mean to convey that in the tone of a wise outsider. It makes sense, considering the stakes at hand and the eschatological undertones of existential risk. (That is to say that the phrase ‘save the world’ can be sobering or gross, depending on the individual.) So, as is always implicit, but is sometimes worth making explicit, I’m criticizing some arguments as I understand them, not any person. I write this precisely because rationality is a common interest of many causes. I’ll be focusing on the part about existential risk, as well as the parts that it is dependent upon. Lastly, I’d be interested in knowing if anyone else has criticized this speech in writing or come to conclusions similar to mine. Without further ado:
Say:
I’m not that familiar with the EA community, but I predict that debates about cause prioritization, especially when existential risk mitigation is among the causes being discussed, can become the kinds of conversations where biases make it too hard to have a discussion based just on the facts. And I don’t mean to convey that in the tone of someone outside the EA movement trying to appear smart. It makes sense, considering the stakes at hand and the connections between existential risk and weird beliefs of “life after death”. (That is to say that the phrase ‘save the world’ can be sobering or gross, depending on the individual.) So, as is always implicit, but is sometimes worth making explicit, I’m criticizing some arguments as I understand them, not any person. I write this precisely because having more rationality is important for advancing every EA cause. I’ll be focusing on the part about existential risk, as well as the parts that it is dependent upon. Lastly, I’d be interested in knowing if anyone else has criticized this speech in writing or come to conclusions similar to mine. Without further ado:
Thank you.