I have some ideas and drafts for posts to the EA Forums and Less Wrong that I’ve been sitting on because I feel somewhat intimidated by the level of intellectual rigor I would need to put into the final drafts to ensure I’m not downvoted into oblivion (particularly on Less Wrong, where a younger me experienced exactly that in the early days).
Should I try to overcome this fear, or is it justified?
For the EA Forums, I was thinking about explaining my personal practical take on moral philosophy (Eudaimonic Utilitarianism with Kantian Priors), but I don’t know if that’s actually worth explaining given that EA tries to be inclusive and not take particular stands on morality, and it might not be relevant enough to the forum.
For Less Wrong, I have a draft of a response to Eliezer’s List of Lethalities post that I’ve been sitting on since 2022/04/11 because I doubted it would be well received: it tries to be hopeful and, as a former machine learning scientist, I try to challenge a lot of LW orthodoxy about AGI in it. I have tremendous respect for Eliezer, though, so I’m also uncertain whether my ideas and arguments are just harebrained foolishness that will be shot down rapidly once exposed to the real world and the incisive criticism of Less Wrongers.
The posts in both places are also now of such high quality that I feel the bar is too high for me to meet with my writing, which tends to be more “interesting train-of-thought in unformatted paragraphs” than the “point-by-point articulate with section titles and footnotes” style that people in both places tend to employ.
Anyone have any thoughts?
As someone who spends most of my time here critiquing EA/rationalist orthodoxy, I don’t think you have much to worry about beyond annoying comments. A good-faith critique presented politely is rarely downvoted.
Also, I feel like there’s selection bias going on around the quality of posts. The best, super highly upvoted posts may be extremely high quality, but there are still plenty of posts that aren’t (and that’s fine, this is an open forum, not an academic journal).
I’d be interested in reading your List of Lethalities response. I’m not sure it would be that badly received; for example, this response by Quintin Pope got 360 upvotes. List of Lethalities seems to represent a fringe view even among AI x-risk researchers, let alone the wider machine learning community.
This is one reason why it’s very common for people to write a Google doc first, share it around, update it based on feedback and then post. But this only works if you know enough people who are willing to give you feedback.
An additional option: if you don’t know people who are willing to review a document and give you feedback, you could ask people in the Effective Altruism Editing and Review Facebook group to review it.
On this Forum, it is rather rare for good-faith posts to end up with net negative karma. The “worst” reasonably likely outcome is to get very little engagement with your post, which is still more engagement than it will get in your drafts folder. I can’t speak to LW, though.
I also think the appropriate reference point is not the median post here today, but the range of first posts from people who have since developed into recognized, successful posters.
From your description, my only concern would be whether your post sufficiently relates to EA. If it’s ~80-90 percent a philosophy piece, maybe there’s a better outlet for it. If it’s ~50-70 percent, maybe it would work here with a brief summary of the philosophical position up front and an internal link for readers who want to jump straight to the EA-relevant content?
I encourage you to share your ideas. I’ve often felt a similar “my thoughts aren’t valuable enough to share” feeling. I tend to write these thoughts as a quick take rather than as a normal forum post, and I try to phrase them in a way that signals I’m offering rough thoughts, observations, or something similarly non-rigorous, so the reader knows not to evaluate them by the same standard.
Either it’ll be received well, or you get free criticism on your ideas, or a blend of the two. You win in all cases. If it gets downvoted into oblivion you can always delete it; how many deleted posts can you tie to an author? I can’t name one.
Ultimately, nobody cares about you (or me, or any other random forum user). They’re too busy worrying about how they’ll be perceived. This is a blessing. You can take risks and nobody will really care if you fail.
Either it’ll be received well, or you get free criticism on your ideas, or a blend of the two.
A tough pill for super-sensitives like me to swallow, but I can see it as an exceptionally powerful one. I certainly sympathize with OP on the fear of being downvoted (it’s what kept me away from this site for months and from Reddit entirely), but valid criticism has often influenced me for the better, even when I resented it in the moment. Maybe the sting of being wrong will lessen someday, maybe not, but knowing why I’m wrong can serve me well in the end; I can admit that.
My view is you should write/post something if you believe it’s an idea that people haven’t sufficiently engaged with in the past. Both of your post ideas sound like that to me.
If you have expertise on AI, don’t be shy about showing it. If you aren’t confident, you can frame your critiques as pointed questions, but personally I think it’s better to just make your argument.
As for style, I think people will respond much better to your argument if it’s clear. Clear is different from extensive; I think your example of many-sections-with-titles-and-footnotes conflates the two. That format is valuable for giving structure to your argument, not for making the argument so extensive that it covers every possible angle. I agree that “interesting train of thought in unformatted paragraphs” likely won’t be received well in either venue. It’s good communication courtesy to make your ideas clear to the people you’re trying to convey them to. Clear structure is your friend, not a bouncer keeping you out of the club.
A short form/quick take can be a good compromise, and a source of feedback for later versions.
Post links to Google Docs as quick takes if posting full posts feels like too high a bar?