I don’t fully agree with this article. The main issue is that a lot of good research (especially that done by Rethink Priorities) is not specifically aimed at impact through a broad audience but rather at changing the opinions of a few specifically interested people with a lot of resources and influence (e.g., Open Phil staff). You mention “impact weighting” audience numbers but this is actually a much bigger point than you’re making it to be—an audience of ten people can very frequently be a lot more important than an audience of 10,000 if the audience of ten is carefully selected and targeted. For these audiences, I think the engagement tips you’d get from Copyblogger are much less important than being accurate, being well calibrated, choosing a useful question and answering it well, and practicing good reasoning transparency.
More generally, choosing the most impactful questions and answering them accurately and thoroughly is far more important than being engaging. Given that good research question selection and achieving strong accuracy and calibration are very difficult skills to find and no one is skilled at everything, I’d rather hire people who are better at choosing research questions and answering them—even if they do so in an unengaging way—than to hire people who are engaging (but less good at choosing research questions and answering them).
Some investments in engagement are pretty simple and easy to make. I think it’s particularly important to make strong summaries and make work skimmable, for example. Also more useful and easier than making writing more engaging is investing in distribution—for example, making sure to directly share your work with important people in your audience.
A last warning—while being engaging and being true are not particularly correlated, I do worry that people are more likely to believe falsehoods when they are presented in an engaging way.
That’s not to say that being more engaging isn’t important all else being equal. And obviously you need to at least meet a minimum bar of engagingness to make your research readable. And I do think it’s good for some people to focus on broad outreach, and these people really should focus on being as engaging as possible. It’s just that I don’t think being engaging is particularly important for EA research, and I think it’s overly sensational to call not focusing on engagement unethical.
I’ve changed the title of this post.

I did this for a number of reasons. First and foremost, I think that it hurt my case more than helped it. It put people on the defensive right from the get go and was harsh if you didn’t have the whole context of my background to know I was joking, which, of course, most readers wouldn’t. This is not the best way to be persuasive.
I think there are many ways to be engaging, and being controversial is just one way. However, I think being controversial is a risky and generally suboptimal way of getting people interested in something. It mostly just promotes polarization, which is counterproductive. I wouldn’t want EA to start doing more of this compared to all of the other ways you can be more engaging (like jokes, bullet points, smaller paragraphs, stories, pull quotes, pictures, etc).
In general, a lesson I pulled from this is to have more people look at my writing before I post. I had three people check it and one person expressed reservations about the title, but they didn’t push hard and it seemed like it was an idiosyncratic preference. In fact, some suggested even more controversial ones, like “Bad writing kills babies”.
If I’d had more people look at it, I would have been able to tell in advance that I should brainstorm more titles that were interesting, accurate, and persuasive. In my little social bubble where everybody knew me, they just thought that the title was hilarious, which led me to be a bit blindsided by the reaction.
Fortunately, this is the EA movement, and one of my favorite things about this community is that people appreciate when you update based on evidence. So, despite a rather large part of my brain that is saying “You said this thing publicly! Defend it to the ends of the earth”, I’d rather the earth not end, so I’ll update the title, and hopefully we can have a more engaging EA blogosphere while also keeping sensationalism out of it.
I think this should be applauded. Thanks so much for engaging with your critics and learning from your mistakes, as well as teaching the rest of us something important. It definitely makes you a good person, not a bad person.
I’ve changed the title of this post. [...] First and foremost, I think that it hurt my case more than helped it. It put people on the defensive right from the get go and was harsh if you didn’t have the whole context of my background to know I was joking [...] I think there are many ways to be engaging, and being controversial is just one way. However, I think being controversial is a risky and generally suboptimal way of getting people interested in something. It mostly just promotes polarization, which is counterproductive.
I think this is a really important lesson. Optimizing for clickthroughs is not the same as optimizing for impact, and the former can be dangerous.
In general, a lesson I pulled from this is to have more people look at my writing before I post. I had three people check it and one person expressed reservations about the title, but they didn’t push hard and it seemed like it was an idiosyncratic preference. [...] In my little social bubble where everybody knew me, they just thought that the title was hilarious, which led me to be a bit blindsided by the reaction.
Another important lesson seems to be to have the post read by one or two people who aren’t that familiar with you.
In fact, some suggested even more controversial ones, like “Bad writing kills babies”.
For some reason, this title feels over the top to the point where it does make it clear you are kidding. It’s also memorable. But maybe it would backfire with other people—I don’t know.
I agree with this comment, especially this part:

A last warning—while being engaging and being true are not particularly correlated, I do worry that people are more likely to believe falsehoods when they are presented in an engaging way.
I want to spend a large fraction of my reading time asking “wait, is this true, actually?” Many ways of making posts more engaging make it harder for me to maintain this vigilance. This includes humor, even when it is devoid of sarcasm and mockery.
Jokes, flourishes, and especially emotions often make a post’s case seem stronger to me than it actually is, even when the substance of the post contains nuance, e.g. in the form of an epistemic status. I have noticed this in writing I otherwise often find useful and insightful, such as Eliezer Yudkowsky’s, Zvi’s, or Gregory Lewis’.
I completely agree that it depends on the intended audience.
I think for RP, you’re often researching a particular question for a very particular small audience that is more or less guaranteed to read your results. It actually is far more similar to school than most EA Forum writing. In such cases (and indeed all cases), definitely cater it to your audience.
One thing I think I should’ve made more clear in my comment was that, as far as I can tell and at least for right now, it is typically better for the marginal EA to invest in “find a small, powerful niche audience (i.e., writing for 10-100 people) and cater your content specifically to them” than to invest in broad outreach (i.e., writing for >5000 people). I think it is easier to do the former (at least within EA) and that, impact-weighted, you often achieve more impactful results.
Personal fit and interest, though, would be very important considerations, and I definitely endorse those who are more interested in and skilled at broad outreach doing that. I certainly wouldn’t tell Robert Miles or Scott Alexander to quit their broad outreach work!
I agree with most of this (Note: Peter is my manager).
You mention “impact weighting” audience numbers but this is actually a much bigger point than you’re making it to be—an audience of ten people can very frequently be a lot more important than an audience of 10,000 if the audience of ten is carefully selected and targeted.
I think if anything this understates things. I think my most impactful reports at RP have most of their impact come from improving the decision quality of <5 people, and most of those <5 people are ones we’ve ex ante identified well in advance (i.e., whoever commissioned the report).
From the article:
One might make the case that the highest impact audience to target usually are those high achievers we all know and want to hate but can’t. Those terrible humans who seem to be in perfect control of their lives, who work 80 hours on a treadmill desk eating only the healthiest foods, whose idea of a vacation is a 10-day Vipassana retreat. Those sorts of people won’t care if it’s written in a dry style, and since it’s a power law of impact, then it mostly matters how these people respond.
The first argument against this is that even amongst the most conscientious people[...]
The second argument is that a lot of the highest impact people are, in fact, human
I understand that the article is exaggerating for comedic effect, but I think there’s an important reasoning slip. Namely, people with lots of decision-making power in EA are both trained and selected heavily for their ability to read lots of dense arguments and come to a reasoned conclusion, and only moderately selected for general conscientiousness. So I would expect, e.g., the top ~100 EAs in decision-making power to be much higher than average in revealed ability/inclination to read fairly dense arguments compared to, e.g., the typical EAF reader, and only moderately or slightly higher in propensity to eat healthy foods or exercise regularly.
So in that regard, being more engaging in the sense that willbradshaw defines it as “reducing mental effort per unit information transferred”[1] to the target audience is great, but broad engagement isn’t as worthwhile.
That said, I think you (Peter) underestimate the value of broad (or relatively broad, e.g. “to highly educated Westerners”) engagement in improving the value of our own thoughts via public feedback, particularly in fields where EA does not have many of the relevant domain experts. For example, some of the comments on Neil’s and my summary of cultured meat TEAs were mildly helpful for us, as non-experts wading into and attempting to come to a reasoned conclusion on a deeply technical field, and I imagine it would be moderately helpful in improving our judgement if we got 10x the comments drawn from the same distribution.
In addition, I do think we are currently underutilizing resources for communicating ideas better/more efficiently to key stakeholders, though it appears that there are plans in the works for RP to be better at this in 2022.
[1] Which is NOT, as I noted, how I would naturally define “interesting” or “engaging.”

FWIW I broadly agree with Peter here (more so than the original post).
I largely agree with your points, particularly the idea that certain audiences have different preferences from the ‘general public’ and that rigor matters more than engagement in research.
However, my main takeaway from your comment is that research is an extremely broad term and can be relevant in many different contexts, some of which would benefit from a more engaging communication style (e.g., lobbying). So, whether comms for EA research would benefit from being more engaging really depends on the context in which that research will be communicated.
On that note, I’d like clarification on the sentence below:
“I’d rather hire people who are better at choosing research questions and answering them—even if they do so in an unengaging way—than to hire people who are engaging.”
Are you referring specifically to research roles or to any role (including comms-specific roles)?
research is an extremely broad term and can be relevant in many different contexts, some of which would benefit from a more engaging communication style (e.g., lobbying). So, whether comms for EA research would benefit from being more engaging really depends on the context in which that research will be communicated.
I agree.
“I’d rather hire people who are better at choosing research questions and answering them—even if they do so in an unengaging way—than to hire people who are engaging.”
Are you referring specifically to research roles or to any role (including comms-specific roles)?
I’m referring specifically to research roles (not comms roles) at Rethink Priorities, where we usually (though not always) aim to influence more insular EA-oriented actors and thus (typically) prioritize rigor over engagingness.

This comment I wrote is relevant to your comment too, as a follow-up to my other answer.
Yes! I strongly agree with your follow-up. I think that more EA orgs should invest in communications strategy, which typically looks very different from mass outreach (where engagingness is more important). Correspondingly, I think we need more EAs who understand comms as well as EAs who can do mass outreach.