AMA: Tom Chivers, science writer, science editor at UnHerd
Hi everyone! I’m Tom Chivers, and I’ll be doing an AMA here. I plan to start answering questions on Wednesday 17 March at 9am UK: I reckon I can comfortably spend three hours doing it, and if I can’t get through all the questions, I’ll try to find extra time.
Who I am: a science writer, and the science editor at UnHerd.com. I wrote a book, The Rationalist’s Guide to the Galaxy – originally titled The AI Does Not Hate You – in 2019, which is about the rationalist movement (and, therefore, the EA movement), and about AI risk and X-risk.
My next book, How to Read Numbers, written with my cousin David, who’s an economist, is about how stats get misrepresented in the news and what you can do to spot it when they are. It’s out on March 18.
Before going freelance in January 2018, I worked at the UK Daily Telegraph and BuzzFeed UK. I’ve won two “statistical excellence in journalism” awards from the Royal Statistical Society, and in 2013 Terry Pratchett told me I was “far too nice to be a journalist”.
Ask me anything you like, but I’m probably going to be best at answering questions about journalism.
If you were given several million dollars to structure a media vertical on “news that actually matters”*, what would you do differently from Vox’s Future Perfect?
*By EAish lights
Honestly I’m not sure. I think those guys do a good job. I might try to approach more rationalist/EA bloggers for external freelance pieces: I’d love to commission, say, Kevin Simler or Sarah Constantin or John Nerst to write things, rather than just do it all in-house. (But maybe they already do that and I haven’t noticed?)
Are there journalists or outlets you think EAs or rationalists should especially be following? Particularly ones who might not already be on our radar?
Hi Rob! Loved your post about politics being too meta.
OK so you’re all obviously aware of Kelsey Piper, Dylan Matthews etc. And probably Zeynep Tufekci.
In the UK, I think it’s worth paying attention to Ian Leslie, who is very interested in these topics (he’s a big fan of Julia Galef’s podcast, for instance). The (London) Times’s science correspondent, Tom Whipple, thinks in quite an EA-ish sort of way, I think.
People who are less obviously EA-ish but who make me think include Helen Lewis and Alex Hern – Alex, for instance, is quite a long way to the left of me but always thoughtful; I love his The World Is Yours* newsletter because it either explains tech stuff that I wouldn’t understand, or approaches things from quite left-wing angles in ways that make me think about stuff.
That’s entirely off the top of my head. If I think of any more while I’m writing other replies I’ll come back and add them.
It’s worth adding that Stephen Bush and Jeremy Cliffe at the New Statesman both do prediction posts and review them at the end of each year. The meme is spreading! They’re also two of the best journalists to follow on UK Labour politics (Bush) and EU politics (Cliffe) – if you’re interested in those topics, as I am.
https://www.newstatesman.com/politics/staggers/2020/12/what-i-got-right-and-wrong-2020
https://www.newstatesman.com/international/places/2020/12/january-i-made-ten-predictions-2020-how-did-they-turn-out
gah I’m annoyed I didn’t think of Stephen! A great journalist. I don’t know Jeremy’s work well but I’ve heard good things
Thanks, Tom. :) I’m interested to hear about reporters who aren’t “EA-ish” but are worth paying attention to anyway — I think sometimes EA’s blind spots arise from things that don’t have the EA “vibe” but that would come up in a search anyway if you just classified writers by “awesome”, “insightful”, “unusually rigorous and knowledgeable”, “getting at something important”, etc.
For people who missed my post: Politics Is Far Too Meta
This might be a bit personal, but how much of your writing do you consider to be under the bucket of “this is the most impactful work I could be doing right now”, versus work that you are assigned to do or have to do?
How do you choose what to write about, especially when it comes to shorter-form articles? How do you think independent writers/bloggers should decide what to write about?
What topics do you wish to do more writing or research on in the next 1-3 years, but haven’t yet?
1. this is a totally fair question! But … hmm. I am (these days) rarely “assigned” work, but obviously I do have to get my editors to agree to let me write things, and I can’t just write “buy antimalarial bednets” every week forever. That said, if I did, no one would read me, so I need to write interesting things that get an audience. So there’s a theoretical tradeoff between “say what’s important” and “say what will get read”, and the greatest impact means finding the right balance.
That said, in reality, it’s rarely the case that I have two brilliant ideas for a column in the same week and have to choose between them; it always feels like a miracle that I manage to have at least one. And since I don’t always have a really clear idea about what will have the most impact, I rarely feel like I’m writing something I don’t think matters.
(I guess when I do something fun, like this about Warhammer, it doesn’t “matter” in some sense that it probably won’t save lives. But I enjoyed writing it, and people enjoyed reading it, and maybe it gave a few people an idea for how to have a hobby. I doubt it was the most impactful thing I could have done, but I still feel it was net positive in the world just by cheering a few people up.)
2. It’s a conversation with the editors. I come up with ideas, or they suggest something (maybe pegged to some news piece, or a new book out, or whatever). Quite often I’ll say “I’d like to write about X” and the editors will be unconvinced by my pitch, and since that is literally the purpose of editors, I can’t really complain.
But the key thing is I have to find it interesting. I don’t mind whether I’m writing about sperm counts or deworming or stupid economists, but it needs to be something I enjoy learning about and can then enjoy telling other people about, even if I’m telling people that this is stupid and you don’t need to worry about it. I think that would be my main recommendation to bloggers/independent writers too: if you aren’t interested, don’t write about it, because it will be obvious, and no one will read it. (And you’ll be bored.)
3. Hmm. You know, this is really hard to answer! I have been lucky in that I’ve almost always been able to write about what I’m interested in, and what I’m interested in shifts over time: I went through a phase a few years ago of being fascinated with linguistics, for instance. Now I’m super obsessed with the replication crisis and statistics. So I have no idea what I’ll be fascinated by in three years’ time, and the things I’m fascinated by now, I can usually write about without too much difficulty. This is unusual in journalism and I am super lucky.
Sorry, this is a really rambly answer; I hope it makes some sense.
I managed to press some button there; I wanted to include this paragraph somewhere:
I suppose that it’s worth being clear that I am mainly an opinion writer; I’ve done news reporting in the past, but my main job is commenting/analysing things in the news, so it’s not that I’m usually out there huntin’ down leads and meeting anonymous sources in underground car parks. I don’t know if the distinction between reporting and comment is really obvious to outsiders.
Why don’t more journalists make concrete, verifiable, quantitative forecasts and then retrospectively assess their own accuracy, like you did here (also see more examples)? Is there anything that could be done to encourage you and other journalists to do more of that?
Ah man! I have THOUGHTS about this.
So. First up, I have indeed made a few concrete forecasts, and it is worth noting that they absolutely stink. I don’t know what my Brier score is but it will definitely suck. That does not exactly make me want to do them more, because it’s a bit embarrassing, although there is a nice virtuous feeling when you hold your hand up and say “I got another one wrong”. And the EA/rationalist community is really good at socially rewarding that behaviour, and being on the fringes of the community I do get some good social feedback for doing it, so it’s not too cringey.
But it’s related to another problem, which is that when I make forecasts, it’s not the day job. I write some article, say, talking about the incentive problems in vaccine manufacturing. Do I stick a forecast on the end of that? Well, I could – “I think it is 60% likely that, I dunno, Covax will use advanced market commitment structures to purchase vaccines by the end of 2021”. But it’s kind of artificial. I’m just whacking it on the end of the piece.
And it also means they will, usually, suck. I know a few superforecasters, and I gather that one of the best predictors of the accuracy of a forecast is how long you spend making it. If I’ve just spent a day writing a piece, interviewing scientists or whatever, and my deadline is 5pm, then I won’t be able to spend much time doing a good forecast. It’s not what I’m being paid for and it’s not what the readers want from me.
I do think it’s valuable, and it means that I have to think carefully about what I actually mean when I say “it’s likely that schools will reopen in May” or whatever. So I try to do it. And sometimes pieces are more about forecasting, and they lend themselves more naturally to concrete predictions (although the problem of me doing it quickly, having spent most of my time chasing interviewees and writing the piece, is still there). I’ll definitely try to keep doing it. But I think the value isn’t always as huge as EAs/forecasters think, or in fact as huge as I used to think before I tried doing them more, so I understand journalists not being super interested. I hope more start doing it, but I doubt it will ever be a standard procedure in every opinion piece.
(That said, maybe I just suck and that’s what a person who sucks would say.)
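[For readers who haven’t met the Brier score Tom mentions: it’s simply the mean squared difference between your stated probabilities and what actually happened, with lower being better. A minimal sketch in Python, with made-up forecasts for illustration:]

```python
# Brier score: mean squared difference between forecast probabilities
# and outcomes (1 if the event happened, 0 if not). Lower is better:
# a perfect forecaster scores 0, and always saying 50% scores 0.25,
# so anything below 0.25 beats pure hedging.

def brier_score(forecasts, outcomes):
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# e.g. three forecasts of 90%, 60% and 20%, where the first two events
# happened and the third didn't:
print(brier_score([0.9, 0.6, 0.2], [1, 1, 0]))  # ≈ 0.07
```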
If you haven’t spent time on calibration training, I recommend it! Open Phil has a tool here: https://www.openphilanthropy.org/blog/new-web-app-calibration-training. Making good forecasts is a mix of ‘understand the topic you’re making a prediction about’ and ‘understand yourself well enough to interpret your own feelings of confidence’. Even if they mostly don’t have expertise in the topic they’re writing about, I think most people can become pretty well-calibrated with an hour or two of practice.
And that’s a valuable service in its own right, I think. It would be a major gift to the public even if the only take-away readers got from predictions at the end of articles were ‘wow, even though these articles sound confident, the claims almost always tend to be 50% or 60% probable according to the reporter; guess I should keep in mind these topics are complex and these articles are being banged out in a few hours rather than being the product of months of study, so of course things are going to end up being pretty uncertain’.
If you also know enough about a topic to make a calibrated 80% or 90% (or 99%!) prediction about it, that’s great. But one of the nice things about probabilities is just that they clarify what you’re saying—they can function like an epistemic status disclaimer that notes how uncertain you really are, even if it was hard to make your prose flow without sounding kinda confident in the midst of the article. Making probabilistic predictions doesn’t have to be framed as ‘here’s me using my amazing knowledge of the world to predict the future’; it can just be framed as an attempt to disambiguate what you were saying in the article.
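[As an aside for readers: “well-calibrated” has a concrete meaning you can check against your own track record, by grouping past predictions by stated confidence and seeing how often each group came true. A rough sketch, with illustrative made-up data:]

```python
from collections import defaultdict

def calibration_table(forecasts, outcomes, bucket_width=0.1):
    """Group predictions by stated probability and report the observed
    frequency in each bucket. A well-calibrated forecaster's 70% bucket
    should come true roughly 70% of the time."""
    buckets = defaultdict(list)
    for p, o in zip(forecasts, outcomes):
        # snap each stated probability to the nearest bucket
        buckets[round(p / bucket_width) * bucket_width].append(o)
    return {round(b, 2): sum(os) / len(os) for b, os in sorted(buckets.items())}

# e.g. ten 60% predictions of which 6 came true, and five 90%
# predictions of which only 4 came true:
preds = [0.6] * 10 + [0.9] * 5
hits  = [1] * 6 + [0] * 4 + [1] * 4 + [0]
print(calibration_table(preds, hits))  # {0.6: 0.6, 0.9: 0.8}
```

Here the 60% bucket is spot on, while the 90% bucket came true only 80% of the time, i.e. slight overconfidence at the high end.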
Relatedly, in my experience ‘writing an article or blog post’ can have bad effects on my ability to reason about stuff. I want to say things that are relevant and congruent and that flow together nicely; but my actual thought process includes a bunch of zig-zagging and updating and sorting-through-thoughts-that-don’t-initially-make-perfect-crisp-sense. So focusing on the writing makes me focus less on my thought process, and it becomes tempting for me to confuse the writing process or written artifact for my thought process or beliefs.
You’ve spent a lot of time living and breathing EA/rationalist stuff, so I don’t know that I have any advice that will be useful to you. But if I were giving advice to a random reporter, I’d warn about the above phenomenon and say that this can lead to overconfidence when someone’s just getting started adding probabilistic forecasts to their blogging.
I think this calibration-and-reflection bug is important—it’s a bug in your ability to recognize what you believe, not just in your ability to communicate it—and I think it’s fixable with some practice, without having to do the superforecaster ‘sink lots of hours into getting expertise about every topic you predict’ thing.
(And I don’t know, maybe the journey to fixing this could be an interesting one that generates an article of its own? Maybe a thing that could be linked to at the bottom of posts to give context for readers who are confused about why the numbers are there and why they’re so low-confidence?)
all this makes a lot of sense, by the way, and I will take it on board.
I agree with these comments, and think the first one—“If you haven’t spent time on calibration training...”—makes especially useful points.
Readers of this thread may also be interested in a previous post of mine on Potential downsides of using explicit probabilities. (Though be warned that the post is less concise and well-structured than I’d aim for nowadays.) I ultimately conclude that post by saying:
(That quote and post are obviously somewhat tangential to this thread, but also somewhat relevant. I lightly edited the quote to make it make more sense out of context.)
I will look at that OpenPhil thing! I did do a calibration exercise with GJP (and was, to my surprise, both quite good and underconfident!) but I’d love to have another go.
Meaning you think there is a 95% chance that within five years, it won’t be the case that The New York Times, The Atlantic, and The Washington Post will include a quantitative, testable forecast in at least one fifth of their collective articles?
...Just kidding. Thanks for the well-written and illuminating answer.
hahaha!
(Just want to mention that a recent Scott Alexander post contains an interesting discussion of the topic of forecasting by journalists/pundits, and so may be of interest to readers of this thread.)
Out of curiosity:
What made you change your book’s title from The AI Does Not Hate You to The Rationalist’s Guide to the Galaxy?
What were the positive and negative effects or implications of changing your book’s title after publication? (This is one of the first times I’ve encountered a book’s title being changed after publication, so I’m just curious to hear about it.)
No big story! We changed editors between the hardback and the paperback, and the new editor decided that the book wasn’t really centrally about AI. She felt that it was more about the rationalist community, so she wanted to change the title to reflect that. I know sod-all about the publishing industry and I trust her judgment better than mine, so I said “fine”.
If I’m starkly honest with myself I think it’s probably because the sales weren’t great and she thought this would help, but I don’t know.
The positive impacts I guess will be if it sells more copies, but one negative impact is that a few people thought I’d written a new book. I hope no one accidentally buys a second copy but I imagine it might happen.
(It’s not that unusual, I don’t think. At least two of my journalist friends have books with different paperback titles to the original hardback titles.)
*loads* of people saw the title and thought “oh, this is a book about how AI is Good, Actually”. For anyone who doesn’t know, the full quote is Eliezer’s: “The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else.” I much preferred the old title but I guess I shouldn’t be surprised people didn’t get it!
What would your advice be for talking to the media about EA? (And when to figure out whether to do it at all!)
How would you frame the message of EA to go down well with a large audience? (Eg, in an article in a major newspaper). How would this change with the demographics/political bias of that audience? Do you think it’s possible to convey longtermist ideas in such a setting?
Being ahead of the curve on COVID-19/pandemics seems like a major win for EA, but it has also been a major global tragedy. How do you think we can best talk about COVID when selling EA, that is both tactful and reflects well on EA?
Honestly, I think this is all about finding journalists who you trust. The Vox lot, for instance, you know they’re not going to write a “ha ha look at the weirdos who want you to donate money to stop humans going extinct” piece, even if they don’t agree with the specific position you’re supporting.
“How to frame it” comes down to the same thing. In the end, assuming that you give interviews to journalists, the journalists will be the ones framing it, so choose journalists you trust and feel you can speak freely to. I know that’s kind of unhelpful advice—“be good at choosing people you trust”—but it’s really important.
That said, there are some general tips if you’re, say, writing research papers (and press releases for those research papers). One of them is including a nice clear list at the top of things that your paper doesn’t say: if your paper finds a correlation between doing crosswords and brain health, for instance, it’s worth saying “this does not mean that crosswords prevent Alzheimer’s” at the top. That’s been shown (Chris Chambers at Cardiff did some great research) to avoid misunderstandings without reducing press coverage. So I guess that could be relevant. With longtermist ideas, say, you could say prominently “this doesn’t mean we need to dedicate all of our charity resources to preventing X-risk” or whatever. (Assuming you do think that.)
My only advice for talking to different demographics/political biases is to signal that you are on the side of the reader. If you’re trying to convince a right-wing person of a stereotypically left-wing point, or vice versa, it’s worth starting out with some “I come in peace” stuff about how yes, the things right-wing people worry about are OK to worry about, etc. (Scott Alexander is really good at this.) That’s obviously more relevant if you’re writing the piece yourself, but I guess if you’re talking to journalists, you can aim to do something similar.
Is any of that helpful?
What do you think people in journalism think of EA, if anything? Are there any key errors that EA is currently making or would be likely to make in the future when dealing with journalists?
I think there are a lot of journalists who think EA is very wise and sensible, and there are a lot of journalists who think it’s all neckbeard rationalist techbros, or whatever dismissive term they might use. I think the first journalists are right and the second journalists are wrong.
(TBF I think the largest group of journalists is probably the ones who’ve never heard of EA, and don’t write about things that are anything to do with EA.)
Key errors: ah man. I don’t really want to advise after I got it so badly wrong with the Scott Alexander/NYT stuff, and also I don’t feel I know the EA community well enough to say what they do now. But I do think they could do with finding a few more media-savvy, personable spokespeople who can help you get your stuff into the media when you want it there. I’m always surprised that, say, 80,000 Hours doesn’t act more like a think tank, trying to get journalists reading their latest work. But maybe if EAs start playing the game like think tanks do, they’ll end up drifting away from their purpose and start chasing headlines and so on, and that would be a shame.
And by “reading their work” you mean emailing and tweeting articles at journalists and asking what they think? Ie, what would 80,000 Hours acting like a think tank look like?
Though I’m curious as to what would happen here. I would probably tweet 80k stuff in different circumstances, eg if it were written for experts rather than amateurs. There is something very successful about engaged individuals taking and sharing 80k’s stuff.
What are the best changes (in terms of tractability and importance) that you think could take place in the journalism industry in the next 20 years, and how can people help make them happen?
What’s the biggest bottleneck on the positive impact of your work?
How did you make the choice to go freelance?
I’ll answer the last question first, because it’s easiest: BuzzFeed UK had a load of cuts in late 2017, they offered voluntary redundancy to anyone who wanted it, and I took it because I had a book to write (and because I may well have ended up losing the job anyway). After that it turned out to be pretty straightforward to get people to give me money to write things and I’m really glad I did it, but it was a half-jump half-push situation.
Biggest bottleneck: sheesh. Honestly it’s probably coming up with interesting, worthwhile ideas fast enough. But then it’s also having the time to write about them all, so there’s a tension there. I feel like I’m always desperately trying to think of the next piece, while also somehow not having time to do the longer-term projects. So, and this sounds like I’m being flippant, but I think honestly the biggest bottleneck is my own tendency to procrastinate.
HANG ON. I’m going to leave that paragraph up because I think it’s interesting, but I’ve just realised it doesn’t quite answer your question. The biggest bottleneck on IMPACT is probably readership. I can get a few tens of thousands to read a piece; a few hundreds of thousands if it goes really, really well. Assuming that piece is something I think is really important and I want to change some minds with it, what percentage of people actually will change minds? Is it double-digits? Is it even greater than one? I don’t know. So I suppose that’s the most important thing. How do I write something that is 1) about something super important 2) persuasive enough to change minds and 3) exciting enough to be widely read? There are trade-offs there.
The best changes: Hmmmmmmmmm. I am sorry, I really don’t know. I think the lack of numeracy among journalists is a real problem, and that’s why I’ve just written a book about how numbers go wrong in the news. I don’t know if it’s the most important thing, but I think it would make a real difference to public understanding if journalists started thinking about putting numbers in context, absolute vs relative risk, linking to original sources, etc etc. Just being more comfortable with and transparent about the numbers they use in reporting.
From your Twitter, it appears that you think a lot about covid-19. So why is the UK response to covid-19 so bad*?
Sometimes my American friends will blame US covid failures on US-specific factors (eg, our FDA, presidential system, Trump). But of course the UK is a (by international standards) culturally similar entity that does not share many of those factors, and still appears to have outcomes at least as bad if not worse. So why?
*I admit this is a bit of a leading question. My stance is something like: with the major asterisk of vaccinations, it appears that UK covid outcomes are quite bad by international standards. Moreover, we can trace certain gov’t actions (eg “eat out to help out”) as clearly bad in a way that was ex ante predictable. But feel free to instead respond “actually you’re wrong and the UK response isn’t so bad due to XYZ contextual factors!” :)
The UK’s outcomes have certainly been bad! Absolutely no argument there.
Before I go into why I think it was, I will say that I suspect there’s quite a lot of randomness in these outcomes, and it’s not straightforward to say it’s because of political responses or whatever.
But that said I think I am pretty comfortable saying that the political response has, in fact, been bad. I think the failure to lock down early not once but twice (maybe three times?), the eat-out-to-help-out nonsense, and the CONTINUED insistence on “washing hands” with so little emphasis on meeting outside, ventilating areas, etc, are really bad and stupid. I think a lot of it stems from bad political leadership (a cabinet chosen mainly for loyalty over Brexit rather than talent, a PM who never wants to deliver bad news and always wants to say “I’ve saved Christmas”). And it also probably stems from years of chipping away at state capacity, having a really centralised state with a paradoxically weak centre, etc.
I am, however, quite low-confidence in all this stuff. I think that this set of politicians has been bad, and a randomly selected cabinet from the last 50 years would on average do better, but I don’t know.
Right guys I’m going to have to stop for now because I need to go and help with my kids’ dinner. I’ll try to answer some more tomorrow. I’m really sorry I didn’t do this yesterday: it totally slipped my mind among all the various things I’m doing at the moment (I have a new book out today!). Sorry if I haven’t got to you yet
Thank you for your contributions! It was very kind of you to take time out of your schedule to chat with us, especially during a book-release period :-)
If you could snap your fingers and change some things about journalistic norms, what kinds of articles tend to get written, how articles and news sites are structured, etc., what would you change?
And: Are there common proposals for reforming or improving journalism that you don’t think are good ideas?
Since writing your previous book, have you continued to follow EA or rationalist content (eg this Forum or LessWrong)? If so, what do you find the most helpful?
yes! Mainly Scott, of course, but also various other rationalist and rationalist-adjacent blogs; Everything Studies, Put A Number On It, the 80,000 Hours podcast, Julia Galef’s podcast. I find LessWrong itself a bit hard to navigate but if an interesting piece comes around I’ll read it. I even try to read Gwern sometimes but usually I’m out of my depth.
How nervous should we be about talking about/recommending action on AI risk?
I think a lot of people in the EA community worry that AI risk is “weird”, sufficiently weird that you should probably be careful talking about it to a broad audience or recommending what they donate to. Many would fear alienating people or damaging credibility. (Especially when “AI risk” refers to the existential risks from AI, as opposed to, e.g., how algorithms could cause inadvertent bias/prejudice)
A thought experiment to make this more concrete: imagine you were organising a big sponsored event where lots of people would see 3 recommended charities. Would you recommend that (say) MIRI would be one of the recommended charities?
this is a complex question. But I think I agree with whoever it was (Eliezer?) who said that there are weirdness points: you are allowed to be only so weird before people stop taking you seriously. You can decide to spend those weirdness points how you like, but once you spend them, they’re gone. AI risk is obviously a lot more expensive in weirdness points than, say, deworming. So you’ll be able to talk about it less before people start thinking of you as the weird AI-obsessed guy.
I do think, though, that you can still do it, if you can explain that you’re using the same processes—expected value etc—to reach the conclusions in AI as you did with more prosaic things like bednets or deworming. That’s sort of what I did here. And if you try to pretend that AI/X-risk isn’t part of what you’re worrying about, then it looks like you’re doing a Scientology and hiding the weird stuff behind a friendly facade.
All that being said, in your concrete example, I wouldn’t include MIRI unless you’re really sure that that is where you want to go. I speak as someone who really likes MIRI! But if it’s a “this is your first taste of effective altruism” deal, then you’re already asking people to take on board the idea that, actually, donating to Cancer Research UK is severely suboptimal and you should give it all to very specific infectious-disease charities in sub-Saharan Africa or whatever. That’s weird and counterintuitive enough already, and I think taking people along that route one step at a time is probably wisest.
The most often cited post on this is Peter Hurford’s You have a set amount of “weirdness points”. Spend them wisely. But the concept/term isn’t original to Peter, given that:
Peter opens the post by writing “I’ve heard of the concept of “weirdness points” many times before, but after a bit of searching I can’t find a definitive post describing the concept, so I’ve decided to make one”
A commenter on the post notes that “The idiom used to describe that concept in social psychology is “idiosyncrasy credits”, so searching for that phrase produces more relevant material (though as far as I can tell nothing on Less Wrong specifically).”
What are two problems/bottlenecks you wish EAs spent more time thinking/working on?
This is going to sound like I’m blowing smoke, but: you guys know better than me! I get really interested in various topics, but I’m not systematic about it. I try not to get blown around by whatever the current news fascination is, but I’m not immune to being swept along by whatever media Twitter cares about that day. The value of EA is that you run the numbers and continue to care about it, so that even though British media Twitter has today decided that X is the most important thing in the world, EAs will continue to say “no actually you can probably do the most good by donating money to the Against Malaria Foundation”.
Would you consider/label yourself as a rationalist? Why or why not?
Would you consider/label yourself as an effective altruist? Why or why not?
No, but I’d say I’m rationalist-adjacent. I find the topics interesting and I love the community, but I’m not really part of it. I imagine there being circles of rationalism, with the outermost circle being people who read I Can Tolerate Anything Except the Outgroup once and quite enjoyed it, and the innermost circle being literally a room with Eliezer, Luke Muehlhauser, Gwern and Scott Alexander playing Dungeons & Discourse. I’m in a medium-sized circle in between those two.
No, partly for similar reasons, and partly because I don’t think I deserve it because I don’t donate enough to charity. I have a small monthly direct debit going to GiveWell and it should be more but it isn’t. In fact I will go and double it after I’ve said bye here because it really should be more.
Thanks for doing this—I’m a big fan of your book!
I’m interested to hear what you think this post about how media works gets right and gets wrong. In particular: (1) … (2) … (3) … (4) …
So, this is a massive and hard-to-summarise question.
You’re right, a lot of what goes wrong in the media isn’t lies. It’s not even deliberate misinformation. The Chinese Robber fallacy is spot on, but it’s also: if you, say, hear that 180 women are murdered in the UK each year, that sounds dreadful (obviously it is dreadful). And if you’re not used to asking questions like “is that a big number” or “what’s the base rate” then you can easily be misled by large-sounding numbers.
A lot of it is just being unfamiliar with numbers (book plug alert! Out today! In the UK at least). Most journalists aren’t very good with numbers, or with thinking about how they reach the news. Sure, there are plenty of grifters and boring contrarians, but there are also a lot of well-intentioned people who want to do good in the world but aren’t brilliant at thinking “wait, that SOUNDS important, but how would I go about checking whether it is or not?”
(Also there are quite a lot of people who are quite unreflective and just think that fighting the culture war is the most important thing you can do, although they wouldn’t necessarily say it that way.)
Does that sort of answer your question?
Yes, thanks! Some follow-ups:
1. To what extent do some journalists use the Chinese Robber Fallacy deliberately—they know that they have a wide range of even-worse, even-bigger tragedies and scandals to report on, but they choose to report on the ones that let them push their overall ideology or political agenda? (And they choose not to report on the ones that seem to undermine or distract from their ideology/agenda)
2. Do you agree with the “The parity inverse of a meme is the same meme at a different point of its life cycle” idea? In other words, do you agree with the “Toxoplasma of Rage” thesis?
I certainly agree with the Toxoplasma thesis, or I should say it sounds very plausible to me. I don’t think it’s unique to journalism at all—I remember in my MA reading about Israel and Palestine, a book called Through Different Eyes I think, and it fitted a very similar mechanism. Each side would highlight the other side’s “atrocities” as justification for their own retaliation, which would then become “atrocities” which the other side would use as justification for their retaliation, etc. Same thing here: some, I dunno, gender-critical feminist tweets something angry in response to some trans-rights activist; that tweet is then held up to show how awful the gender-critical types are and excuses a bunch of horrible comments; round and round we go.
Re 1), I think that’s rarer than you think. But as rationalist-adjacent types you’ll know that it doesn’t have to be deliberate. We’re extremely good at only noticing the data that is convenient, and fooling ourselves in the service of fooling others. I’m sure there are some cynics and grifters, but they’re nowhere near as common as people honestly saying what they believe. Debate is war, arguments are soldiers, etc, and you have to kill the other soldiers, but it’s not usually a conscious thing to think “I know that is true but I have to pretend it’s not,” it’s more “That is an enemy soldier, therefore it is bad, therefore I must destroy it.”
How could those reading this better support you? Is there anyone you’d like to be in touch with?
Oh I never replied to this one. BUY MY BOOK (either one, but especially the new one: https://www.howtoreadnumbers.com/ ). Or, I dunno, read my stuff and be available to chat if I need to speak to someone about some topic.
If you were given several million dollars to run a news organisation with positive impact, how do you think that would compare to GiveWell’s top charities? Do you think there is a way for funding a news org to be an impactful donation?
I think this is a really interesting question, but I don’t know enough about the business of journalism to have good answers. Somewhere else in the Qs I’ve talked a bit about the impact of journalism and why it’s important, which is probably relevant.
[meta comment]
I think it’s better to put every question as a separate comment since that allows us to up or downvote them individually. Sorry to be that guy.
The majority of the public (I’d guess) still gets all their news from a couple of sources. None of the major news orgs have good fact-checking and most have blind spots. Do you have any thoughts on how a news organisation could provide an accurate view of the world whilst gaining a large readership?
Short answer: no.
Long answer: I think what is underappreciated about journalism is the time pressure. If you’re writing for a daily paper, or some equivalent, you often have to become passably expert in some topic in a few hours. It is a miracle that, say, the Times puts together enough material for a medium-sized novel every 24 hours. Some of it has longer lead times: for the mags and features you sometimes have a week or so, and at the real glossy mags like the Atlantic and the Economist you might have weeks or more to research. But it is absolutely incredible to witness a paper go from “literally nothing exists” at 11am to “here is a full paper, with relatively few typos and hopefully no libels: tens of thousands of words of news, analysis, criticism, a topical cartoon, sports reports, the weather” by 9pm. There just isn’t time to do a full fact-check. The sheer just-in-time nature of it is incredible. Same with TV and radio: everything is spinning like a gyroscope, seconds away from going wrong.
(The Atlantic and the Economist and the New Yorker, etc, it’s totally different, everyone sits around in oak-panelled rooms thinking deep thoughts and writing one piece a month which is then picked to the bone by fact-checkers like vultures on a dead buffalo. I exaggerate somewhat.)
And then there are the incentive issues: journalists definitely think of themselves as performing a public service, and we are, but we’re also contributing to a business, and that business needs to sell papers or get clicks or whatever, so “1,000 people died of malaria today” can’t be the splash headline every day. It just can’t. Journalism is in the public interest, but it ALSO needs to provide what the public is interested in, and if it doesn’t do the latter then it can’t be the former. And there’s a coordination problem: if I say “I will do only high-minded journalism that is in the public interest”, the next guy can say “great, I’ll do scurrilous celebrity hackery, sell 40 times what you do and put you out of business”.
AND THEN you have to remember it’s staffed by humans with biases and political opinions and families to feed and social status to uphold.
So with those limitations in mind, I think journalism does a pretty good job of giving readers/viewers/listeners a rough picture of reality. But they’re really big limitations and it’s unrealistic to expect anything approaching a clear, unbiased window on the world.
If you could re-write The Rationalist’s Guide to the Galaxy today (and you weren’t too worried about making the book too long), what are the ~three things you’d add that aren’t covered there?
Hmm. I don’t know, I’m afraid. I took a chapter on Occam’s Razor out which I rather liked. But I’m quite proud of the book and I think it works fairly well as an intro to rationalist thinking. I’m sure there are some things I could think of and if I do I’ll come back and add them.
How should one go about learning to write high-quality material? And what is the best way to get it published?
I wish I had a better answer to the first one than “become good at writing”. My own pathway was reading loads and loads, and writing loads and loads, and then essentially mimicking the writing that I liked (mainly Pratchett tbh) until eventually I noticed that I’d stopped doing that and had a recognisable style of my own. I sometimes go through my old emails from before I was a journalist and see I’ve just written needlessly long show-offy emails to friends, which I cringe about a bit now, but they were clearly practice for when I had to do it for real.
Actually, also, I did philosophy at uni and MA, and I found that the way I learnt to structure an argument in those essays has been really helpful.
Oh, and this might sound silly, but become good at typing. If you can type as fast as you think, then when the ideas are flowing quickly they just sort of appear on the page. I worked as a medical secretary for a long time and I swear that helped me an awful lot, not least in transcribing interviews but also just in being able to get ideas down quickly.
As for getting it published: pitch! Ideally start by developing a relationship with some editor somewhere. It might be a good idea to blog as well, so that you can point people to stuff you’ve written.
OK I think that’s all the questions! Thanks for that and sorry again for the delay. I hope it’s been interesting! Best, Tom
Thanks for doing this AMA!
What do you believe that seems important and that you think most effective altruists / rationalists would disagree with you about?
What do you believe that seems important and that you think most people in journalism/the media would disagree with you about?
What do you think effective altruists / rationalists are most often, most significantly, or most annoyingly wrong about?
What’s an important way your own views/beliefs have changed recently?
(I’m perhaps most interested in your independent impression—i.e., what you’d believe before updating on the fact that other reasonable people believe other things.)
As I’ve said elsewhere in this AMA, I suspect that journalism, for all its many flaws, actually is really important, and that democratic societies would be much harder to organise without a free press. I suspect that journalism isn’t hugely popular among rationalists (and since I tend to think of EAs as being the same people, I assume it’s not hugely popular with them either), but I think it’s really important.
I think there’s a huge miscalibration about what’s “actually” important in the media – there’s a huge focus on problems of the Western world, and especially US problems (which then get exported to the UK and other places; eg I think problems with race in the UK are very different to the ones in the US, but we see everything through a US lens). A murder in the UK (or a school shooting in the US) is objectively less important than 1,000 people dying every day of malaria, but it gets many times the coverage. But it’s hard to say “actually this doesn’t matter” when it’s some named person dead in a horrible way, and compare it to thousands of real but faceless individuals dying off-camera.
Funnily enough I think Covid has shifted this a bit because it’s a genuinely global and important story about infectious diseases and vaccinations. But I’d love to see more focus in the mainstream media on diseases of poverty in the developing world, things like that. That said—as I’ve said elsewhere, you can’t run a media industry on the things that you OUGHT to write about, you have to give readers what they want as well.
Since this is about journalism, I’ll stick to that example: I think EAs/rationalists tend to assume journalists are out to destroy people. It’s not that that’s wrong exactly, but it’s incomplete. For instance, when I’ve written about quack charities – autism charities telling people not to get vaccinated, say, or to have awful heavy-metal chelation therapy or whatever – it is 100% my intention to damage those charities’ ability to function, or get their CEOs to resign, because I believe that they are damaging children, and that revealing their bad practices is good for society. I think most people here would agree with me. But other journalists are doing the same thing—they’ve just sometimes made different judgments about what “good for society” is, and I very often disagree with those judgments. But they’re not out to destroy for the fun of it; they’re usually trying to do good. (That said, there is a tendency to measure journalistic impact in how many people you’ve forced to resign, which is understandable but kind of icky.)
Oh dammit I forgot 4). Hmm. This is such a big and important question and I should have some ready answer for it.
I suppose my most general answer, and it’s not all that recent, is that I’ve become MUCH less trusting of the scientific literature in loads of fields, especially social sciences, because I’ve become much more aware of the statistical problems. But that’s a bit of a dodge, isn’t it.
And relatedly, I think in Covid times I’ve become less happy with the public-health-institutions model of scientific/health evidence, the one where “there isn’t an RCT supporting it” equals “it doesn’t work”. I’ve become much more of a Bayesian, or at least I try to think in terms of probabilities and best guesses and cost-benefit analyses rather than “this works” and “this has not been rigorously shown to work, ergo we will say it doesn’t work”. I was already on that route, I think, but it’s become very obvious in the last year.
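That probabilities-and-cost-benefit framing can be sketched as a toy expected-value calculation. All the numbers below are hypothetical, purely for illustration; the point is the structure of the reasoning, not any specific figures:

```python
# Toy decision sketch: instead of a binary "proven / not proven",
# weigh a subjective probability that an intervention works
# against its cost. All numbers are made up for illustration.
p_works = 0.6             # best-guess probability the intervention helps
benefit_if_works = 100.0  # payoff if it does (arbitrary units)
cost = 20.0               # cost of doing it either way

expected_value = p_works * benefit_if_works - cost
print(expected_value)  # 40.0 -> positive, so worth doing on this guess
```

Even with no RCT, a positive expected value under honestly stated assumptions can justify acting, which is the shift away from “not rigorously shown, therefore treat as false” that the answer describes.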
Thanks for your answers! (Both here and elsewhere in this AMA.)
Not sure if you’ll come back to answer followups, but on the off chance you might:
Regarding your answer to 1, what do you think the implications of that should be for rationalist/EA beliefs or behaviours? E.g., is it that you think we should ourselves spend more time reading journalists’ writing? That more of us should go into journalism ourselves? That we should donate more to support high-quality journalism? That we should just probably be doing something more in relation to journalism, so we should do some research or thinking to figure out what that is? Something else?
And regarding your answer to 3, it sounds like you’re actually saying that journalists quite often are out to destroy people, and that the mistake is just to assume that this is for the fun of it rather than because the journalist sincerely thinks it’s right to destroy those people?
Do you have an explicit theory of change for your work as a whole, or for specific projects/books/articles?
no I don’t, and I was unaware of the idea. I am very unsystematic in my choices of what I write about and I don’t have any good way of measuring how many minds I change (just the hopelessly survivorship-bias-tainted one of people telling me that they’ve changed their minds). So it’s all very finger-in-the-wind.
Thanks for the work you do; I consider you a top-tier journalist.
Thanks particularly for making testable forecasts in your articles.
Thanks for doing this also. I know it has benefits to you, but coming to a forum for an AMA is a cool practice.
thanks Nathan!
How do stats get misrepresented in the news? What can you do to spot it when they are? :)
FUNNY YOU SHOULD ASK! https://www.howtoreadnumbers.com/
(thank you for this obvious setup)
What’s UnHerd? Is there anything unusual about it, or should I approximately treat it as “typical news outlet that happens to host your content”?
It’s not a news site; it’s comment and analysis (I never know how much the distinction between reporting and commentary is clear to non-journalists). Essentially, UnHerd rarely breaks scoops, although I have done once or twice; it’s analysis of things that are already public knowledge, essays, opinion. Like a blog site in a way.
But it’s – I guess it thinks of itself as heterodox? In the Haidt sense. Saying the, ah, “unheard” things (also “not part of the herd”, it was a bit of a laboured pun). So it’s generally politically unaligned but has a fondness for the stuff that it is hard to say in The Current Moment, which often but not always cashes out as criticism of the social justice movement. I think that’s a fair description.
What books or articles (or movies, podcast episodes, etc.) would you most recommend to a random thinker (or, say, a random science journalist) if they wanted to level up their rationality and/or their altruistic efficacy?
there’s this great book, The Rationalist’s Guide to – never mind.
I genuinely think that the Sequences are still great for that! And Scott’s Nobody is Perfect, Everything is Commensurable. I love 80,000 Hours’ podcast and Rationally Speaking. But if someone’s in the market, I’d start with Rationality: From AI to Zombies.
Right! Sorry. I got a bit swamped but am going to have a go at answering questions now! Sorry about the delay.
What are your thoughts on solutions journalism? Does it have much traction among science writers you know? Do you personally use it or promote it as a framework for writing?
Do you think this is a good/bad idea?:
I have the hunch that EA and solutions journalism could be a good match. E.g. EAs in journalism could join the solutions journalism network and seek solutions journalism angles to their editors. EA projects that think they would be well-served by public media coverage could seek to build relationships with strong solutions journalists and make themselves available for stories when they have something going on that the journalists are interested in. I’m not a journalist myself, and think the SJN approach is still small, so I’m curious if you see this area growing.
Approaching journalists:
How can we reach them?
What is the best way to pitch an idea to a journalist?
Do press releases still work? (Is it worth sending them to the contact emails, since it’s hard often to get direct contact)
Maybe Twitter is a good idea?
For me, at least, Twitter is the way to get hold of me—my DMs are open, and most journalists’ are. But emails are good too and most journalists will make them publicly available.
Send us an email or a DM! But first make sure that the journalist in question is interested in the sort of thing you’re pitching. I keep marking PR emails as spam, because they’re obviously just auto-sending to some list, and I don’t give a toss if some tech company is having a roundtable meeting about some acquisition or whatever. If, however, I get a personalised email from someone who obviously knows my work and has thought that whatever they’re emailing might actually be of interest, I’ll always at least reply, even if I don’t use the info in a piece.
Yes, but don’t spam them out—see 2. Get the journalist’s attention and ask if he or she would be interested. (Although: it’s just occurred to me that this is like the breeding habits of elephants vs the breeding habits of frogs. Either spend years raising the baby and getting high rates of survival but low rates of actual birth, or fire out tens of thousands of eggs and fertilise them all at once and hope that one or two survive. Maybe the latter works and I am giving you a survivorship-bias-tainted account, because I, obviously, ignore the large majority of them, as they presumably expect.)
Indeed it is: see 1.
What do you see as the consequentialist value of doing journalism? What are the ways in which journalists can improve the world? And do you believe these potential improvements are measurable?
I think it would be very hard to have a functioning democratic state without journalism of some kind. I may be overstating my industry’s importance, but if you don’t know what the government is doing, or how the machinery of the state is operating, or what the lives of the citizens are like, then how can you participate in your democracy? And you can’t rely on the government to tell you. So even though most journalism is not vital to democracy, if there was no journalism, there would be very little to stop the government from behaving how it liked.
I also think that in my field of journalism, science writing, there’s a lot of value in translating abstruse-but-important research and thinking into readable or even enjoyable material for laypeople. Also, you can convince people of things that are true, and help people make good decisions (I hope).
Plus, things like criticism are helpful for readers in allowing them to find the books/films/theatre they might enjoy (and I think they have a value even given the existence of crowdsourced review sites like Rotten Tomatoes). And, of course, people enjoy reading/watching, and that is a good in itself.
For someone like me who has no other skills than interviewing clever people and writing down what they say, I suspect that journalism is one of the places I could do the most good, because I can do it well. Of course, all the good outcomes I just mentioned are reliant on the journalist in question being good at their job, but that’s probably true of all careers, isn’t it?
On measuring it: re democracy at least, I guess you could try to do some sort of study looking at countries with strong independent journalism vs those without, but they would be so horribly confounded I doubt you could get good numbers on it.