Research advice from Carl Shulman


This is a crosspost for Research advice from Carl Shulman, which was written by Claire Zabel in 2016. I really like the sections “Thinking habits”, “Research project process”, and “Mindset”. Relatedly, I recommend Michael Aird’s post about how to do high-quality efficient research, which is where I first found Claire’s post.

[Disclaimer: off-the-cuff, unverified, personal habits. No warranties that these are beneficial.]

Setting & setup

  • Big clean workspace.

  • Multiple very large monitors.

  • Gaming mouse and keyboard with a giant mousepad (these turned out to be of surprisingly good quality for me).

  • An electric kettle and boxes of tea on my desk for uninterrupted access.

Tools

  • Citation manager software is pretty useful.

  • Workflowy.

  • Beeminder.

Good habits

Thinking habits

  • Frequently imagine what someone you respect who thought you were wrong would say, and try to make the best argument against what you are currently thinking.

  • Pay attention to the use of contradictory epistemic standards and premises across different arguments, and learn to pattern-recognize them. Reconcile them or adjust your confidence in them.

  • I have gotten a fair amount out of trying to understand the epistemology and basic worldview of multiple movements. I like ideological Turing tests, mainly because each movement will highlight the best evidence for the things it likes, and against the things it hates, which helps in identifying areas to look into.

  • Make a habit of checking factual claims that you hear with a short Google search, and/​or Wikipedia.

  • Use basic arithmetic and statistics frequently and briefly as part of thinking rather than as occasional or separate exercises.

  • Try to convert qualitative claims into quantitative Fermi estimates whenever possible. Fermi calculations with quick Wikipedia/​Google/​Wolfram/​standard sources go far, and one can get into the habit and see how they typically evolve with different levels of depth.

    • E.g. I was recently talking to people about OpenPhil animal stuff, and this habit led to me sharing global animal populations rather than just U.S. ones; the global figures were higher than people had expected. Or take peak phosphorus: just checking the distribution of phosphorus densities in rock, and the share of phosphorus in agricultural costs, puts that into context. See this comment.

    • In my experience people almost never do enough Fermis with look-ups from Wikipedia/government sources. And especially they have too high a barrier to doing it and won’t do it in casual conversation or exploration.

    • Sometimes it’s because talking is not about information and other things are shinier. Some people are afraid they’ll be wrong, and don’t trust their ability to do it (and don’t test that ability). Sometimes it’s a lack of affordance/habit, or not knowing about the bigger standard resources or the basic toolkit of Fermi inputs: prices, populations, sales. Sometimes it’s issues with arithmetic and basic statistics fluency (or aversion to them). Sometimes it’s because the estimate gives an unwelcome answer.

    • Recently I encountered a talk from an academic and environmental activist about existential risk who rejected the use of numbers (e.g. the IPCC report, which estimated too little climate-change risk for his taste) and probabilities in evaluating the importance of risks, so that one could use intuition (to produce risk rankings and policy recommendations matching the current affiliations of his movement, regardless of evidence).

    • Worked examples with numbers and realistic figures erode plausible deniability and attractive lies, and force explicit claims, use of evidence, and argument.
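The kind of quick Fermi described above can be sketched in a few lines. This is a minimal illustration of scaling a known national figure to a rough global one, in the spirit of the animal-populations example; every input here is a made-up round placeholder, not authoritative data.

```python
# Fermi-estimate sketch: scale a rough U.S. figure to a rough global figure.
# All numbers are round placeholders for illustration, not looked-up data.

us_population = 330e6        # ~330 million people in the U.S.
world_population = 8e9       # ~8 billion people worldwide

# Placeholder assumption: the rest of the world's per-capita rate is some
# fraction of the U.S. rate. This is exactly the kind of input you'd sanity-
# check with a quick Wikipedia/Google look-up.
rest_of_world_relative_rate = 0.5

us_figure = 9e9              # placeholder U.S. annual figure

per_capita_us = us_figure / us_population
global_estimate = us_figure + (
    (world_population - us_population) * per_capita_us * rest_of_world_relative_rate
)

print(f"U.S. per-capita rate: {per_capita_us:.1f}")
print(f"Rough global estimate: {global_estimate:.2e}")
```

The point is not the particular numbers but the habit: a three-input back-of-the-envelope like this takes under a minute and immediately shows whether the global figure is dominated by the U.S. term or the rest-of-world term.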

Research management

  • Make a bookmarks folder to note critical arguments against positions you hold, and read it, so you don’t get isolated from different views.

  • Write casual thoughts up in explicit emails or blog posts to get feedback from people.

  • Have a giant folder of hundreds of blog post drafts into which to throw interesting ideas, citations, and links. I may not have published anything about X, but I have a draft in my blogging folder where I have stored assorted information about it.

  • Write down your views and check against your old views to see when you were wrong and when you were right.

  • Offer and accept bets about observables, or else change your predictions.

  • I have read very quickly since childhood and consume large amounts of good-quality nonfiction.
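The habit of writing down views and checking them against outcomes can be made quantitative. Here is an illustrative sketch using the Brier score (mean squared error between stated probability and outcome) to score a log of past predictions; the predictions themselves are invented examples.

```python
# Illustrative sketch: score recorded probabilistic predictions with the
# Brier score. Lower is better; 0 is perfect calibration and resolution.

def brier_score(predictions):
    """predictions: list of (probability_assigned, outcome), outcome 0 or 1."""
    return sum((p - o) ** 2 for p, o in predictions) / len(predictions)

# Made-up example log of old predictions and what actually happened.
past_predictions = [
    (0.9, 1),   # said 90% likely, it happened
    (0.7, 0),   # said 70% likely, it didn't
    (0.2, 0),   # said 20% likely, it didn't
]

print(f"Brier score: {brier_score(past_predictions):.3f}")
```

A score like this is only meaningful over many predictions, but keeping even a crude log makes "check against your old views" a concrete exercise rather than a vague intention.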

Information exposure

  • Read broadly using multiple separate quality filters that are at least partially independent, to escape the systematic biases of any one of them. For example, use multiple good link-aggregation sites with different authors and biases.

  • Look up expert opinion data (favoring subject expertise, science, IQ, incentives, track records), with an emphasis on trends as one looks at more elite groups (one has to go to famous scientists to find really low belief in religion and psychic powers, so I try to predict what a better expert class would believe when there is a consistent trend).

  • Try to get datasets (Wikipedia lists, World Bank info, USDA, etc.) as a primary step in thinking about a question.

  • Use Google Scholar, and especially the ability to search for papers citing influential papers in a field.

Research project process

  • You can quickly generate ideas by explaining to a questioner (GW-style conversation notes, online conversations).

  • Google doc bullet point lists are a great way to do group meeting brainstorms (multiple simultaneous channels, with readable notes/​records), compared to taking turns speaking.

  • I also like making hopefully-comprehensive taxonomies that one can work through in full to avoid selection biases, e.g. looking through all the sectors of the economy, or all the major categories of philanthropy, or all the major academic fields or think tanks. These are tools for hypothesis generation, and hitting on low-hanging fruit knowledge sources.

  • Think about end-goals and back-chain to find things that would be relevant, rather than just using vague correlational criteria.

  • Look for the high-order bits, the biggest effects, as the first priority, then incrementally add lower-order refinements in accordance with value of information.

  • I don’t wholly endorse the framework in this post, but you could say this is something like ‘don’t be too easily satisfied with cluster thinking’.

  • A random recent example of some of the techniques above: this comment.

Mindset

  • Don’t dismiss ideas as unthinkable (rather than actions as subject to strong injunctions): things that people are afraid of thinking about (because it might make them look bad, might imply bad news, is unpopular) have an elevated chance of offering low-hanging fruit for thinking.

  • Have a strong emotional revulsion to self-delusion and sloppy reasoning/research, including from people Wrong on the Internet within communities you have some affiliation with.

  • Listen to yourself if something seems troubling, and try articulating, exploring, and steel-manning that intuition in multiple ways until it makes sense in a way that can be integrated with other knowledge (with whatever updates/​revisions follow) or goes away. Don’t just run roughshod over ‘system 1’ feelings.

  • Being comfortable with your own personality, emotions, and desires can help with being willing to do that kind of analysis, by making fewer conclusions unacceptable to you (empirical ones in particular).

  • Rigid ideological systems in a lot of tension with your real goals can be a problem here. E.g. in Mormonism or utilitarianism or social justice, various empirical conclusions combine with the ideology to recommend ruining your life, and people are strongly conditioned to avoid them. This is actually a pretty good bit on it: Leave a Line of Retreat.

  • Recognizing partial, as opposed to impartial, motives (personal projects, selfishness, family, tribalism), and not trying to rationalize everything with a 100% impartial facade, can help you think more comfortably about questions like average well-being, or the real trade-off between burnout and effort, etc.

Further reading

  • Regarding rationality: the tips in Superforecasting are pretty good. Also The Signal and the Noise. Maybe Thinking, Fast and Slow. Rationality: From AI to Zombies has a lot of good points too (although it can be long-winded).

  • For examples of Fermi calculations, my blog has many. Bryan Caplan’s empirical posts often show some of the same habits regarding Googling, seeking out data, betting, and so forth (although he doesn’t apply the same rigor everywhere). Vaclav Smil’s books do a lot of this, and are praised by Bill Gates for highlighting interesting facts, although they are not always reliable.
