Learnings about literature review strategy from research practice sessions

Earlier this year, a few other early-career researchers and I decided to try creating sessions where we could deliberately practice certain research skills in a high-feedback environment, building intuitions and refining techniques. This grew out of frustration with the slow feedback loops typical of research, the exciting prospect of improving skills that should pay off consistently through an entire career, and a lack of good resources on how to actually do research.

This document collects some of the lessons learned from our first research practice experiment, a series of five literature review practice sessions with three to six longtermist researchers each. Note, though, that many of the benefits came from improved intuitions or small details about how to do literature reviews, and were hard to boil down into concrete advice. For example, we found this to be a useful environment to explore the finer details of how to do research that do not otherwise come up easily in conversation (e.g. how many searches on Google Scholar do you do before you start reading papers?), to build intuitions on research taste (i.e. judging the trustworthiness of a paper quickly), and to better understand what we actually do when we do literature reviews (think for a moment: do you actually know what steps you take when starting a review?).

As such, I think it is likely worthwhile to try the exercise for yourself with a group of peers. The structure we used was as follows:

  • Identify an interesting question (before starting): These can be almost anything so long as the question is well-structured (e.g. try to avoid having two questions in one). It’s worth playing around with the type of question too and seeing how that affects the literature review strategy.

  • Do a quick literature review on the topic (50-60 minutes): Focus on answering the question and producing a readable output. A good framing for the output is to make something you could confidently come back to after a month of not thinking about the topic and be ready to continue the project/review.

  • Read and comment on each other’s work (15 minutes): While reading and critiquing others’ work, each person would also focus on what they could have done better given what they see others have accomplished.

  • Come together to discuss (45 minutes): This would typically touch on what the others found, how they found it, ways we got stuck, and potential methods to improve on the technique that was used. Once the review was done, we would generally not spend much time on the subject matter, trying to maintain focus on methods rather than content.
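For concreteness, the timings above can be sketched as a simple schedule builder. The phase names and durations are the ones listed above (using the 60-minute upper bound for the review); the start time and everything else is illustrative:

```python
from datetime import datetime, timedelta

# Phase durations (minutes) from the session structure above; the
# literature review phase uses the 60-minute upper bound.
PHASES = [
    ("Literature review", 60),
    ("Read and comment on each other's work", 15),
    ("Group discussion", 45),
]

def build_schedule(start):
    """Return (phase, start, end) triples for a session starting at `start`."""
    schedule, t = [], start
    for name, minutes in PHASES:
        end = t + timedelta(minutes=minutes)
        schedule.append((name, t, end))
        t = end
    return schedule

for name, s, e in build_schedule(datetime(2021, 6, 1, 14, 0)):
    print(f"{s:%H:%M}-{e:%H:%M}  {name}")
```

A session starting at 14:00 thus wraps up by 16:00, which makes it easy to fit into a single afternoon block.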

The following are techniques we identified as being useful. We believe that these should be valid for literature reviews on almost any topic. Note, though, that we focused primarily on questions within the social sciences, so take this advice with a grain of salt if you are doing technical work.

  • Focus on breadth first! It’s best to have a pile of papers ready to look at before diving too deep into any given paper. Some papers are far more valuable than others, and if you dive into the first one you find, you might miss the highest-value papers. Searching smartly is often more effective than going down the citation trail (unless you find the right meta-analysis, that is). Be aware, however, that it’s possible to spend too much time on this step (like any of the others), though we tend to think people usually spend too little. It might even be worth setting a timer for this step to make sure you don’t cut it off too soon or too late. For the exercise I’d suggest (with high uncertainty) 15 minutes; for a real literature review it really depends on context.

  • Try a variety of search terms: It’s almost always useful to spend a few minutes early on generating synonyms and other potentially useful terms to search for. You probably want to be thinking about good search terms consistently in the early stages of your lit review. It’s easy to think you’ve searched for all the relevant terms once you’ve found something a bit promising, so it’s good to think a bit outside the box early on and keep looking for new search terms throughout the process. Some search terms can yield much better results than others, sometimes surprisingly. For example, we once found that “private tutoring effect” yielded irrelevant papers while “one-on-one tutoring effect” gave exactly what we were looking for.
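One way to make the synonym-generation step concrete is to list synonym groups for each concept in your question and mechanically combine them into candidate queries. A minimal sketch (the tutoring-related terms echo the example above; the exact groups are illustrative):

```python
from itertools import product

# Hypothetical synonym groups for a question about tutoring effects;
# one group per concept in the question.
SYNONYM_GROUPS = [
    ["one-on-one tutoring", "private tutoring", "individual instruction"],
    ["effect", "effectiveness", "impact"],
]

def candidate_queries(groups):
    """Combine one term from each group into candidate search strings."""
    return [" ".join(combo) for combo in product(*groups)]

queries = candidate_queries(SYNONYM_GROUPS)
# 3 x 2 groups give 9 combinations, including "one-on-one tutoring effect"
```

Even if most combinations are duds, enumerating them up front makes it less likely you stop searching after the first promising hit.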

  • Think before you search: For some questions it can help to first make a rough model/writeup of how you would answer the question, or to break the question down into its constituent parts, so that you can search through the topic in a way that makes sense to you and/or answers questions of interest posed by the model. E.g. if you are researching what makes people good researchers, you might find it useful to first write out what you think the answer is, which factors seem most important, etc. While this may seem to carry the risk of biasing your search, I’d argue that it’s better to have explicit, known possibilities of bias than to let implicit assumptions drive your search without you being aware of them. This way you can:

    • Be wrong, notice that you were wrong, and thus update your thinking. Otherwise you might read through new information and think you already knew it, possibly leading to you not incorporating the new information well.

    • Consciously look for information that contrasts with what you think (as opposed to the possibility of having an implicit bias in your search)

    • Find useful search terms by breaking down the area into component parts which seem more likely to have a solid academic literature around them
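As a sketch of what such a breakdown might look like in practice, here is the good-researchers example decomposed into components, each with candidate search terms. The sub-questions and terms below are purely illustrative, not a claim about what the real answer involves:

```python
# Hypothetical decomposition of "what makes people good researchers?"
# into component parts, each with candidate search terms that seem more
# likely to map onto an established academic literature.
BREAKDOWN = {
    "cognitive traits": [
        "scientific creativity predictors",
        "intelligence research productivity",
    ],
    "training and environment": [
        "research mentorship outcomes",
        "doctoral advisor effect productivity",
    ],
    "work habits": ["deliberate practice expert performance"],
}

# Flatten into an initial search queue, breadth-first across components.
search_queue = [term for terms in BREAKDOWN.values() for term in terms]
```

Writing the breakdown out like this doubles as the explicit model mentioned above: each component is a prediction you can later notice was missing or wrong.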

  • Connectedpapers.com is a useful search tool on almost all occasions once you’ve found a relevant paper to feed into it. Keep in mind, though, that it sometimes fails to make the right connections, so it’s good practice to generate graphs with a few different papers.

    • In order to find meta-analyses with this tool, it can be helpful to use the derivative works tab once you’ve generated a graph with a useful paper.

  • To summarize findings, I (Alex) generally make a list of potential answers to the question and include the citations for each. This might be worth trying, though we have not verified that it is a reliable way to consolidate a literature review.
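A minimal sketch of that summary format, assuming findings are kept as (answer, citations) pairs. The entries below are placeholders, not real papers:

```python
# Candidate answers to the review question, each with supporting citations.
findings = [
    ("Candidate answer 1", ["Author A 2019", "Author B 2020"]),
    ("Candidate answer 2", ["Author C 2017"]),
]

def render_summary(findings):
    """Render the answers and their citations as a bullet list."""
    return "\n".join(
        f"- {answer} ({'; '.join(citations)})" for answer, citations in findings
    )

print(render_summary(findings))
```

Keeping the citations attached to each candidate answer is what makes the output easy to pick back up after a month away, per the framing suggested earlier.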

Thanks to Nora Amman, Jan Brauner, and Ben Snodin for feedback. Thanks to Nora Amman, Ondrej Bajgar, Jan Brauner, Lukas Finnveden, and others for attending the sessions, coming up with some of the above ideas, and helping improve the process!