Interview with Prof Tetlock on epistemic modesty, predicting catastrophic risks, AI, and more

80,000 Hours did an interview with Professor Tetlock, one of the world's top experts on how to have accurate beliefs about the future. We asked him about a bunch of questions of interest to the community here:

  • Should people who want to be right just adopt the views of experts rather than apply their own judgement?

  • Why are Berkeley undergrads worse forecasters than dart-throwing chimps?

  • How can listeners contribute to his latest cutting-edge research?

  • What do we know about our accuracy at predicting low-probability, high-impact disasters?

  • Does his research provide an intellectual basis for populist political movements?

  • Was the Iraq War caused by bad politics, or bad intelligence methods?

  • Can experience help people avoid overconfidence and underconfidence?

  • When does an AI easily beat human judgement?

  • Could more accurate forecasting methods make the world more dangerous?

  • What are the odds we'll go to war with China?

  • Should we let prediction tournaments run most of the government?

Here's a preview:

Robert Wiblin: There's a very active debate in the effective altruism community at the moment about how much people should adopt the inside view versus the outside view, and how much they should just defer to mass opinion on important questions, or just defer to the average view of a bunch of experts. Do you have any views on that? There's some people promoting a very radical view, basically, that you should almost ignore your own inside view and only look at the reported views of other people, or give your own inside view no more weight than anyone else's. Do you think that's a good approach to having more accurate beliefs?

Philip Tetlock: I've never been able to impose that kind of monastic discipline on myself. The division between the inside and the outside view is blurry on close inspection. I mean, if you start off your date with a base rate probability of divorce for the couple being 35%, then you … Information comes in about quarrels or about this or about that, you're going to move your probabilities up or down. That's kind of inside view information, and that's proper belief updating.

Getting the mechanics of belief updating right is very tricky, and there's a problem of both cognitive conservatism, under-adjusting your beliefs in response to new evidence, and also the problem of excess volatility, over-adjusting and spiking around too much. Both of which can obviously degrade accuracy...

Continue reading...
