[Link] Book Review: The Secret Of Our Success | Slate Star Codex


Why are people so bad at reasoning? For the same reason they’re so bad at letting poisonous spiders walk all over their face without freaking out. Both “skills” are really bad ideas, most of the people who tried them died in the process, so evolution removed those genes from the population, and successful cultures stigmatized them enough to give people an internalized fear of even trying.

I’m really glad that I read this, and, to be honest, a little disturbed by it. I was left with the sense that it contains important knowledge that is undervalued and that I hadn’t previously been exposed to.

Summary of why this is a worthwhile read for people interested in EA:

  • EA involves, or at least seems intertwined with, heavy use of rationality, both to identify the most important problems and to solve them.

  • This post by Scott Alexander presents a compelling case for some of the downsides of rationality.

  • It also presents a case for cultural evolution being a, or the, key force for human progress.

  • In the pursuit of doing as much good as possible, with the assistance of rationality, it seems useful for EAs and the EA community to understand the historical challenges with rationality, as well as the importance of cultural evolution to human progress over the long-term future.

For what it’s worth, I still have lots of open questions. But it seems like this book, and the review, both contain potentially important and under-discussed ideas.