[This comment isn’t meant to signal any opinion about the rest of your post.]
Carlsmith’s report in particular is highly interdisciplinary, drawing on technical AI, economics, and philosophy, though it doesn’t make many technical AI or economics claims. It’s not really clear who would be most qualified to write it, but in general a philosopher doesn’t seem like a bad choice. In fact, I’d think the average philosopher with strong quantitative skills would be better at this than the average economist, and certainly better than the average AI researcher.
Whether a more experienced philosopher should have done it is another question, but I’d imagine that even with money Open Phil cannot summon very experienced experts to write reports for them at the drop of a hat.
The flip side here is that What We Owe the Future isn’t really a philosophy book, or at the very least it reads quite differently to me from other analytic philosophy books. And indeed Will consulted many experts extensively.
My argument here is that Will probably has one of the best, if not the best, understandings of longtermism and EA at a ‘theoretical’ level of anyone in the world. This made him incredibly well placed to essentially ‘set the direction’ of the research and identify what to focus on in WWOTF. He was then able to engage with individual experts on the individual chapters. He has demonstrated an ability to write compelling, engaging books (Doing Good Better), so he should be able to tie expert research together into a readable book. Overall he seems like an incredibly good choice to write WWOTF.
I’d imagine that even with money Open Phil cannot summon very experienced experts to write reports for them at the drop of a hat.
Maybe. Maybe not. This makes me think of the Stern Review, which, incidentally, wasn’t actually written by a world-renowned expert but was led by one:
On 19 July 2005 the Chancellor of the Exchequer, Gordon Brown announced that he had asked Sir Nicholas Stern to lead a major review of the economics of climate change, to understand more comprehensively the nature of the economic challenges and how they can be met, in the UK and globally.[13] The Stern Review was prepared by a team of economists at HM Treasury; independent academics were involved as consultants only. The scientific content of the Review was reviewed by experts from the Walker Institute.[14]
Maybe this would be a good model for research for EA organisations?