“For instance if a field has lots of money and not much talent this shouldn’t be lumped together into ‘this field is somewhat neglected’, it implies that putting in talent should be prioritised.”
I think this is a good point to raise, especially since it can be easy to forget/overlook.
“The existing EA view of neglectedness I think is that the less resources a field has going into it the more worthwhile it is to work in the field because there are diminishing marginal returns to inputs but broadly constant returns to U from each u.“
I do think that more EAs (or at least those on the forum) have started recognizing this as an oversimplification.
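To make the oversimplification concrete, here is a hypothetical sketch (my own illustration, not from the post): with a toy Cobb-Douglas output function over separate money and talent inputs, the marginal return depends on *which* input is scarce, so a single "neglectedness" scalar can point the wrong way.

```python
# Toy illustration: U = M^a * T^b with separate money (M) and talent (T).
# The exponents a, b and the input levels below are arbitrary assumptions.

def output(money, talent, a=0.5, b=0.5):
    return (money ** a) * (talent ** b)

def marginal(f, money, talent, which, eps=1e-6):
    """Finite-difference marginal return to one input, holding the other fixed."""
    if which == "money":
        return (f(money + eps, talent) - f(money, talent)) / eps
    return (f(money, talent + eps) - f(money, talent)) / eps

# A field with lots of money but little talent:
m, t = 1000.0, 10.0
mr_money = marginal(output, m, t, "money")
mr_talent = marginal(output, m, t, "talent")
print(mr_money, mr_talent)  # marginal return to talent far exceeds that to money
```

With these (made-up) numbers the marginal return to talent is roughly 100x the marginal return to money, even though total resources in the field are the same either way, which is exactly the "lots of money, not much talent" case from the quote.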
“ Within a good context this could be EA ideas. For instance within climate change applying EA ideas could dramatically increase the productivity of the money and talent already in the field.”
I think it’s good to consider this line of reasoning in various cases, but in the specific case of climate change there’s good reason to expect that the field’s already high level of attention and idea crowding will significantly limit the potential influence of EA ideas.
Overall, I have long been a proponent of thinking more carefully about what neglectedness means and how it affects prioritization, and it’s interesting to see an attempt at more formalization. That said (it may just be how the post renders on my phone), I think it would be helpful to make the equations/functions more readable and visually distinct from the surrounding text, e.g., as images or via markdown subscript formatting where possible. It may also help to explain the concepts in plain writing before jumping into the equations, at least for less mathematically oriented readers.