Thank you for these thoughtful comments.
Regarding exploration vs. exploitation:
First, my understanding of what you mean by this is that exploration involves taking time to learn more about an area, whereas exploitation involves focusing on trying to make an impact within that area. On one hand, it can be important to learn more in order to better orient oneself in the right direction. On the other hand, spending too much time on exploration can mean not making much of an impact. My apologies if this is not what you intended.
There is often a need to balance learning more with getting things done. In theory, one could calculate the optimal balance; the math would look similar to optimal economic growth models, e.g. the Ramsey–Cass–Koopmans model. In practice, it’s perhaps more of an art: it’s usually not worth building advanced mathematical models for these everyday decisions, and we often wouldn’t know what to put into the models anyway.
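To sketch the analogy: in the Ramsey–Cass–Koopmans setup, a planner chooses how much output to consume now (roughly, exploitation) versus invest in capital for the future (roughly, exploration/learning), maximizing discounted utility over time:

```latex
% Planner's problem: choose the consumption path c(t) to maximize
% discounted utility, subject to capital accumulation.
\max_{c(t)} \int_{0}^{\infty} e^{-\rho t}\, u\big(c(t)\big)\, dt
\quad \text{s.t.} \quad
\dot{k}(t) = f\big(k(t)\big) - \delta k(t) - c(t)
```

Here $k$ is capital (in the career analogy, accumulated skills and knowledge), $f$ is the production function, $\delta$ is depreciation, and $\rho$ is the discount rate. The point is only that "invest vs. consume" has the same structure as "learn vs. do"; as noted above, actually solving such a model for one's own career decisions is rarely worthwhile.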
It is sometimes possible to avoid this sort of tradeoff. This occurs when people can “learn on the job”, making an impact while improving their understanding of the area. This is one concept behind the GCRI Advising and Collaboration Program. I think it’s good to pursue these sorts of opportunities wherever reasonably possible.
That said, I would generally advise early-career people to invest heavily in learning and in professional development more generally. Integrated across a career, I think there’s a lot of value in being a more highly skilled contributor. Exactly how much to invest will be a case-by-case matter.
Regarding the idea that my advice isn’t good:
I would expect that my advice is probably pretty good, that other knowledgeable people probably also have good advice, and that the best insights will come from considering a mix of different advice.
Regarding roles for software developers:
The most obvious role is in AI safety and ethics. My understanding is that a lot of people with software backgrounds work in this space. This is one area in which other people would probably have better advice than I do, because it isn’t my own focus.
Within my areas, there’s relatively little computational work, because we don’t have much data to crunch. Global catastrophes fortunately don’t happen very often, but this rarity creates a data scarcity that makes the analysis less computational and more interpretive.