How do you decide on which research areas to focus on and, relatedly, how do you decide how to allocate money to them?
We do broadly aim to maximize the cost-effectiveness of our research work and so we focus on allocating money to opportunities that we think are most cost-effective on the margin.
Given that, it may be surprising that we work in multiple cause areas, but we face some interesting constraints and considerations:
There is significant uncertainty about which priority area is most impactful. RP's general approach has been that we can scale up multiple high-quality research teams across a variety of cause areas more easily than we can figure out which single cause area we ought to prioritize. That said, we recently hired a Worldview Investigations Team to work much more on the broader question of how to allocate an EA portfolio, and we are also investing considerably more in our own impact assessment. Together, we hope these will give us more insight into how to allocate our work going forward.
There may be diminishing returns to RP focusing on any one priority area.
A large share of our resources is not fungible across these different areas. The marginal opportunity cost of taking restricted funding is fairly low, since we could not easily reallocate those resources to other areas even if we were convinced they were higher impact.
Work in any single area can benefit from our working on multiple areas: teams have much greater access to centralized resources, staff, funding, and productive oversight than they would if each team existed independently and focused solely on that priority. Relationships built in one area could also potentially be useful for work in another.
Working across different priorities allows the organization to build capacity, reputation, and relationships, and to maintain option value for the future.
Thanks for this. I notice that all of these reasons are points in favor of working on multiple causes and seem to neglect considerations that would go in the other direction. And clearly, you take these considerations seriously too (e.g., scale and urgency), as you recently decided to focus exclusively on AI within the longtermism team.