Interesting to see Economic Growth listed as an area of interest. Generally the EA movement has preferred more targeted ways to improve the far future, such as working to reduce specific existential risks or pursuing differential technological development. This seems to me to be a slight deviation from the norm.
Do you have a sense of the relative importance of work on economic growth versus work on a specific existential risk such as AI or biosecurity?
Are those areas of interest on your website in rough order of importance, or is the order random?
We have a more robust interest in neglected existential risks, such as AI and bio. However, we think the issues discussed in our economic growth section are good from a longtermist POV, and we’d like to see what ideas people put forward.
Our areas of interest aren't listed in order of priority, and there's internal disagreement about what that order should be.