Relevant context for those unaware: supposedly, Good Ventures (and by extension OpenPhil) has recently decided to pull out of funding artificial sentience.
Can you give some examples of topics that qualify and some that don’t qualify as “EA priorities”?
I feel like for the purpose of getting the debate started, the vague question is fine. For the purpose of measuring agreement/disagreement and actually directly debating the statement, it’s potentially problematic. Does EA as a whole have priorities? And how much of a priority should artificial sentience be?
Interesting distinction, thank you! I’m thinking of a chart like this, which represents descriptive or revealed “EA Priorities”
(Link to spreadsheet here, and original Forum post here). The question is (roughly) whether Artificial Welfare should take up 5% of that right-hand bar or not. And similarly for EA talent distribution (for which I don’t have a graph to hand).
As a more general point: I think we can say that EA has priorities, insofar as funders and individuals, in their self-reported EA decisions, clearly have priorities. We will be arguing about prescriptive priorities (what EAs should do), while paying attention to descriptive priorities (what EAs already do).
I like this!