Here's a thought you might have: "AI timelines are so short that they significantly reduce the net present value[1] of {high school, college} students." I think that view is tempting based on e.g. (1, 2). However, I claim that there's a "play to your outs" effect here. Not in an "AGI is hard" way, but in a "we slow down the progress of AI capability development" way.
In those worlds, we get more time. And with that extra time, it sure would seem great to have a substantial fraction of the young workforce care about x-risk / be thinking with EA principles. Given the success we've had historically in convincing young people of the value of these ideas, it still seems pretty promising for some portion of our community to keep putting work into doing so.
[1] By this I mean the lifetime impact of the student, discounted over time.
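As a rough sketch of what "discounted over time" could mean here (my formalization, not spelled out in the original): if a student produces impact $v_t$ in year $t$, $r$ is an annual discount rate, and $T$ is the horizon over which their career still matters (the thing short timelines shrink), then

$$\mathrm{NPV} = \sum_{t=0}^{T} \frac{v_t}{(1+r)^t}.$$

The "play to your outs" point is that in the worlds where capability progress is slowed, $T$ stays large, so the later terms that short-timeline views write off still count for something.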
Many young people (zoomers) care a lot about climate change, as far as I know. I think adding the hopefully-small extra thought of "how about going over all the things that might destroy the world and prioritizing among them (instead of sticking with the first one you found)" might go a long way.