Ya, those were some of the kinds of things I had in mind, and also the possibility of contributing to or reducing s-risks, and the question of how much weight to give s-risks vs. extinction risks:
https://arbital.com/p/hyperexistential_separation/
https://reducing-suffering.org/near-miss/
Because of the funding situation, the resources taken away from other efforts to reduce extinction risk would probably mostly be people's time, e.g. the time of those supervising you, reading your work, or otherwise engaging with you. If an AI safety org hires you or you get a grant to work on something, then presumably they think you're worth that time, though! And one more person going through the hiring or grant process is not that costly for those managing it.