
Aaron_Scher

Karma: 555

I’m Aaron. I’ve been doing university group organizing at the Claremont Colleges for a while. My current cause prioritization is AI alignment.

Five neglected work areas that could reduce AI risk

Aaron_Scher · 24 Sep 2023 2:09 UTC
22 points
0 comments · 9 min read · EA link

It’s not obvious that getting dangerous AI later is better

Aaron_Scher · 23 Sep 2023 5:35 UTC
23 points
9 comments · 16 min read · EA link

The Unilateralist’s Curse, An Explanation

Aaron_Scher · 9 Jun 2022 18:25 UTC
27 points
2 comments · 7 min read · EA link

Why didn’t we turn it off? A creative fictional story of AI takeover

Aaron_Scher · 3 May 2022 14:13 UTC
5 points
0 comments · 6 min read · EA link

A visualization of some orgs in the AI Safety Pipeline

Aaron_Scher · 10 Apr 2022 16:52 UTC
11 points
8 comments · 1 min read · EA link

[Question] Analogy of AI Alignment as Raising a Child?

Aaron_Scher · 19 Feb 2022 21:40 UTC
4 points
2 comments · 1 min read · EA link

EA Claremont Winter 21/22 Intro Fellowship Retrospective

Aaron_Scher · 21 Jan 2022 6:15 UTC
14 points
0 comments · 11 min read · EA link

We should be paying Intro Fellows

Aaron_Scher · 25 Dec 2021 10:23 UTC
28 points
11 comments · 6 min read · EA link

Pilot study results: Cost-effectiveness information did not increase interest in EA

Aaron_Scher · 19 Dec 2021 8:22 UTC
33 points
4 comments · 4 min read · EA link

Aaron_Scher’s Quick takes

Aaron_Scher · 27 Oct 2021 7:32 UTC
1 point
13 comments · 1 min read · EA link