
Stanford Existential Risks Initiative


Stanford Existential Risks Initiative (SERI) is a collaboration between faculty and students of Stanford University dedicated to mitigating global catastrophic risks.[1] It was launched on 15 May 2020.[2]

SERI runs an undergraduate summer research program, various speaker events and discussions, and a class by Professors Stephen Luby and Paul Edwards on “preventing human extinction”.[3]

Funding

As of June 2022, SERI has received over $1.5 million in funding from Open Philanthropy,[4] $125,000 from the Survival and Flourishing Fund,[5] and $60,000 from Effective Altruism Funds.[6]

External links

Stanford Existential Risks Initiative. Official website.

Apply for a job.

Related entries

Cambridge Existential Risks Initiative | fellowships & internships | field building | research training programs

1. Stanford Existential Risks Initiative (2021) Our mission, Stanford Existential Risks Initiative.

2. Veit, Cooper (2020) Stanford Existential Risk Initiative tackles global threats, The Stanford Daily, May 20.

3. Luby, Stephen & Paul Edwards (2020) Preventing human extinction, Course syllabus, Stanford University.

4. Open Philanthropy (2022) Grants database: Stanford Existential Risks Initiative, Open Philanthropy.

5. Survival and Flourishing Fund (2019) SFF-2020-H2 S-process recommendations announcement, Survival and Flourishing Fund.

6. Long-Term Future Fund (2021) May 2021: Long-Term Future Fund grants, Effective Altruism Funds, May.

Lessons from Running Stanford EA and SERI

kuhanj, 20 Aug 2021 14:51 UTC
264 points
26 comments · 23 min read · EA link

Summer Research Internship: Stanford Existential Risks Initiative, Deadline April 21st

kuhanj, 16 Apr 2021 22:33 UTC
68 points
2 comments · 1 min read · EA link

Feedback I’ve been giving to junior x-risk researchers

Will Aldred, 15 Aug 2022 20:46 UTC
145 points
2 comments · 5 min read · EA link

2023 Stanford Existential Risks Conference

elizabethcooper, 24 Feb 2023 17:49 UTC
29 points
5 comments · 1 min read · EA link

Apply to the Stanford Existential Risks Conference! (April 17-18)

kuhanj, 26 Mar 2021 18:28 UTC
26 points
2 comments · 1 min read · EA link

Apply for Stanford Existential Risks Initiative (SERI) Postdoc

Vael Gates, 14 Dec 2021 21:50 UTC
28 points
2 comments · 1 min read · EA link

Fanaticism in AI: SERI Project

Jake Arft-Guatelli, 24 Sep 2021 4:39 UTC
7 points
2 comments · 5 min read · EA link

2021 AI Alignment Literature Review and Charity Comparison

Larks, 23 Dec 2021 14:06 UTC
176 points
18 comments · 73 min read · EA link

How can economists best contribute to pandemic prevention and preparedness?

Rémi T, 22 Aug 2021 20:49 UTC
56 points
3 comments · 23 min read · EA link

What Questions Should We Ask Speakers at the Stanford Existential Risks Conference?

kuhanj, 10 Apr 2021 0:51 UTC
21 points
2 comments · 2 min read · EA link

Long-Term Future Fund: May 2021 grant recommendations

abergal, 27 May 2021 6:44 UTC
110 points
17 comments · 57 min read · EA link

SERI ML application deadline is extended until May 22.

Viktoria Malyasova, 22 May 2022 0:13 UTC
13 points
3 comments · 1 min read · EA link

Stanford Existential Risks Conference

Jordan Pieters 🔸, 21 Apr 2023 20:32 UTC
6 points
0 comments · 1 min read · EA link

[Job Ad] SERI MATS is hiring for our summer program

annashive, 26 May 2023 4:51 UTC
8 points
1 comment · 7 min read · EA link

Does sequence obfuscation present a bio-threat? Probably not (yet)

Ben Stewart, 3 Sep 2021 9:33 UTC
18 points
0 comments · 4 min read · EA link

SERI ML Alignment Theory Scholars Program 2022

Ryan Kidd, 27 Apr 2022 16:33 UTC
57 points
2 comments · 3 min read · EA link

Assessing SERI/CHERI/CERI summer program impact by surveying fellows

L Rudolf L, 26 Sep 2022 15:29 UTC
102 points
11 comments · 15 min read · EA link

Launching the SERI Summer Research Fellowship

Sage Bergerson, 1 Apr 2022 6:38 UTC
28 points
2 comments · 1 min read · EA link

Nuclear Espionage and AI Governance

GAA, 4 Oct 2021 18:21 UTC
32 points
3 comments · 24 min read · EA link

Transformative AI and Compute [Summary]

lennart, 23 Sep 2021 13:53 UTC
61 points
5 comments · 9 min read · EA link

SERI MATS—Summer 2023 Cohort

a_e_r, 8 Apr 2023 15:32 UTC
36 points
2 comments · 1 min read · EA link

Decomposing Biological Risks: Harm, Potential, and Strategies

simeon_c, 14 Oct 2021 7:09 UTC
26 points
3 comments · 9 min read · EA link

Compute Research Questions and Metrics—Transformative AI and Compute [4/4]

lennart, 28 Nov 2021 22:18 UTC
18 points
2 comments · 1 min read · EA link

What is Compute? - Transformative AI and Compute [1/4]

lennart, 23 Sep 2021 13:54 UTC
48 points
6 comments · 18 min read · EA link

Forecasting Compute—Transformative AI and Compute [2/4]

lennart, 1 Oct 2021 8:25 UTC
39 points
6 comments · 19 min read · EA link

Compute Governance and Conclusions—Transformative AI and Compute [3/4]

lennart, 14 Oct 2021 7:55 UTC
20 points
3 comments · 5 min read · EA link

SERI MATS Program—Winter 2022 Cohort

Ryan Kidd, 8 Oct 2022 19:09 UTC
50 points
4 comments · 1 min read · EA link