
Alignment Research Center

Last edit: 3 Jul 2022 18:12 UTC by Leo

The Alignment Research Center (ARC) is a non-profit research organization focused on AI alignment. It was founded in 2021 by Paul Christiano.[1]

Funding

As of July 2022, ARC has received over $260,000 in funding from Open Philanthropy.[2]

Further reading

Christiano, Paul (2021) Announcing the Alignment Research Center, AI Alignment, April 27.

External links

Alignment Research Center. Official website.

Apply for a job.

  1. ^ Christiano, Paul (2021) Announcing the Alignment Research Center, AI Alignment, April 27.

  2. ^ Open Philanthropy (2022) Grants database: Alignment Research Center, Open Philanthropy.

ARC is hiring alignment theory researchers

Paul_Christiano · 14 Dec 2021 20:17 UTC
89 points
4 comments · 2 min read · EA link

2021 AI Alignment Literature Review and Charity Comparison

Larks · 23 Dec 2021 14:06 UTC
176 points
18 comments · 75 min read · EA link

Christiano (ARC) and GA (Conjecture) Discuss Alignment Cruxes

Andrea_Miotti · 24 Feb 2023 23:03 UTC
16 points
1 comment · 1 min read · EA link

What I would do if I wasn’t at ARC Evals

Lawrence Chan · 6 Sep 2023 5:17 UTC
130 points
4 comments · 13 min read · EA link
(www.lesswrong.com)

ARC Evals: Responsible Scaling Policies

Zach Stein-Perlman · 28 Sep 2023 4:30 UTC
16 points
1 comment · 1 min read · EA link
(evals.alignment.org)

ARC is hiring theoretical researchers

Jacob_Hilton · 12 Jun 2023 19:11 UTC
78 points
0 comments · 4 min read · EA link
(www.lesswrong.com)

Safety evaluations and standards for AI | Beth Barnes | EAG Bay Area 23

Beth Barnes · 16 Jun 2023 14:15 UTC
27 points
0 comments · 17 min read · EA link