Alignment Research Center

Last edit: Jul 3, 2022, 6:12 PM by Leo

The Alignment Research Center (ARC) is a non-profit research organization focused on AI alignment. It was founded in 2021 by Paul Christiano.[1]

Funding

As of July 2022, ARC has received over $260,000 in funding from Open Philanthropy.[2]

Further reading

Christiano, Paul (2021) Announcing the Alignment Research Center, AI Alignment, April 27.

External links

Alignment Research Center. Official website.

Apply for a job.

  1. ^ Christiano, Paul (2021) Announcing the Alignment Research Center, AI Alignment, April 27.

  2. ^ Open Philanthropy (2022) Grants database: Alignment Research Center, Open Philanthropy.

ARC is hiring alignment theory researchers

Paul_Christiano · Dec 14, 2021, 8:17 PM
89 points
4 comments · 1 min read · EA link

2021 AI Alignment Literature Review and Charity Comparison

Larks · Dec 23, 2021, 2:06 PM
176 points
18 comments · 73 min read · EA link

Christiano (ARC) and GA (Conjecture) Discuss Alignment Cruxes

Andrea_Miotti · Feb 24, 2023, 11:03 PM
16 points
1 comment · 1 min read · EA link

What I would do if I wasn’t at ARC Evals

Lawrence Chan · Sep 6, 2023, 5:17 AM
130 points
4 comments · 13 min read · EA link
(www.lesswrong.com)

ARC Evals: Responsible Scaling Policies

Zach Stein-Perlman · Sep 28, 2023, 4:30 AM
16 points
1 comment · 1 min read · EA link
(evals.alignment.org)

ARC is hiring theoretical researchers

Jacob_Hilton · Jun 12, 2023, 7:11 PM
78 points
0 comments · 4 min read · EA link
(www.lesswrong.com)

Safety evaluations and standards for AI | Beth Barnes | EAG Bay Area 23

Beth Barnes · Jun 16, 2023, 2:15 PM
28 points
0 comments · 17 min read · EA link