Greg_Colbourn

Karma: 4,927

Global moratorium on AGI, now (Twitter). Founder of CEEALAR (née the EA Hotel; ceealar.org)

Applications open: Support for talent working on independent learning, research or entrepreneurial projects focused on reducing global catastrophic risks

CEEALAR · 9 Feb 2024 13:04 UTC
63 points
1 comment · 2 min read · EA link

Funding circle aimed at slowing down AI—looking for participants

Greg_Colbourn · 25 Jan 2024 23:58 UTC
92 points
2 comments · 2 min read · EA link

Job Opportunity: Operations Manager at CEEALAR

Beth Anderson · 21 Dec 2023 14:24 UTC
13 points
1 comment · 2 min read · EA link

Giving away copies of Uncontrollable by Darren McKee

Greg_Colbourn · 14 Dec 2023 17:00 UTC
39 points
2 comments · 1 min read · EA link

Timelines are short, p(doom) is high: a global stop to frontier AI development until x-safety consensus is our only reasonable hope

Greg_Colbourn · 12 Oct 2023 11:24 UTC
70 points
84 comments · 9 min read · EA link

Volunteering Opportunity: Trustee at CEEALAR

Beth Anderson · 5 Oct 2023 14:55 UTC
16 points
0 comments · 3 min read · EA link

Apply to CEEALAR to do AGI moratorium work

Greg_Colbourn · 26 Jul 2023 21:24 UTC
60 points
0 comments · 1 min read · EA link

Thoughts on yesterday’s UN Security Council meeting on AI

Greg_Colbourn · 19 Jul 2023 16:46 UTC
31 points
2 comments · 1 min read · EA link

UN Secretary-General recognises existential threat from AI

Greg_Colbourn · 15 Jun 2023 17:03 UTC
58 points
1 comment · 1 min read · EA link

Play Regrantor: Move up to $250,000 to Your Top High-Impact Projects!

Dawn Drescher · 17 May 2023 16:51 UTC
58 points
2 comments · 2 min read · EA link
(impactmarkets.substack.com)

P(doom|AGI) is high: why the default outcome of AGI is doom

Greg_Colbourn · 2 May 2023 10:40 UTC
13 points
28 comments · 3 min read · EA link

AGI rising: why we are in a new era of acute risk and increasing public awareness, and what to do now

Greg_Colbourn · 2 May 2023 10:17 UTC
68 points
35 comments · 13 min read · EA link

[Question] If your AGI x-risk estimates are low, what scenarios make up the bulk of your expectations for an OK outcome?

Greg_Colbourn · 21 Apr 2023 11:15 UTC
60 points
54 comments · 1 min read · EA link

Merger of DeepMind and Google Brain

Greg_Colbourn · 20 Apr 2023 20:16 UTC
11 points
12 comments · 1 min read · EA link
(blog.google)

Recruit the World’s best for AGI Alignment

Greg_Colbourn · 30 Mar 2023 16:41 UTC
34 points
8 comments · 22 min read · EA link

Adam Cochran on the FTX meltdown

Greg_Colbourn · 17 Nov 2022 11:54 UTC
15 points
7 comments · 1 min read · EA link
(twitter.com)

Why didn’t the FTX Foundation secure its bag?

Greg_Colbourn · 15 Nov 2022 19:54 UTC
57 points
34 comments · 2 min read · EA link

[Question] Who would you have on your dream team for solving AGI Alignment?

Greg_Colbourn · 25 Aug 2022 13:34 UTC
10 points
14 comments · 1 min read · EA link

[Question] What bank accounts are UK charities using?

Greg_Colbourn · 7 Apr 2022 15:41 UTC
8 points
3 comments · 1 min read · EA link

AGI x-risk timelines: 10% chance (by year X) estimates should be the headline, not 50%.

Greg_Colbourn · 1 Mar 2022 12:02 UTC
67 points
22 comments · 1 min read · EA link