
Greg_Colbourn

Karma: 4,927

Global moratorium on AGI, now (Twitter). Founder of CEEALAR (née the EA Hotel; ceealar.org)

EA Hotel with free accommodation and board for two years

Greg_Colbourn · 4 Jun 2018 18:09 UTC
99 points
97 comments · 37 min read · EA link

Funding circle aimed at slowing down AI—looking for participants

Greg_Colbourn · 25 Jan 2024 23:58 UTC
92 points
2 comments · 2 min read · EA link

Timelines are short, p(doom) is high: a global stop to frontier AI development until x-safety consensus is our only reasonable hope

Greg_Colbourn · 12 Oct 2023 11:24 UTC
70 points
84 comments · 9 min read · EA link

AGI rising: why we are in a new era of acute risk and increasing public awareness, and what to do now

Greg_Colbourn · 2 May 2023 10:17 UTC
68 points
35 comments · 13 min read · EA link

AGI x-risk timelines: 10% chance (by year X) estimates should be the headline, not 50%.

Greg_Colbourn · 1 Mar 2022 12:02 UTC
67 points
22 comments · 1 min read · EA link

EA Hotel Fundraiser 2: Current guests and their projects

Greg_Colbourn · 4 Feb 2019 20:41 UTC
66 points
8 comments · 8 min read · EA link

[Question] If your AGI x-risk estimates are low, what scenarios make up the bulk of your expectations for an OK outcome?

Greg_Colbourn · 21 Apr 2023 11:15 UTC
60 points
54 comments · 1 min read · EA link

Apply to CEEALAR to do AGI moratorium work

Greg_Colbourn · 26 Jul 2023 21:24 UTC
60 points
0 comments · 1 min read · EA link

UN Secretary-General recognises existential threat from AI

Greg_Colbourn · 15 Jun 2023 17:03 UTC
58 points
1 comment · 1 min read · EA link

Why didn’t the FTX Foundation secure its bag?

Greg_Colbourn · 15 Nov 2022 19:54 UTC
57 points
34 comments · 2 min read · EA link