Greg_Colbourn ⏸️
Karma: 5,964
Global moratorium on AGI, now (Twitter). Founder of CEEALAR (née the EA Hotel; ceealar.org).
AI Risk timelines: 10% chance (by year X) should be the headline (and deadline), not 50%. And 10% is _this year_!
Greg_Colbourn ⏸️ · 5 Jan 2026 11:57 UTC · 16 points · 19 comments · 1 min read · EA · link

Which side of the AI safety community are you in?
Greg_Colbourn ⏸️ · 23 Oct 2025 14:23 UTC · 12 points · 2 comments · 2 min read · EA · link (www.lesswrong.com)

Pause House, Blackpool
Greg_Colbourn ⏸️ · 13 Oct 2025 11:36 UTC · 84 points · 0 comments · 1 min read · EA · link (gregcolbourn.substack.com)

How We Might All Die in A Year
Greg_Colbourn ⏸️ · 28 Mar 2025 13:31 UTC · 14 points · 6 comments · 21 min read · EA · link (x.com)

Frontier AI systems have surpassed the self-replicating red line
Greg_Colbourn ⏸️ · 10 Dec 2024 16:33 UTC · 25 points · 14 comments · 1 min read · EA · link (github.com)

“Near Midnight in Suicide City”
Greg_Colbourn ⏸️ · 6 Dec 2024 19:54 UTC · 5 points · 0 comments · 1 min read · EA · link (www.youtube.com)

OpenAI’s o1 tried to avoid being shut down, and lied about it, in evals
Greg_Colbourn ⏸️ · 6 Dec 2024 15:25 UTC · 23 points · 9 comments · 1 min read · EA · link (www.transformernews.ai)

Applications open: Support for talent working on independent learning, research or entrepreneurial projects focused on reducing global catastrophic risks
CEEALAR · 9 Feb 2024 13:04 UTC · 63 points · 1 comment · 2 min read · EA · link

Funding circle aimed at slowing down AI—looking for participants
Greg_Colbourn ⏸️ · 25 Jan 2024 23:58 UTC · 93 points · 3 comments · 2 min read · EA · link

Job Opportunity: Operations Manager at CEEALAR
Beth Anderson · 21 Dec 2023 14:24 UTC · 13 points · 1 comment · 2 min read · EA · link

Giving away copies of Uncontrollable by Darren McKee
Greg_Colbourn ⏸️ · 14 Dec 2023 17:00 UTC · 40 points · 2 comments · 1 min read · EA · link

Timelines are short, p(doom) is high: a global stop to frontier AI development until x-safety consensus is our only reasonable hope
Greg_Colbourn ⏸️ · 12 Oct 2023 11:24 UTC · 78 points · 83 comments · 9 min read · EA · link

Volunteering Opportunity: Trustee at CEEALAR
Beth Anderson · 5 Oct 2023 14:55 UTC · 16 points · 0 comments · 3 min read · EA · link

Apply to CEEALAR to do AGI moratorium work
Greg_Colbourn ⏸️ · 26 Jul 2023 21:24 UTC · 62 points · 0 comments · 1 min read · EA · link

Thoughts on yesterday’s UN Security Council meeting on AI
Greg_Colbourn ⏸️ · 19 Jul 2023 16:46 UTC · 31 points · 2 comments · 1 min read · EA · link

UN Secretary-General recognises existential threat from AI
Greg_Colbourn ⏸️ · 15 Jun 2023 17:03 UTC · 58 points · 1 comment · 1 min read · EA · link

Play Regrantor: Move up to $250,000 to Your Top High-Impact Projects!
Dawn Drescher · 17 May 2023 16:51 UTC · 58 points · 2 comments · 2 min read · EA · link (impactmarkets.substack.com)

P(doom|AGI) is high: why the default outcome of AGI is doom
Greg_Colbourn ⏸️ · 2 May 2023 10:40 UTC · 15 points · 28 comments · 3 min read · EA · link

AGI rising: why we are in a new era of acute risk and increasing public awareness, and what to do now
Greg_Colbourn ⏸️ · 2 May 2023 10:17 UTC · 70 points · 35 comments · 13 min read · EA · link

[Question] If your AGI x-risk estimates are low, what scenarios make up the bulk of your expectations for an OK outcome?
Greg_Colbourn ⏸️ · 21 Apr 2023 11:15 UTC · 65 points · 55 comments · 1 min read · EA · link