Greg_Colbourn
Karma: 5,520
Global moratorium on AGI, now (Twitter). Founder of CEEALAR (née the EA Hotel; ceealar.org)
Frontier AI systems have surpassed the self-replicating red line
Greg_Colbourn · 10 Dec 2024 16:33 UTC · 30 points · 14 comments · 1 min read · EA link (github.com)
“Near Midnight in Suicide City”
Greg_Colbourn · 6 Dec 2024 19:54 UTC · 5 points · 0 comments · 1 min read · EA link (www.youtube.com)
OpenAI’s o1 tried to avoid being shut down, and lied about it, in evals
Greg_Colbourn · 6 Dec 2024 15:25 UTC · 23 points · 9 comments · 1 min read · EA link (www.transformernews.ai)
Applications open: Support for talent working on independent learning, research or entrepreneurial projects focused on reducing global catastrophic risks
CEEALAR · 9 Feb 2024 13:04 UTC · 63 points · 1 comment · 2 min read · EA link
Funding circle aimed at slowing down AI—looking for participants
Greg_Colbourn · 25 Jan 2024 23:58 UTC · 92 points · 3 comments · 2 min read · EA link
Job Opportunity: Operations Manager at CEEALAR
Beth Anderson · 21 Dec 2023 14:24 UTC · 13 points · 1 comment · 2 min read · EA link
Giving away copies of Uncontrollable by Darren McKee
Greg_Colbourn · 14 Dec 2023 17:00 UTC · 39 points · 2 comments · 1 min read · EA link
Timelines are short, p(doom) is high: a global stop to frontier AI development until x-safety consensus is our only reasonable hope
Greg_Colbourn · 12 Oct 2023 11:24 UTC · 73 points · 85 comments · 9 min read · EA link
Volunteering Opportunity: Trustee at CEEALAR
Beth Anderson · 5 Oct 2023 14:55 UTC · 16 points · 0 comments · 3 min read · EA link
Apply to CEEALAR to do AGI moratorium work
Greg_Colbourn · 26 Jul 2023 21:24 UTC · 62 points · 0 comments · 1 min read · EA link
Thoughts on yesterday’s UN Security Council meeting on AI
Greg_Colbourn · 19 Jul 2023 16:46 UTC · 31 points · 2 comments · 1 min read · EA link
UN Secretary-General recognises existential threat from AI
Greg_Colbourn · 15 Jun 2023 17:03 UTC · 58 points · 1 comment · 1 min read · EA link
Play Regrantor: Move up to $250,000 to Your Top High-Impact Projects!
Dawn Drescher · 17 May 2023 16:51 UTC · 58 points · 2 comments · 2 min read · EA link (impactmarkets.substack.com)
P(doom|AGI) is high: why the default outcome of AGI is doom
Greg_Colbourn · 2 May 2023 10:40 UTC · 13 points · 28 comments · 3 min read · EA link
AGI rising: why we are in a new era of acute risk and increasing public awareness, and what to do now
Greg_Colbourn · 2 May 2023 10:17 UTC · 68 points · 35 comments · 13 min read · EA link
[Question] If your AGI x-risk estimates are low, what scenarios make up the bulk of your expectations for an OK outcome?
Greg_Colbourn · 21 Apr 2023 11:15 UTC · 62 points · 55 comments · 1 min read · EA link
Merger of DeepMind and Google Brain
Greg_Colbourn · 20 Apr 2023 20:16 UTC · 11 points · 12 comments · 1 min read · EA link (blog.google)
Recruit the World’s best for AGI Alignment
Greg_Colbourn · 30 Mar 2023 16:41 UTC · 34 points · 8 comments · 22 min read · EA link
Adam Cochran on the FTX meltdown
Greg_Colbourn · 17 Nov 2022 11:54 UTC · 15 points · 7 comments · 1 min read · EA link (twitter.com)
Why didn’t the FTX Foundation secure its bag?
Greg_Colbourn · 15 Nov 2022 19:54 UTC · 57 points · 34 comments · 2 min read · EA link