Vael Gates
Karma: 1,160
Offering AI safety support calls for ML professionals · Vael Gates · Feb 15, 2024 · 52 points · 1 comment · 1 min read · EA · link
Vael Gates: Risks from Highly-Capable AI (March 2023) · Vael Gates · Apr 1, 2023 · 31 points · 4 comments · 1 min read · EA · link (docs.google.com)
Interviews with 97 AI Researchers: Quantitative Analysis · Maheen Shermohammed · Feb 2, 2023 · 76 points · 4 comments · 7 min read · EA · link
Retrospective on the AI Safety Field Building Hub · Vael Gates · Feb 2, 2023 · 64 points · 2 comments · 9 min read · EA · link
“AI Risk Discussions” website: Exploring interviews from 97 AI Researchers · Vael Gates · Feb 2, 2023 · 46 points · 1 comment · 1 min read · EA · link
Predicting researcher interest in AI alignment · Vael Gates · Feb 2, 2023 · 30 points · 0 comments · 21 min read · EA · link (docs.google.com)
Resources I send to AI researchers about AI safety · Vael Gates · Jan 11, 2023 · 43 points · 0 comments · EA · link
What AI Safety Materials Do ML Researchers Find Compelling? · Vael Gates · Dec 28, 2022 · 130 points · 12 comments · EA · link
Announcing the AI Safety Field Building Hub, a new effort to provide AISFB projects, mentorship, and funding · Vael Gates · Jul 28, 2022 · 126 points · 6 comments · 6 min read · EA · link
Social scientists interested in AI safety should consider doing direct technical AI safety research (possibly meta-research), or governance, support roles, or community building instead · Vael Gates · Jul 20, 2022 · 65 points · 8 comments · 18 min read · EA · link
Vael Gates: Risks from Advanced AI (June 2022) · Vael Gates · Jun 14, 2022 · 45 points · 5 comments · 30 min read · EA · link
Transcripts of interviews with AI researchers · Vael Gates · May 9, 2022 · 140 points · 14 comments · 2 min read · EA · link
Apply for Stanford Existential Risks Initiative (SERI) Postdoc · Vael Gates · Dec 14, 2021 · 28 points · 2 comments · 1 min read · EA · link
What would you ask on MTurk? (I could possibly run a study for you) · Vael Gates · Nov 13, 2021 · 13 points · 3 comments · 1 min read · EA · link
Vael Gates’s Quick takes · Vael Gates · Nov 8, 2021 · 2 points · 7 comments · EA · link
Apply to be a Stanford HAI Junior Fellow (Assistant Professor - Research) by Nov. 15, 2021 · Vael Gates · Oct 31, 2021 · 15 points · 0 comments · 1 min read · EA · link
Seeking social science students / collaborators interested in AI existential risks · Vael Gates · Sep 24, 2021 · 58 points · 7 comments · 3 min read · EA · link
[Question] People working on x-risks: what emotionally motivates you? · Vael Gates · Jul 5, 2021 · 16 points · 8 comments · 1 min read · EA · link