More EAs should give rationalists a chance
My first impression of rationalists came at an AI safety retreat a few years ago. I had a bunch of conversations that were decidedly mixed and left me thinking that they weren’t taking the project of doing a large amount of good seriously, weren’t reasoning carefully (as opposed to just parroting rationalist memes), and weren’t any better at winning than the standard EA types I felt were more ‘my crowd’.
I now think that I just met the wrong rationalists early on. The rationalists that I most admire:
Care deeply about their values
Are careful reasoners, and actually want to work out what is true
Are able to disentangle their views from their identities, which makes meaningful conversations much easier to have
Are willing to seriously consider weird ideas that run against their current views
Calling yourself a rationalist or an EA is a very cheap signal, and I made an error early on (insensitivity to small sample sizes, etc.) in dismissing their community. Whilst there is still some stuff that I would change, I think that the median EA could move several steps in a ‘rationalist’ direction.
Having a rationalist/scout mindset plus caring a lot about impact is pretty correlated with me finding someone promising. Neither is essential for having a lot of impact, but I am starting to think that EAs are doing the altruism (A) part of EA super well and the rationalists are doing the effective (E) part super well.
My go-to resources are probably:
The Scout Mindset, by Julia Galef
The Codex, by Scott Alexander
The Sequences Highlights, by Eliezer Yudkowsky (LessWrong)