I have quite a few ideas for “heretical” articles going against the grain on EA/rationalism/AI risk. Here are a few you can look forward to, although I’ll be working slowly.
A list of unlikely ways that EA could destroy humanity
How motivation gaps explain the disparity in critique quality between pro- and anti-x-risk people
Why I don’t like single-number P(doom) estimates
The flaws with AI risk expert surveys
Why Drexler-style nanotech will probably never beat biological nanomachines in our lifetimes (even with AI)
Why the singularity will probably be disappointing
There are very few ways to reliably destroy humanity
Why AGI “boxing” is probably possible and useful
How to make AI “warning shots” more likely
Why I don’t like the “many-worlds” interpretation of quantum mechanics
Why “The Sequences” are overrated
Why Science beats Rationalism
I always look forward to titotal posts, so I’m very happy to see that there’s a healthy pipeline!
Small tongue-in-cheek word of warning: viciously attacking The Sequences can be a big sign of being a Rationalist, in the same way that constantly claiming “oh, I’m just EA-adjacent” is a surefire sign that someone is an EA :P