Warnings on Weirdness

I respect this post (“You have a set amount of ‘weirdness points’. Spend them wisely.”). I’ve come to really admire EA’s ability to make first-principles moral philosophy feel normal. But, I Have Some Things To Add. I sense disaster lurking here. Heed me well, EA, and you may avoid it.

The phrase “You have a set amount of weirdness points” seems to promote a mindset of weirdness-point austerity, and weirdness austerity would be devastating

You need to bear in mind all the ways that amount isn’t fixed.

I forget whether it was Harari or Ada Palmer (maybe it was both), but I have heard a lot about a transition humanity underwent, from a mindset of stasis and finitude to a mindset of growth. Once people started to believe that progress was a real thing, it became a real thing: investments were made, technology grew, methods modernized, and life improved steadily and tremendously.

We learned that we could make more of something that we had previously thought fixed.

It’s very important to believe that there can be more. Weirdness points are useful; we should want more of them. Weirdness is, indeed, incredibly important. The less willing people are to deviate from society and hold weird beliefs, the harder it will be to improve society.

Consider Elon Musk. There’s a sense in which everything he’s done is weird: reusable rockets, electric cars that were nothing like the electric cars we knew, Starlink, and all that excitement he produced about vacuum trains (which does seem to be amounting to something, gradually). But after a few successes, people updated fast. The more weird things he pulls off, the more people can believe that the next weird thing he wants to do will work. He figured out how to print weirdness points (stock).

Hideo Kojima makes a bizarre game about being a postal worker who hikes, and because he’s Hideo Kojima, every game enjoyer I know assumed it was going to be good and played it. Hideo Kojima is someone who kind of does not have a set number of weirdness points (or money??) to spend. He has as many points as he feels like making. Be like Kojima. Practice drawing people’s interest and earning people’s faith.

There are also many domains where people exhibit an endless appetite for novelty; there, weirdness is an attractor. They always want more. There is no set limit. Whatever psychological reaction is going on when a high-openness person pursues endlessly novel food or novel entertainment, we need to learn how to replicate it in civic and intellectual contexts.

In the same way that it would have been a tragedy if EA had convinced itself that it could not generate wealth and shouldn’t aspire to, EA will be much less than it could have been if it fails to explore routes to generating more faith or interest in the audience for novel ideas.

There is a wave still rising that must be ridden: weirdness is waxing and shows few signs of waning

The post was written in 2014. Power has shifted since then.

Trump was a weird and unexpected transition in global power, one whose reality everyone in the world is reminded of every day. Weirdness is here, and it is real and important. Everyone knows the normal world doesn’t exist any more; the weird thing turned out to be true instead. Covid-19 was a weird and unexpected thing that people like us could have forewarned of, and in fact Bill Gates was forewarning of it, and can now point to those speeches he made and say “I told you so” (though others are doing that for him). Covid-19 is a weird thing that forced its way into people’s lives, challenged them to deny its reality, then smote the people who did. If we forewarn of more weird but inevitable things, we can gain massive quantities of credibility in a field where original thinkers like us are the only real players.

Saying all this, I’m starting to think that normalcy might be entirely the wrong aesthetic for anyone who still wants to be listened to 10 years from now. There’s a dang cheeto in the White House, every city in the world is full of scooters, if Musk’s “full autonomy in 2020” commitment holds then soon the cars will be driving themselves, and it looks like meat will start to grow without animals. There’s chaos out there, and whoever can demonstrate the hidden order in it will be recognized as its stewards.

I suspect that few true things are objectively weird or non-weird; it’s all about the framing, and this framing is an art that we should all be practicing

Scott Alexander, describing Stuart Russell’s Human Compatible:

HC somehow makes risk from superintelligence not sound weird. I can imagine my mother reading this book, nodding along, feeling better educated at the end of it, agreeing with most of what it says (it’s by a famous professor! I’m sure he knows his stuff!) and never having a moment where she sits bolt upright and goes what? It’s just a bizarrely normal, respectable book.

The best way to communicate a weird theory is to explain why the world would be weirder (would make less sense) if the theory weren’t true: why the theory is inextricable from established normality, to the point of being fundamentally normal itself. To deny the weird thing would be to deny the normal things, which would be even weirder than the weird thing, so it can’t be dismissed for its weirdness. We can get very good at writing in that way. The principle of induction is on our side.

Personal recommendations: Explain cryonics as a contribution to the existing field of archaeology: it would be weird to think that the people of the far future wouldn’t want access to exceptionally well-preserved brains of historical people, or that the future wouldn’t be made much richer by having them. Frame simulationism (though I’m aware of no practical reason to talk about simulationism, but if there were) as a continuation of an existing human tendency towards simulation: it would be weirder if we, or whoever might have gone before us, didn’t eventually build extremely detailed simulated worlds. Explain AGI misalignment risk as a dreary continuation of humanity’s record of wartime and industrial disasters.

And then, the ultimate de-weirding reframing of all: it would be weirder if the community that got interested in working on its epistemic rationality, listening to applied philosophers, and trying to do good things, then didn’t end up finding lots and lots of really good ideas.

Once you convey that, you are no longer spending points when you promote EA ideas; you would have to spend points to not promote them, because you have flipped the final frame.

It may be hard to avoid internalizing professed non-weirdness, and if that happens, EA will rapidly lose its best qualities

EA’s value is that it creates a pathway between the ideas of original thinkers (philosophers) and a large audience of people who can act on their concerns. If normie aesthetics lead us to attract and embrace actual consummate normies who never learn to like original thinkers, that line will break. EA will stop being interested enough in hearing from wild thinkers like Robin Hanson to tolerate his missteps. It will distance itself from abnormally honest people like Peter Singer.

There is a thin line in a person’s head separating the attitude of avoiding promoting weird ideas on others’ behalf from avoiding promoting weird ideas because you have internalized the same aversion to weirdness as your audience. It is very easy to lose your way.

For a group of people, it isn’t obvious to me that there can be a line. Maybe! I sure hope so! But I’m definitely not going to be shocked if it turns out that the reflectivists are right and there is no way for a group to look like something it isn’t without eventually becoming it.

If that does happen, if EA shrinks to its least disruptive ideas and ossifies, I will probably not grieve very much. I will be ready to face it, I think, because I saw how the unprecedented thing was precedented by deeper rules.


In sum, if you find yourself highly constrained on weirdness points, there may be an underlying cashflow problem. Novelty tolerance should not be your bottleneck. Fix the cashflow problem, establish confidence with your audience, start the weirdness reinvestment cycles, reward faith, and you will barely have to worry about economizing on weirdness at all.

As for me, I’m going to be making video games, for games are places people often go when they’re looking for novelty. I have designed one intensely weird game about AI Boxing that people seemed to like. I’m now working on a tabletop game about agency and a puzzle game about induction-techne, which I hope will give people a sense for the sorts of cognitions that will set light to the sky. But these things are just a start.
