You Have About Five Words
Epistemic Status: all numbers are made up and/or sketchily sourced.
If you want to coordinate with one person on a thing, you can spend as much time as you want talking to them – answering questions in realtime, addressing confusions as you notice them.
You probably speak at around 100 words per minute. That’s 6,000 words per hour. If you talk for 3 hours a day, every workday for a year, you can communicate about 4.3 million words’ worth of nuance.
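(Spelled out, since the compounding is easy to lose track of: this is just the arithmetic above as a quick sketch, with ~240 workdays per year as an assumed round number that isn’t stated in the post.)

```python
# Rough arithmetic behind the "millions of words per person" figure above.
# The 240 workdays/year is an assumption, not something stated in the post.
words_per_minute = 100
words_per_hour = words_per_minute * 60            # 6,000
hours_per_day = 3
workdays_per_year = 240                           # assumed
words_per_year = words_per_hour * hours_per_day * workdays_per_year
print(f"{words_per_year:,} words per year")       # 4,320,000, i.e. ~4.3 million
```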
You can have a real conversation with up to 4 people.
(Last year the small organization I work at considered hiring a 5th person. It turned out to be very costly and we decided to wait, and I think the reasons were related to this phenomenon.)
If you want to coordinate with, say, 10 people, you realistically can ask them to read a few books’ worth of words. A book is maybe 50,000 words, so you have maybe 200,000 words’ worth of nuance.
Alternatively, you can monologue at people, scaling a conversation past the point where people can realistically ask questions. Either way, you need to hope that your books or your monologues happen to address the particular confusions your 10 teammates have.
If you want to coordinate with 100 people, you can ask them to read a few books, but chances are they won’t. They might all read a few books’ worth of stuff, but they won’t all have read the same books. The information that they can all coordinate on is more like “several blogposts.” If you’re trying to coordinate nerds, maybe those blogposts add up to one book, because nerds like to read.
If you want to coordinate 1,000 people… you realistically get one blogpost, or maybe one blogpost’s worth of jargon that’s hopefully self-explanatory enough to be useful.
If you want to coordinate thousands of people...
You have about five words.
This has ramifications for how complicated a coordinated effort you can attempt.
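(If it helps to see the scaling in one place: here’s a rough sketch of the made-up numbers above as a lookup. The thresholds and budgets are just this post’s loose estimates restated, not anything measured.)

```python
# Purely illustrative: the made-up audience-size -> word-budget scaling above.
# Thresholds and budgets are this post's rough estimates, not measured data.
def rough_word_budget(audience_size: int) -> str:
    if audience_size <= 4:
        return "millions of words (a real, back-and-forth conversation)"
    if audience_size <= 10:
        return "~200,000 words (a few books)"
    if audience_size <= 100:
        return "several blogposts, maybe one book if they're nerds"
    if audience_size <= 1_000:
        return "about one blogpost"
    return "about five words"

for n in (1, 10, 100, 1_000, 100_000):
    print(f"{n:>7,} people -> {rough_word_budget(n)}")
```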
What if you need all that nuance and to coordinate thousands of people? What would it look like if the world was filled with complicated problems that required lots of people to solve?
I guess it’d look like this one.
“Donate to Effective Charities.”
“AI will kill us.”
“Consider Earning to Give.”
“EA is Talent Constrained.”
I’ll probably refer people to this post when trying to explain why you totally need complex networks when you’re trying to coordinate on absolutely anything more complicated than what you can express in 4 words.
(Also: one friend pointed out that the word hierarchy comes from an organisation coordinating the efforts of more than a billion people across long time horizons.)
This post was inspired by a conversation with someone (the conversation partner can reveal themselves if they so choose) who originally claimed (I think off-the-cuff) that you had four words if you needed to coordinate 100,000 people (resulting in highly simplified strategies).
I updated my own estimate downwards (of the number of people you need to be coordinating before you face the four-word limit) after observing that EA only has somewhere on the order of a thousand people involved and important concepts often lose their nuance. (Although, to be fair, this is at least in part because there are multiple nuanced concepts that all need to be kept track of, each of which needs to get boiled down to a simple jargon term.)
Me, I think? I recall lamenting how the “game of telephone” implied by memetic dynamics reduces any nuanced message to about 4 or 5 words.
(In general & broadly: you’re welcome to name me as an “inspired by” conversation partner without asking. If you’re interested in paraphrasing my views, you can check the paraphrase with me.)
Nod. My motivation to write the post came in a brief spurt and I wanted to just get it out without subjecting it to a review process, so I erred on the side of wording it the way I endorsed and letting you take credit if you wanted.
(Btw, alternate titles for this post were “you have about 5 words”, “you only have 5 words”, and “you have less than seven words.”) :P
My intuition is that at least 1,000 EAs have a book-length amount of reading on EA in common. Here at the EA Hotel, we can often discuss things at a high level in groups, as there is a lot of common knowledge of EA concepts (and the same goes for EA Globals).
I’m quite neutral on this post. On the one hand, it is short and simple, with a clear concept that has some impact/relevance. On the other hand, I think the concept is somewhat inaccurate (and very imprecise): the reasoning mainly supposes that the size of your audience is inversely proportional to the size of your message. This makes some vague sense, but the numbers seem to fall apart when you think about them and extend the pattern a bit further: if coordinating thousands of people leaves you five words, what do I get if I want to coordinate 100,000 people? One word? You quickly run into a wall, since you can’t have fewer than one word.
The specific numbers actually aren’t important here (and to the author’s credit, they point out that the numbers are made up, even though those claims self-referentially produce the title). What does matter is what this crack in the reasoning starts to reveal: there’s a lot more that affects audience size, and its relationship to message size is more complicated. For example, millions and millions of people have read book series like Harry Potter, and it has influenced entire fandoms. (This specific example, as opposed to something like a political movement, may not be about detailed “coordination,” but regardless.) The point is that if people enjoy the message or the coordination, more people will be likely to read or engage with it. You can have a small message that coordinates nobody, and a large message that coordinates many people. Smaller messages might be more effective ceteris paribus up to a point, but messages that are too small may also be less effective at reaching large audiences.
Ultimately, I get the big idea—there is some relationship between content size and audience size—but I don’t think it does much to improve my model of the world, which already vaguely recognized this. Furthermore, unless someone completely didn’t even recognize that basic fuzzy relationship, this post seems like it could actually distort their model of the world. I think a similarly short post could have been less inaccurate while still sparking thought.
I wrote a fairly detailed self-review of this post on the LessWrong 2019 Review last year. Here are some highlights:
I’ve since changed the title to “You have about Five Words” on LessWrong. I just changed it here to keep it consistent.
I didn’t really argue for why “about 5”. My actual guess for the number of words you have is “between 2 and 7.” Concepts will, in the limit, end up getting compressed into a form that one person can easily/clumsily pass on to another person who’s only kinda paying attention or only reads the headline. It’ll hit some eventual limit, and I think that limit is determined by people’s working memory capacity (about 4-7 chunks).
If you don’t provide a deliberate way to compress the message down, it’ll get compressed for you by memetic selection, which might end up distorting your message.
I don’t actually have strong beliefs about when the 2-7-word limit kicks in. But I observed the EA movement running into problems where nuanced articles got condensed into slogans that EAs misinterpreted (e.g. “EA is Talent Constrained”), so I think it already applies at EA’s 2018 level of organization.
See the rest of the review for more nuanced details.