The Sense-Making Web
It is an unfortunate fact that blog posts are an extremely difficult medium for conveying the sense of being within a particular intellectual scene. Nonetheless, I feel that the Sensemaking scene, however vaguely defined, is becoming important enough that there ought to be at least some posts on the topic already.
This article was originally written for Less Wrong. Rationalists and sensemakers make for an interesting comparison, as both share the goal of trying to make sense of the world, so examining the two movements side by side produces a clearer picture of the strengths and limitations of each. In particular, I would suggest that the Sensemaking scene draws better on lessons from spirituality, has a stronger grasp of narrative, makes better use of recorded conversations, and places more weight on local community. Beyond this, I suspect it will make a valuable contribution towards ending the culture wars. In some ways, understanding the Sense-Making Web may be more pertinent to EAs than to rationalists, as EAs, like Sensemakers, aim to be a movement.
Given the difficulty of describing a scene, I feel it'd be worthwhile to pick out some words that capture my felt sense of being present in it. The first word I'll pick is "openness". People who are low in openness tend to react strongly to ideas that are incompatible with their worldview or which are too "weird", whilst people who are high in openness are much more likely to search for aspects of what is being said that resonate with them regardless. The sensemaking scene is highly open, both in that participants often converse with those who hold completely different political views from their own and in its embrace of spirituality and spirituality-adjacent practices. Much emphasis is put on being able to hold tension or discomfort, which I think is almost definitionally necessary for exploring new intellectual territory.
The second word I'll pick is "coherence" (see also: common knowledge). In some ways this is the opposite of the previous word, as openness tends to lead to divergence and coherence to convergence. However, coherence is an important component of the production of new ideas as well. Without any kind of shared beliefs, even at some kind of meta-level or about the purpose of the discussion, productive conversation becomes impossible.
In the sensemaking web, coherence is produced by a shared belief that each party possesses at least a partial truth and that the purpose of conversation is to create greater mutual understanding, rather than to fight each other like soldiers. Participants tend heavily towards the synthesis of various beliefs and even if people synthesise them in different ways, they still have something in common—namely that their beliefs are syntheses.
In addition, there are a number of common high-level beliefs, such as: that the world faces some kind of crisis, skepticism towards cancel culture, that meditation/mindfulness and other spiritual practices provide insight necessary for navigating this crisis, and that it is important to shift away from centralised authorities towards decentralised intelligence. Beyond this, there is a large set of shared references (the heavily involved figures would generally be familiar with the core beliefs of the main memetic tribes), which is kind of the price of entry to having a meta-aware discussion.
The last word I'll pick is "emergence". Emergence occurs in a group conversation when there is the right balance between openness and coherence. Coherence ensures that the participants can communicate (see: inferential distance), while openness allows the production of something new. When both occur together, we may observe emergence: the production of something new that no one could have produced by themselves. We can connect this to simulated annealing, where openness is what lets the group survive the heat of exploration and coherence is what cools it back down into a forward direction.
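To make the analogy concrete, here is a minimal sketch of simulated annealing itself. This is purely illustrative: the function, the toy objective and the parameter values are my own choices, not anything drawn from the sensemaking literature. The high-temperature phase corresponds to openness (even wild proposals get entertained), while cooling corresponds to coherence (the search gradually settles on a direction).

```python
import math
import random

def simulated_annealing(initial, neighbour, score, temp=10.0, cooling=0.95, steps=500):
    """Minimal simulated annealing: a high temperature lets the search wander freely
    (the 'openness' phase), while gradual cooling locks it into a direction ('coherence')."""
    current = best = initial
    for _ in range(steps):
        candidate = neighbour(current)
        delta = score(candidate) - score(current)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature drops.
        if delta > 0 or random.random() < math.exp(delta / temp):
            current = candidate
        if score(current) > score(best):
            best = current
        temp *= cooling  # cool down: less divergence, more convergence
    return best

# Toy usage: search for an x that maximises a bumpy one-dimensional function.
best_x = simulated_annealing(
    initial=0.0,
    neighbour=lambda x: x + random.uniform(-1.0, 1.0),
    score=lambda x: math.sin(3 * x) - 0.05 * x ** 2,
)
print(best_x)
```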
We can also relate this to Samo Burja's concept of Live and Dead Players. The number of live players, particularly in roles of influence, largely determines how alive a scene is, although this isn't sufficient for emergence, as the players may all just be pursuing their individual projects rather than channelling each other.
Both rationality and the sensemaking scene try not just to directly improve our understanding of the world, but to sharpen the lens itself (that is, the tools which we use to understand the world). Traditional rationality focuses heavily on explicit knowledge: logic, decision theory and cognitive biases. In contrast, sensemaking focuses much more on introspection, emotions and collective sensemaking. In this sense, it is very similar to post-rationality, although I expect most post-rationalists would have had some exposure to the more traditional rationality tools as well. (Though of course, in recent years meditation and circling have become part of mainstream rationality.)
Some of the main practices involved in sense-making include mindfulness (which provides an awareness of thoughts and emotions and how they relate), circling (which brings mindfulness into a social context), acceptance (which involves facing the truth without fear) and shadow-work (which involves engaging with our darker emotions and desires to find out if they have anything to tell us).
Developmental frameworks, such as Integral Theory, Post-Integral, Spiral Dynamics or Hanzi's Metamodernism, are also quite popular. Kegan Levels also count as a developmental framework, but they are much more popular among post-rationalists than among the sensemaking web. I suspect that the broad strokes are more important than the specific framework itself, but the general idea is that:
- Levels can be used to analyse both individuals and civilisations
- There are different levels, each with its own perspective on the world and its own insights
- Levels can't be skipped; you need to pass through each of the previous levels to reach the next
- Nonetheless, when we move from one level to the next, we may lose insights as well as gain them
- The higher levels are meta-levels where you gain the insights from the previous levels
- There are levels higher than the highest level anyone has achieved so far

There are certainly many critiques that could be made, such as the vagueness of these models or of their particular levels, or that they present an oversimplified model of history. At the same time, these frameworks seem to prove very effective in practice at pushing people towards considering more sides of an issue (see: Fake Frameworks).
Another key focus seems to be on localism. There seems to be a general belief that national-level politics is broken and that it is much better to focus instead on your local community, or even to create your own intentional community. This allows you to prefigure change which can later be copied or scaled up; some people seem to believe that society will simply collapse, but that strong local communities may be able to remain standing. I see this focus as valuable, as individuals in strong communities are likely to be much more impactful. The rationality/EA community was a bit more focused on this in the past, though it seems to have become de-emphasised; maybe after the pandemic this thread will be investigated more now that there are more jobs.
Complexity theory is another area that has received significant attention, but which has been mostly neglected by Less Wrong/Effective Altruism. I think it may be useful to refer to David Snowden's Cynefin framework, which identifies different types of situations one may face. There are obvious situations, where it's easy to identify the best course of action; chaotic situations, where the outcome is essentially random; complicated situations, which are hard to understand, but where if you understand the parts you understand the whole; and complex situations, where everything interacts with everything else in a way that prevents you from predicting the whole. These situations should be handled in different ways: in a complicated system, determining good practice relies much more on understanding, while in a complex system, good practice is much more evolutionarily emergent. The core insight rings true to me, even though there are legitimate concerns about how everyone seems to define complexity in their own way.
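For readers who want the framework's usual summary of how to act in each domain, here is a rough sketch. The encoding below is my own toy illustration of how Cynefin's domains are commonly characterised, not a formal artefact of Snowden's work.

```python
# Toy encoding of how Cynefin's four domains are usually summarised:
# each domain pairs a decision sequence with the kind of practice it supports.
CYNEFIN_DOMAINS = {
    "obvious":     ("sense -> categorise -> respond", "best practice"),
    "complicated": ("sense -> analyse -> respond", "good practice, via expert analysis"),
    "complex":     ("probe -> sense -> respond", "emergent practice, via safe-to-fail experiments"),
    "chaotic":     ("act -> sense -> respond", "novel practice: stabilise the situation first"),
}

def recommend(domain: str) -> str:
    """Return the usual one-line summary of how to act in a given Cynefin domain."""
    sequence, practice = CYNEFIN_DOMAINS[domain.lower()]
    return f"{sequence} ({practice})"

print(recommend("complex"))
# probe -> sense -> respond (emergent practice, via safe-to-fail experiments)
```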
Another difference is that the sensemaking web has a much greater focus on podcasts and YouTube discussion videos than rationality/EA. As McLuhan says, the medium is the message, and these formats tend to enable a deeper exchange of views by involving fewer participants and allowing more back-and-forth than tends to occur in comment threads. I suspect the rationality community would be enhanced if there were more of this kind of public debate.
One topic I haven’t addressed is narrative. I think the biggest difference is likely just a difference of focus. Less Wrong/EA focuses on Steelmanning views and understanding how a view could be reasonable. In contrast, the sensemaking community focuses much more on internal narratives and tries to understand what makes a story feel compelling.
I suspect that the emergence of the sensemaking web has largely been enabled by these formats, because content producers have a natural incentive to appear on each other's shows in order to broaden their audiences, and there is an incentive to be nice if you want to be invited back or want guests to want to come on your podcast. Further, it's generally a safe bet to book a guest who has been well-received on other shows in the same niche, so people who are seen as insightful often get the opportunity to make their core ideas common knowledge. Beyond this, the lack of time pressure means that there is more opportunity to explore ideas in depth, and the low barrier to entry means that someone can start a podcast without being dependent on advertisers if they don't want to be subject to external interference. The confluence of all these factors, in the presence of a dysfunctional societal conversation, has created a natural attractor for those who think we can do better. (I could also link this to the Intellectual Dark Web, which has never really lived up to its initial promise.)
Given the increasing importance which I expect this network to have in the future, I think it is particularly important for people in the Less Wrong and Effective Altruism communities to engage with it, to ensure that our ideas are represented as part of the mix.
I’ll make a post containing a bunch of links later, but for now I’ll just provide you with a few links for further reading:
Thanks to Jarred Filmer who provided feedback and even wrote a few small fragments of this!
Thanks for bringing this up; I'd never heard of the Sensemaking scene. I enjoyed the "War on Sensemaking" talk by Daniel Schmachtenberger that you linked at the end. I liked the idea that we should treat communicating biased or wrong information like pollution.
This is an interesting topic, and I’m glad you shared information about it on the Forum!
However, I found some elements of this post a bit confusing. In particular:
I kept expecting to see examples of people who are part of the “sensemaking” movement. What are the podcasts and YouTube channels you reference? The links at the end seem like they only reveal a tiny fraction of this community. (I tried Googling around a bit, but the term “sensemaking” didn’t lead me to the sorts of things you were talking about.)
“I suspect it’ll make a valuable contribution towards ending the culture wars.”
I’d have liked to hear more about why you suspect this. Are there good examples of the movement bringing together people from different, “warring” cultures and making them both more accepting of the other’s views? Does the movement have a lot of diversity across the political spectrum? Are any particular views associated with the movement starting to become popular in elite circles?
Yeah, hopefully at some point I'll find time to make another post linking to various aspects of what I'd define as the community. I guess who is in and who isn't is not well-defined, as it's not really a single community. Rather, it's a bunch of groups with similar kinds of people who seem to be talking to each other about similar kinds of things, most of whom I think would agree that they're doing something like sensemaking.
Regarding your second question, if you head over to the Stoa or listen to Both/And, you’ll see people from across the spectrum, although not really many strong social justice proponents. I suppose my suspicion is mainly driven by the intuition that ending the culture wars requires a movement with positive content of its own and not merely a negative critique as Quillette and (to a lesser degree) Persuasion seem to do. People need a reason to join apart from simply being sick of the culture wars.