Rationality and Erisology

Board and Seize

Marine Recon
Verified SOF
Jan 29, 2013
Long post; link dump! Numerous rabbit holes that may induce cognitive dissonance.

Side effects may include, but are not limited to: a systemic dismantling of cherished beliefs, years 'lost' to mastering a massive body of writing, acquiring skill in Bayesian Judo, developing a habit of making quantified predictions and backing them up with bets, and learning to disagree in a positive and civil manner that intellectually enriches. Also, death.

I. Preamble:
So I've been wanting to do a post like this for a while, and was finally spurred to do it by the past few weeks' excellent series of threads, What's Wrong With the Left / Right and Religion, ethics, and morality.

One of the things I most enjoy about the SS forums is the consistently high quality of discussion: the ever-amazing (though oh, so slowly released) case studies, mostly-civil debates on the culture wars*, expert dissection of technique and tactics, leadership analysis, and so much more. This has got to be one of the most intellectually rich and meaningfully diverse military groups ever assembled - online or off.

That said, we fall prey to all the same traps as any/everyone else (though I'd say at a lesser rate than the general population): cognitive bias, tribe-membership/virtue signalling, talking past each other, appeals to emotion, and so on. This post is my attempt to share some of what I've learned over the past year and a half of a deep dive into the burgeoning community/movement/field of rationalism, and hopefully further elevate our level of debate and discussion.

*Culture war:
“Culture war” is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people change their minds regardless of the quality of opposing arguments.
-stolen from here

II. Definitions & Background:

Rationality - for the purposes of this post, I am not using the everyday definition of rationality. Nor am I talking about classical philosophical Rationalism, though there is certainly some overlap there. The rationalist movement that I wish to discuss is contemporary and traces its history to the blog Overcoming Bias by way of LessWrong. This movement has a strong geographical locus in the Bay Area, but has folks around the world.

(Warning: reductive account ahead) So it pretty much starts out when this guy, Eliezer Yudkowsky, starts blogging about ways to reduce cognitive bias in order to improve his economic performance (trying to make better investments and such). Pretty soon, he's developing a cult following (with a heavy CompSci/Silicon Valley representation) and these people are working very hard to develop a martial art of rationality: a practice of training the mind to overcome its many flawed heuristics and the other ways in which our brains generally fail to generate logically sound trains of thought.

Eventually, the LessWrong community kind of implodes and there is an intellectual diaspora: they go out and found other blogs, companies, and institutions. You may have encountered some of these, such as:
Some of the fields/movements that were either directly spawned by or closely associated with the rationalist movement include:
Some noted bloggers/academics you may have encountered who are either involved with or adjacent to the movement:
Okay, that's already enough linkage to jump-start a multi-year journey of discovery and mind-broadening. I share all of this to give a sense of the intellectual heft this movement has, and the seriousness with which it tackles the project of improving human rationality. I'd also wrap up by pointing to what is currently my favorite blog which has, no kidding, probably the best comments section on the internet (in which a number of the individuals listed above regularly participate): Slate Star Codex (SSC).

Seriously, go check it out. I cannot do it justice. A small sampling:

As a segue to my next definition, it was SSC that led me to John Nerst and the new field he is creating:

Erisology - The word "erisology" comes from the name "Eris", the Greek goddess of discord. It refers to a made-up field of study that deals with disagreement, where it comes from and how it works. Fundamental to erisology is the idea that humans use different internal models of reality. When they communicate it's not only people who interact but also different worldviews that bump up against each other, and they may mix well or badly and with or without our awareness. The result is often misunderstanding and/or anger, especially on the internet where social context and cues are less helpful. (The preceding definition was cribbed from /r/erisology)

Here I'll link out to Nerst's blog, Everything Studies where he explains Erisology in his own words. His blog, though more recent and less populated than SSC, is fantastic and worth spending time in.

Nerst, his blog, and Erisology are yet another offshoot, result, or instantiation of the rationalist project, which is partly why I spilled so many words to set it up.

III. Discussion, so what, tl;dr:
Here on SS, we frequently have threads that could be classified as belonging to the ongoing Culture Wars. As we have a group of people across the political spectrum who are generally very thoughtful, these tend to be some of the most interesting and active threads on the site. They also tend to bring out our worst examples of what Nerst would call dysfunctional disagreement.

We have instances of people talking past each other, refusing to agree upon common (even just for the sake of argument) definitions, getting frustrated and moving into hostility, flamebaiting, and all the rest (though again, I'd say at lower levels than your average, representative internet forum). We can do better.

I'd like to challenge the SS community (including myself - I've dropped out of active participation and been lurking for far too long) to improve the quality of our disagreements and argumentation. Especially when it comes to the kind of topics that tend to get people fired up - those we hold dear to our hearts or that inform our identities.

To effect this improvement, it will be helpful to develop a standard terminology and understanding of the common pitfalls. Here's a semi-ordered reading list that should help achieve that:

Erisology 101 - Introduction
  1. People Are Different - No shit, you say, but it's far more true than most of us really realize.
  2. What Universal Human Experiences Are You Missing Without Realizing It? - Yup. People are really, really different.
  3. Varieties of Argumentative Experience - The SSC post that led me to Nerst and Erisology, and an excellent intro/jumping-off-point for getting the basic concepts down.
Erisology 201 - Intermediate
  1. Partial Derivatives and Partial Narratives - Breaking down argumentative false dichotomies.
  2. I Can Tolerate Anything But the Outgroup - Possibly my favorite post on SSC, this describes Red/Blue/Grey Tribe theory.
  3. Five Case Studies on Politicization - Why everything always becomes a Giant Referendum on Everything, and why we should avoid that.
  4. The Signal and the Corrective - How to convince or alienate people.
  5. Superweapon Proliferation Worries - What are argumentative superweapons, and why to avoid them.
Erisology 301 - Applied
  1. Wordy Weapons of Is-Ought Alloy - How we (and others) speak is shaped by how we (and others) model the world.
  2. Beliefs as Endorsements - Expanding on the Is-Ought difficulties of argumentation.
  3. All in All, Another Brick in the Motte - A description of the pernicious motte-and-bailey argumentative failure mode.
  4. A Deep Dive Into the Harris-Klein Controversy - The erisological dissection of a semi-public dispute between well-known public intellectuals.
It's taken me about two years to get to where I currently am as an Aspiring Rationalist. I don't expect anyone to read through all the links or even the 'syllabus' above overnight or even over-week. But work your way through - maybe one a week. If even just a few of the core/regular SS members did so, I think we'd see a marked elevation in our contentious discussions (from an already above average starting point).

IV. Postscript
For those who found some of the shared articles, and the style of thought they describe, interesting, I'll point you towards just a few more things I've been working through.

My first for-real intro to this kind of rationalist thought was Yudkowsky's Rationality: From AI to Zombies. This is not for the timid reader. I read pretty quickly, and this took me several months to get through. It's a collection of the core sequence of his blog posts on rationality from Overcoming Bias and LessWrong. This will challenge you, but give you an excellent education in rationality. Join me in the Bayesian Conspiracy, and read.

A more Erisological work from Yudkowsky is his recent book Inadequate Equilibria. This is a much shorter read than the work above. It is closely related to the SSC article Meditations on Moloch (one of the three samples from SSC in the bulleted list above).

A final plug for Yudkowsky: Harry Potter and the Methods of Rationality. This takes the conceptual content from AI to Zombies and presents it as Harry Potter fanfic. This is one of the most popular pieces of fan fiction there is. It is well written, and takes seriously the job of teaching rationality in a more accessible manner.

For those who like their science fiction sciency and crunchy (and firmly in the rationalist camp), I have three recommendations:
  • Greg Egan - He's a computer scientist who writes mind-bending hard scifi. Posthumans, AIs, and more. I can recommend any of his novels except the Orthogonal Series (haven't read it yet). I haven't read his short stories yet.
  • Peter Watts - He's a marine biologist who writes like Egan but from a biological perspective rather than a computery one. As with Egan, I can recommend any of his novels (still reading his most recent, but already a recommend), but haven't read the short stories. I discovered Watts after searching for more stuff like Egan.
  • Hannu Rajaniemi - He's a mathematician and physicist. Disclaimer: I haven't read him yet, but I discovered him by searching for more like Egan and Watts. I've found several reviews that actually make the comparison to them. He's next on my list to read.
V. Thanks
I know this post is ridiculously long. And has even ridiculously more links. For any who hung with me through to the end, thanks for reading. This is my first 'public' attempt to share any of this stuff other than with one person with whom I can debate and such. I hope that if you follow some of these threads down the various rabbit holes they open up, someone will join me on the path of aspiring to rationality.

edit: added Culture War definition, added link
Last edited:
Make your choice.

I dropped the mic at work today, so tomorrow may be a good time to read his post and ignore my Inbox.

"If I tell you I'm gonna' burn this building down to start a lemonade stand, I've already bought the lemons" effectively ended that telecon.
Excellent post. I understood the small words. I will put all my housework on hold, my 'work' work is dead to me, and I have disavowed and disowned my family. I should have the time. On the other hand....I need to eat, I have bills to pay, I have a yard to mow, and I need sex, so I will back-burner this for a day or two.

But get to it I shall.
The ShadowSpear Erisology Study Guide
*This post is going to be a living document that I'll continue to flesh out as I find time - I'm just getting it going now, and it will be a skeleton to start.
Understanding that I dumped an insane amount of reading via links without too much elaboration in the OP, I'm building this post in an attempt to distill and summarize. I'm going to pull out some of the key conceptual tools from the literature and describe them in bite-sized chunks. Ideally, this will make these tools more accessible and readily available so we can start putting them to use.

Common reference acronyms:
SEoP = Stanford Encyclopedia of Philosophy
Wiki = Wikipedia

  • Skepticism, the Correspondence Theory of Truth, and Relativism
    Skeptical disclaimer:
    Philosophically, I'm a hard epistemic skeptic. I believe that there is somewhere between very little and nothing that we can truly know. Whatever the the external world is actually like, we experience it from a distance, with several levels of mediation. If nothing else, there is a massive amount of sub-/pre-conscious filtering and processing of sensory input that happens before it ever hits our conscious perception. A bunch of stuff gets dumped, and our brains fill in the rest.

    You've probably heard of, or even seen, the famous video where you're supposed to look at ball passes and most people don't notice something else happening right in the the middle that's totally ridiculous (I don't want to spoil it if you haven't). They have a couple others. Take the ~5 min to check out the "Try It Yourself" videos from the Simmons Lab, and be amazed by how much you don't notice. This happens not only with sight but all of our senses. How aware are you right now of the feel of all your clothing on your skin?

    There's a number of great 'illusions' that demonstrate how our brains autocorrect or fill in the blanks. We all have a natural blind spot, where (reductive account warning) the the optic nerve passes through the retina, and there are no rods or cones. But we don't perceive holes in our vision on a regular basis. This quick demonstration really hammers that home. It only takes 30 seconds.

    For these, and a number of other reasons, I doubt my sensory experience as being authoritative or accurately presenting the external world. And yet, I live in the the world. I need to get by. So I don't allow myself to be hobbled by doubt. I will talk about knowing things as if we actually could. Just like I talk about color, as if it were a real thing that existed outside of my head (it doesn't). For the rest of this thread, if you want to get philosophical or pedantic with me, read my writing of "know", "knowledge", etc. as "believing with maximal certainty" or "the best evidence we think we have seems to very strongly support", etc.

    Did you notice the double "the" in each of the four preceding paragraphs?

    Correspondence Theory of Truth:
    Basically: A claim, belief, statement (etc.) is TRUE just in case it actually corresponds with the real world.

    I tend towards accepting this. For the sake of this rationality and Erisology thread, let's assume it is correct. So, truth isn't a matter of postmodern-style social constructions, or personal belief, or anything else. If the state of the world is accurately reflected by some claim (etc.) then it is TRUE. Else, FALSE.

    Accepting the above pretty much rules out most forms of epistemological (relating to knowledge - not talking about morality here) relativism. There is no room for 'true for you, but not for me'. Now, if someone wants to get into a debate over relativism, I'd be happy to oblige, but that would warrant its own thread.

    Additional reading:
    Filling in the gaps with hearing
    The original autocorrector
    SEoP Skepticism
    SEoP Correspondence Theory of Truth
    SEoP Relativism

  • The Map is Not the Territory. Also, Categorical Perception
    Building on the above, we don't directly inhabit the real world. We experience a model of the world that is the result of all the various data processing our brains conduct below/before the conscious level (check out the Sleight of Mind section here for a ton of references and elaboration; skip the vampire bit...).

    The relationship between the models of the world we construct and inhabit and the real world is analogous to the relationship between a map and the territory it (allegedly) depicts. Old school land nav will teach you real quick that the map does not equal the territory.

    There's an inherent and necessary information loss to begin with. The only perfect map of the territory (that perfectly depicts every detail to the nth degree) is the territory itself. Sure we have contour lines, but hills aren't usually terraced in elevation step changes. That's a feature of maps, not a bug. You can't carry Camp Talega around in your pocket, but that paper map was small and mostly prevented me from getting lost/off course.

    When you know someone, you have an idea of how they will act or react to things. You have a model of that person, and the better you know them, the more accurate your model. But your brain isn't running a copy of that person in addition to yourself. You can still be surprised. The model isn't complete.

    What we want to avoid is confusing the map for the territory. Our beliefs are the map. The world is the territory. The better our map, the more effective we can be. The more we can steer reality. With the Correspondence Theory of Truth, that means making our beliefs reflect the world as it is (not as we wish it to be), as closely as possible.
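    The contour-line point can be sketched in a few lines of code. This is purely a toy illustration (the elevations, the contour interval, and the `to_contour` helper are all made up for the example), showing how mapping the territory necessarily throws information away:

```python
# Toy sketch of map-vs-territory information loss: quantizing
# continuous elevations (the territory) into contour bands (the map).
# All numbers here are invented for illustration.

CONTOUR_INTERVAL = 20  # meters between contour lines on our toy map

def to_contour(elevation_m: float) -> int:
    """Round an elevation down to its contour band, discarding detail."""
    return int(elevation_m // CONTOUR_INTERVAL) * CONTOUR_INTERVAL

territory = [103.4, 111.9, 118.2, 121.7, 139.5]  # 'actual' hillside
map_view = [to_contour(e) for e in territory]

print(map_view)  # [100, 100, 100, 120, 120]
```

    Five distinct 'real' elevations collapse into two contour bands, and nothing on the map lets you recover the lost detail - which is exactly the trade the paper map makes so it can fit in your pocket.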

    Categorical Perception:
    Categories exist on the map. Borders between categories are often fuzzy and ultimately arbitrary. Much of what we argue about in dysfunctional disagreement is the proper definition of these categories. Other times, the dysfunction comes when the arguers draw the categories differently but use the same label.

    Read the article linked to in the section header for a full breakdown of the concept, but here's the rough and ready: accurate modeling of details is computationally and energy intensive. If you've ever played a video game where the trees are these 2D cutouts that rotate to face you, then you've got an idea of what this is. Our brains, like the computer game, save on resources by slotting in a simplified and abstract version of the actual thing. In your peripheral vision, you don't typically notice all the little details of say, another person. Your brain just dumps in a cardboard cutout labeled "person's name" until you focus on them. Then the details get populated in your awareness.

    We perceive these categories or cardboard cutouts rather than the raw data. The phrase "categorical perception" comes from linguistics, and is a phenomenon where we hear letters rather than raw sounds. If you were to map out 'soundspace' in a 2D field, you could draw boundaries around letter sounds (phonemes) from a particular language. If you did the same for another language, there would be disagreements between them. Thus some famous instances of difficulties with letters from second languages, such as distinguishing "v"/"w" in English for native speakers of Slavic tongues, or "r"/"l" for Japanese speakers. In English, we have an "r" category and an "l" category. They sound very distinct to us because we are attuned to that categorization. But other languages have other categorizations.

    It gets better. Sounds that are close to each other but across letter-category dividing lines sound very different. Other sounds, much farther apart in 2D soundspace, but within the same letter-category are typically indistinguishable! So we letterally can't hear the difference between different sounds when they map to the same letter-category, but we can hear much smaller differences just as long as they sit on opposite sides of a letter-category boundary.

    We don't perceive the sound itself. We perceive the category.

    Again, categories exist on the map - in our heads - not in the territory/world. If you and I have different conceptual categories without realizing it, we will use the same words without ever talking about the same things. We might as well be speaking mutually unintelligible languages.
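    The letter-category effect can be sketched the same way. Another toy model (the boundary value and the 1D 'soundspace' coordinates are invented for illustration, simplifying the 2D field above) in which perception reports only the category, never the raw sound:

```python
# Toy sketch of categorical perception: a 1D "soundspace" with a hard
# category boundary. Nearby sounds across the boundary get different
# percepts; farther-apart sounds within a category get the same one.
# The boundary and coordinates are made up for illustration.

BOUNDARY = 0.5  # hypothetical /r/ vs /l/ dividing line

def perceive(sound: float) -> str:
    """We report the category, not the raw acoustic value."""
    return "r" if sound < BOUNDARY else "l"

# Two sounds straddling the boundary, only 0.02 apart: different percepts.
print(perceive(0.49), perceive(0.51))  # r l

# Two sounds 0.30 apart but on the same side: identical percepts.
print(perceive(0.10), perceive(0.40))  # r r
```

    The raw distance between sounds is irrelevant; all that reaches awareness is which side of the category line they fall on - which is why two arguers with differently drawn categories can use the same word and never hear the difference.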

    Additional reading:
    Map & Territory

  • Charity and Steelmanning
    In the practice of philosophy, there's an excellent piece of etiquette known as the Principle of Charity. The basic idea, which is very similar to, if not identical with, the concept of Steelmanning, is to interpret your opponent's/interlocutor's argument in the most charitable manner. We're all familiar with the Strawman fallacy - misrepresenting an argument in a weaker form. This is its opposite.

    When Steelmanning, you argue against the strongest possible version of your opponent's/interlocutor's argument. This not only prevents the sort of (often pedantic) sniping that misses their point, it challenges you. If you can handle the strongest form of an argument/position, then the typical naive version should be no problem. If you can't, perhaps you should go back to the belief drawing board.

  • Ingroup / Outgroup
    to be fleshed out

  • Inside view / Outside view
    to be fleshed out

  • Object-level / Meta-level
    to be fleshed out

  • Tribes/teams and Signalling/cheering
    to be fleshed out

  • Relevant cognitive biases
    to be fleshed out

  • Common pitfalls
    to be fleshed out

  • more to come
In the middle of a move so I'll try and read through it all as I go, until I have a point where I can dedicate an entire day to dive all the way in. Going to lurk in the mean time and looking forward to what comes of this thread.
Generally speaking, I'm a knuckledragging slack-jawed yokel with a vocabulary slightly larger than that of your average 2nd grader. Nothing reinforces that more than seeing big unfamiliar words in thread titles.

So, imagine how thrilled I was after reading that "Erisology" is not a real word -- or even a common one -- but rather a recently made up term for a made up field of study!

Whew! Embrace the little victories.
Mind blown.

Is this the value of our existence
Should we proclaim with such persistence
Our destiny relies on conscience
Red or blue, what's the difference
Stand or fall, state your peace tonight

Sang some guy in the eighties.
I went through 6-8 of those links and may do some more later. My off-the-cuff observations:

- Bias and heuristics: I think many of us don't consciously recognize the terms or mechanisms, but can admit they are there. I think this is a case of not knowing what we don't know. I don't know if I totally agree with the textbook notion of bias because "perception is reality." What one can call a bias or "wrong" could be perfectly right and justifiable to another. My perception is reality and YOUR perception is bias; it is all relative.
- I think some of the above does a great, though nerdy, job in explaining why we disagree. While it can rationalize our thoughts, our thoughts are ours. I hate to use the tautologies here, but "mine is mine" you know?
- The articles reminded me why I don't go down these rabbit holes and love what I can conceptualize, concrete matters for me to process. Theory is great, but beyond me. Talk about bias, when a person bogs down on the subject of being rational, they will move on. Now they are part of the problem because they can't/ won't grasp some esoteric* theory? I have to reject that notion.

* - I used a big word to prove I be smart.
Philosophy (erisology/rationality/whatever) tends to be a self licking ice cream cone.

You end up spending lots of time talking about things like ‘How SHOULD we discuss this issue? Is it right to discuss the issue like this?’ And you end up spending no time just having a conversation about the issue because there’s no good enough answer. It’s a masturbatory endeavor.
^ And some "extra" love for using one of my favorite terms: the self licking ice cream cone! :thumbsup:
Talk about bias, when a person bogs down on the subject of being rational, they will move on. Now they are part of the problem because they can't/ won't grasp some esoteric* theory? I have to reject that notion.

Love the subject matter but my own interpretations tend to include Physics (or is it metaphysics?).
I tend to experience things in a vacuum. The things I like tend to be considered "fine." Even without knowing it.
Example: My first taste of caviar was at 16. It was the good stuff. Didn't know squat except that I loved it. So this article (what-r-u-missing) isn't so much educational as "Dismissed on the grounds of those who are too aware of the world are not experiencing life as the Buddha."

Love the subject matter but my own interpretations tend to include Physics (or is it metaphysics?).
I tend to experience things in a vacuum. The things I like tend to be considered "fine." Even without knowing it.
Example: My first taste of caviar was at 16. It was the good stuff. Didn't know squat except that I loved it. So this article (what-r-u-missing) isn't so much educational as "Dismissed on the grounds of those who are too aware of the world are not experiencing life as the Buddha."

You're bulverizing as usual. 🤓😜