Warning:
Long post; link dump! Numerous rabbit holes that may induce cognitive dissonance.
Side effects may include, but are not limited to: a systematic dismantling of cherished beliefs, years 'lost' to mastering a massive body of writing, acquiring skill in Bayesian Judo, developing a habit of making quantified predictions and backing them up with bets, and learning to disagree in a positive and civil manner that intellectually enriches. Also, death.
I. Preamble:
So I've been wanting to do a post like this for a while, and was finally spurred to do it by the past few weeks' excellent series of threads, What's Wrong With the Left / Right and Religion, ethics, and morality.
One of the things I most enjoy about the SS forums is the consistently high quality of discussion: the ever-amazing (though oh, so slowly released) case studies, mostly-civil debates on the culture wars*, expert dissection of technique and tactics, leadership analysis, and so much more. This has got to be one of the most intellectually rich and meaningfully diverse military groups ever assembled - online or off.
That said, we fall prey to all the same traps as everyone else (though I'd say at a lesser rate than the general population): cognitive bias, tribe-membership/virtue signalling, talking past each other, appeals to emotion, and so on. This post is my attempt to share some of what I've learned over the past year and a half of a deep dive into the burgeoning community/movement/field of rationalism, and hopefully to further elevate our level of debate and discussion.
*Culture war:
“Culture war” is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people change their minds regardless of the quality of opposing arguments.
-stolen from here
II. Definitions & Background:
Rationality - for the purposes of this post, I am not using the everyday definition of rationality. Nor am I talking about classical philosophical Rationalism, though there is certainly some overlap there. The rationalist movement that I wish to discuss is contemporary and traces its history to the blog Overcoming Bias by way of LessWrong. This movement has a strong geographical locus in the Bay Area, but has folks around the world.
(Warning: reductive account ahead) It pretty much starts when this guy, Eliezer Yudkowsky, starts blogging about ways to reduce cognitive bias in order to improve his economic performance (trying to make better investments and such). Pretty soon he's developing a cult following (with heavy CompSci/Silicon Valley representation), and these people are working very hard to develop a martial art of rationality: a practice of training the mind to overcome its many flawed heuristics and the other ways our brains fail to generate logically sound trains of thought.
Eventually, the LessWrong community kind of implodes and there is an intellectual diaspora: members go out and found other blogs, companies, and institutions. You may have encountered some of these, such as:
- Center for Applied Rationality
- Machine Intelligence Research Institute
- Future of Life Institute
- 80,000 Hours
- Open Philanthropy Project
- GiveWell
- MealSquares
- Beeminder
Some noted bloggers/academics you may have encountered who are either involved with or adjacent to the movement:
- Wait But Why?
- CGP Grey
- Bryan Caplan (author of The Case Against Education)
- Max Tegmark (author of Life 3.0: Being Human in the Age of Artificial Intelligence)
- Nick Bostrom (author of Superintelligence: Paths, Dangers, Strategies)
- Robin Hanson (author of The Elephant in the Brain: Hidden Motives in Everyday Life)
- Philip Tetlock (author of Superforecasting: The Art and Science of Prediction)
- David Friedman (son of Milton, author of The Machinery of Freedom: Guide to a Radical Capitalism)
Seriously, go check out Slate Star Codex (SSC). I cannot do it justice. A small sampling:
- The Craft and the Codex (on the craft of rationality)
- Meditations on Moloch (why everything sucks, or entropy's a bitch)
- Reactionary Philosophy in an Enormous, Planet-Sized Nutshell (he's not NRx, but tries to accurately portray their position)
As a segue to my next definition, it was SSC that led me to John Nerst and the new field he is creating:
Erisology - The word "erisology" comes from the name "Eris", the Greek goddess of discord. It refers to a made-up field of study that deals with disagreement: where it comes from and how it works. Fundamental to erisology is the idea that humans use different internal models of reality. When they communicate, it's not only people who interact but also different worldviews that bump up against each other, and they may mix well or badly, with or without our awareness. The result is often misunderstanding and/or anger, especially on the internet, where social context and cues are less helpful. (The preceding definition was cribbed from /r/erisology)
Here I'll link out to Nerst's blog, Everything Studies where he explains Erisology in his own words. His blog, though more recent and less populated than SSC, is fantastic and worth spending time in.
Nerst, his blog, and Erisology are yet another offshoot, result, or instantiation of the rationalist project, which is partly why I spilled so many words to set it up.
III. Discussion, so what, tl;dr:
Here on SS, we frequently have threads that could be classified as belonging to the ongoing Culture Wars. As we have a group of people across the political spectrum who are generally very thoughtful, these tend to be some of the most interesting and active threads on the site. They also tend to bring out our worst examples of what Nerst would call dysfunctional disagreement.
We have instances of people talking past each other, refusing to agree upon common (even just for the sake of argument) definitions, getting frustrated and moving into hostility, flamebaiting, and all the rest (though again, I'd say at lower levels than your average, representative internet forum). We can do better.
I'd like to challenge the SS community (including myself - I've dropped out of active participation and been lurking for far too long) to improve the quality of our disagreements and argumentation. Especially when it comes to the kind of topics that tend to get people fired up - those we hold dear to our hearts or that inform our identities.
To effect this improvement, it will be helpful to develop a standard terminology and understanding of the common pitfalls. Here's a semi-ordered reading list that should help achieve that:
Erisology 101 - Introduction
- People Are Different - No shit, you say, but it's far more true than most of us really realize.
- What Universal Human Experiences Are You Missing Without Realizing It? - Yup. People are really, really different.
- Varieties of Argumentative Experience - The SSC post that led me to Nerst and Erisology, and an excellent intro/jumping-off point for getting the basic concepts down.
- Partial Derivatives and Partial Narratives - Breaking down argumentative false dichotomies.
- I Can Tolerate Anything But the Outgroup - Possibly my favorite post on SSC, this describes Red/Blue/Grey Tribe theory.
- Five Case Studies on Politicization - Why everything always becomes a Giant Referendum on Everything, and why we should avoid that.
- The Signal and the Corrective - How to convince or alienate people.
- Superweapon Proliferation Worries - What are argumentative superweapons, and why to avoid them.
- Wordy Weapons of Is-Ought Alloy - How we (and others) speak is shaped by how we (and others) model the world.
- Beliefs as Endorsements - Expanding on the Is-Ought difficulties of argumentation.
- All in All, Another Brick in the Motte - A description of the pernicious motte-and-bailey argumentative failure mode.
- A Deep Dive Into the Harris-Klein Controversy - The erisological dissection of a semi-public dispute between well-known public intellectuals.
IV. Postscript
For those who find the shared articles, and the style of thought they describe, interesting, I'll point you towards just a few more things I've been working through.
My first for-real intro to this kind of rationalist thought was Yudkowsky's Rationality: From AI to Zombies. This is not for the timid reader. I read pretty quickly, and it still took me several months to get through. It's a collection of the core sequence of his blog posts on rationality from Overcoming Bias and LessWrong. This will challenge you, but give you an excellent education in rationality. Join me in the Bayesian Conspiracy, and read.
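If you're wondering what the "Bayesian" part actually means in practice: at its core it's just updating your confidence in a claim using Bayes' rule whenever new evidence arrives. Here's a minimal sketch; the function is my own illustration and the numbers are invented, not taken from any of the works above.

```python
# A minimal sketch of a Bayesian update - the habit of mind behind
# "quantified predictions" above. All numbers are made up for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Example: you start out 30% confident in a claim, then see evidence
# that is twice as likely if the claim is true (80%) as if it is false (40%).
posterior = bayes_update(prior=0.30, p_evidence_if_true=0.80,
                         p_evidence_if_false=0.40)
print(round(posterior, 3))  # 0.24 / (0.24 + 0.28) ≈ 0.462
```

The point isn't the arithmetic; it's the discipline of asking "how much more likely is this evidence if I'm right than if I'm wrong?" before letting it move your confidence.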
A more Erisological work from Yudkowsky is his recent book Inadequate Equilibria. This is a much shorter read than the work above. It is closely related to the SSC article Meditations on Moloch (one of the three samples from SSC in the bulleted list above).
A final plug for Yudkowsky: Harry Potter and the Methods of Rationality. This takes the conceptual content of AI to Zombies and presents it as Harry Potter fanfic. It is one of the most popular pieces of fan fiction there is. It is well written, and it takes seriously the job of teaching rationality in a more accessible manner.
For those who like their science fiction sciency and crunchy, (and firmly in the rationalist camp) I have three recommendations:
- Greg Egan - He's a computer scientist who writes mind-bending hard scifi. Posthumans, AIs, and more. I can recommend any of his novels except the Orthogonal Series (haven't read it yet). I haven't read his short stories yet.
- Peter Watts - He's a marine biologist who writes like Egan but from a biological perspective rather than a computery one. As with Egan, I can recommend any of his novels (still reading his most recent, but already a recommend), but haven't read the short stories. I discovered Watts after searching for more stuff like Egan.
- Hannu Rajaniemi - He's a mathematician and physicist. Disclaimer: I haven't read him yet, but I discovered him by searching for more like Egan and Watts, and I've found several reviews that actually make the comparison. He's next on my list to read.
I know this post is ridiculously long, and has even more links. For any who hung with me through to the end, thanks for reading. This is my first 'public' attempt to share any of this stuff beyond debates with a single friend. I hope that if you follow some of these threads down the various rabbit holes they open up, someone will join me on the path of aspiring to rationality.
edit: added Culture War definition, added link