Desire is the direction, rationality is the magnitude
Rationality: From AI to Zombies is an e-book compiled from about two years' worth of writing by Eliezer Yudkowsky, founder of the research institute I work at. It's a pretty good introductory text for aspiring rationalists, with a few caveats. First, it's rather lengthy, clocking in at around 1800 pages. (It comes in six parts; treat it like six books: start with the first one and see how you like it.) Second, as Eliezer says in the foreword, the content used to generate this book is a little dated and far from perfect. Third, Eliezer assumes the reader already possesses a fair bit of background knowledge.
The next few blog posts will be a series of short remarks in which I try to make some of those background assumptions explicit (and say some things I wish I'd been told a long time ago). After that, I'll go back to writing my planned series on removing guilt & shame motivation.
A brief note on "rationality."
It's a common trope that thinking can be divided into "hot, emotional thinking" and "cold, rational thinking" (with Kirk and Spock as the stereotypical offenders, respectively). The trope says that hot decisions are often stupid (and inconsiderate of consequences), while cold decisions are often smart (but made by the sort of disconnected nerd who wears a lab coat and builds wacky technology). Of course (the trope goes), there are Deep Human Truths available to the hot reasoners that the cold reasoners know not.
Many people, upon encountering someone who says they study the art of human rationality, jump to the conclusion that these "rationalists" reject hot reasoning entirely, attempting to disconnect themselves from their emotions once and for all in order to avoid its rash mistakes. Many think these aspiring rationalists are attempting some sort of dark ritual to sacrifice emotion, failing to notice that the emotions they wish to sacrifice are the very things that give them their humanity. "Love is hot and rash and irrational," they say, "but you sure wouldn't want to sacrifice it." Understandably, many people find the prospect of "becoming more rational" rather uncomfortable.
So heads up: this sort of emotional sacrifice has little to do with the word "rationality" as it is used in Rationality: From AI to Zombies.
When Rationality: From AI to Zombies talks about "rationality," it's not talking about the "cold" part of hot-versus-cold reasoning; it's talking about the reasoning part.
One way or another, we humans are reasoning creatures. Sometimes, when time pressure is bearing down on us, we make quick decisions and follow our split-second intuitions. Sometimes, when the stakes are incredibly high and we have time available, we deploy the machinery of logic, in places where we trust it more than our impulses. But in both cases, we are reasoning. Whether our reasoning be hot or cold or otherwise, there are better and worse ways to reason.
(And, trust me, brains have found a whole lot of the bad ones. What do you expect, when you run programs that screwed themselves into existence on computers made of meat?)
The rationality of Rationality: From AI to Zombies isn't about using cold logic to choose what to care about. Reasoning well has little to do with what you're reasoning towards. If your goal is to enjoy life to the fullest and love without restraint, then better reasoning (whether hot or cold, rushed or relaxed) will help you do so. But if your goal is to annihilate as many puppies as possible, then this-kind-of-rationality will also help you annihilate more puppies.
(Unfortunately, this usage of the word "rationality" does not match the colloquial usage. I wish we had a better word for the study of how to improve one's reasoning in all its forms, one that didn't also evoke images of people sacrificing their emotions on the altar of cold logic. But alas, that ship has sailed.)
If you are considering walking the path towards rationality-as-better-reasoning, then please, do not sacrifice your warmth. Your deepest desires are not a burden, but a compass. Rationality of this kind is not about changing where you're going; it's about changing how far you can go.
People often label their deepest desires "irrational." They say things like "I know it's irrational, but I love my partner, and if they were taken from me, I'd move heaven and earth to get them back." To which I say: when I point towards "rationality," I point not towards that which would rob you of your desires, but rather towards that which would make you better able to achieve them.
That is the sort of rationality that I suggest studying, when I recommend reading Rationality: From AI to Zombies.