1
I'm not very good at feeling the size of large numbers. Once you start tossing around numbers larger than 1000 (or maybe even 100), the numbers just seem "big".
Consider Sirius, the brightest star in the night sky. If you told me that Sirius is as big as a million Earths, I would feel like that's a lot of Earths. If, instead, you told me that you could fit a billion Earths inside Sirius… I would still just feel like that's a lot of Earths.
The feelings are almost identical. In context, my brain grudgingly admits that a billion is a lot larger than a million, and puts forth a token effort to feel like a billion-Earth-sized star is bigger than a million-Earth-sized star. But out of context — if I wasn't anchored at "a million" when I heard "a billion" — both these numbers just feel vaguely large.
I feel a little respect for the bigness of numbers, if you pick really really large numbers. If you say "one followed by a hundred zeroes", then this feels a lot bigger than a billion. But it certainly doesn't feel (in my gut) like it's 10 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 000 times bigger than a billion. Not in the way that four apples internally feels like twice as many as two apples. My brain can't even begin to wrap itself around this sort of magnitude differential.
This phenomenon is related to scope insensitivity, and it's important to me because I live in a world where sometimes the things I care about are really really numerous.
For example, billions of people live in squalor, with hundreds of millions of them deprived of basic needs and/or dying from disease. And though most of them are out of my sight, I still care about them.
The loss of a human life, with all its joys and all its sorrows, is tragic no matter what the cause, and the tragedy is not reduced simply because I was far away, or because I did not know of it, or because I did not know how to help, or because I was not personally responsible.
Knowing this, I care about every single individual on this planet. The problem is, my brain is simply incapable of taking the amount of caring I feel for a single person and scaling it up by a billion times. I lack the internal capacity to feel that much. My care-o-meter simply doesn't go up that far.
And this is a problem.
2
It's a common trope that courage isn't about being fearless, it's about being afraid but doing the right thing anyway. In the same sense, caring about the world isn't about having a gut feeling that corresponds to the amount of suffering in the world, it's about doing the right thing anyway. Even without the feeling.
My internal care-o-meter was calibrated to deal with about a hundred and fifty people, and it simply can't express the amount of caring that I have for billions of sufferers. The internal care-o-meter just doesn't go up that high.
Humanity is playing for unimaginably high stakes. At the very least, there are billions of people suffering today. At the worst, there are quadrillions (or more) of potential humans, transhumans, or posthumans whose existence depends upon what we do here and now. All the intricate civilizations that the future could hold, all the experience and art and beauty that is possible in the future, depend upon the present.
When you're faced with stakes like these, your internal caring heuristics — calibrated on numbers like "ten" or "twenty" — completely fail to grasp the gravity of the situation.
Saving a person's life feels great, and it would probably feel just about as good to save one life as it would feel to save the world. It surely wouldn't be billions of times more of a high to save the world, because your hardware can't express a feeling a billion times bigger than the feeling of saving a person's life. But even though the altruistic high from saving someone's life would be shockingly similar to the altruistic high from saving the world, always remember that behind those similar feelings there is a whole world of difference.
Our internal care-feelings are woefully inadequate for deciding how to act in a world with big problems.
3
There's a mental shift that happened to me when I first started internalizing scope insensitivity. It is a little difficult to articulate, so I'm going to start with a few stories.
Consider Alice, a software engineer at Amazon in Seattle. Once a month or so, those college students show up on street corners with clipboards, looking ever more disillusioned as they struggle to convince people to donate to Doctors Without Borders. Usually, Alice avoids eye contact and goes about her day, but this month they finally manage to corner her. They explain Doctors Without Borders, and she actually has to admit that it sounds like a pretty good cause. She ends up handing them $20 through a combination of guilt, social pressure, and altruism, and then rushes back to work. (Next month, when they show up again, she avoids eye contact.)
Now consider Bob, who has been given the Ice Bucket Challenge by a friend on Facebook. He feels too busy to do the challenge, and instead just donates $100 to ALSA.
Now consider Christine, who is in the college sorority ΑΔΠ. ΑΔΠ is engaged in a competition with ΠΒΦ (another sorority) to see who can raise the most money for the National Breast Cancer Foundation in a week. Christine has a competitive spirit and gets engaged in fund-raising, and gives a few hundred dollars herself over the course of the week (especially at times when ΑΔΠ is especially behind).
All three of these people are donating money to charitable organizations… and that's great. But notice that there's something similar in these three stories: these donations are largely motivated by a social context. Alice feels obligation and social pressure. Bob feels social pressure and maybe a bit of camaraderie. Christine feels camaraderie and competitiveness. These are all fine motivations, but notice that these motivations are related to the social setting, and only tangentially to the content of the charitable donation.
If you asked Alice or Bob or Christine why they aren't donating all of their time and money to these causes that they apparently believe are worthwhile, they'd look at you funny, and they'd probably think you were being rude (with good reason!). If you pressed, they might tell you that money is a little tight right now, or that they would donate more if they were a better person.
But the question would still feel kind of wrong. Giving all your money away is just not what you do with money. We can all say out loud that people who give all their possessions away are really great, but behind closed doors we all know that such people are crazy. (Good crazy, perhaps, but crazy all the same.)
This is a mindset that I inhabited for a while. There's an alternative mindset that can hit you like a freight train when you start internalizing scope insensitivity.
4
Consider Daniel, a college student shortly after the Deepwater Horizon BP oil spill. He encounters one of those college students with the clipboards on a street corner, soliciting donations for the World Wildlife Fund. They're trying to save as many oiled birds as possible. Normally, Daniel would simply dismiss the charity as Not The Most Important Thing, or Not Worth His Time Right Now, or Somebody Else's Problem, but this time Daniel has been thinking about how his brain is bad at numbers and decides to do a quick sanity check.
He pictures himself walking along the beach after the oil spill, and encountering a group of people cleaning birds as fast as they can. They simply don't have the resources to clean all the available birds. A pathetic young bird flops towards his feet, slick with oil, eyes barely able to open. He kneels down to pick it up and help it onto the table. One of the bird-cleaners informs him that they won't have time to get to that bird themselves, but he could pull on some gloves and could probably save the bird with three minutes of washing.
Daniel decides that he would spend three minutes of his time to save the bird, and that he would also be happy to pay at least $3 to have someone else spend a few minutes cleaning the bird. He introspects and finds that this is not just because he imagined a bird right in front of him: he feels that it is worth at least three minutes of his time (or $3) to save an oiled bird in some vague platonic sense.
And, because he's been thinking about scope insensitivity, he expects his brain to misreport how much he actually cares about large numbers of birds: the internal feeling of caring can't be expected to line up with the actual importance of the situation. So instead of just asking his gut how much he cares about de-oiling lots of birds, he shuts up and multiplies.
Thousands and thousands of birds were oiled by the BP spill alone. After shutting up and multiplying, Daniel realizes (with growing horror) that the amount he actually cares about oiled birds is lower-bounded by two months of hard work and/or fifty thousand dollars. And that's not even counting wildlife threatened by other oil spills.
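To make that multiplication concrete, here's the rough arithmetic, using an illustrative figure of fifteen thousand oiled birds (the actual count is left vague above; this number is only an assumption for the sake of the estimate):

15,000 birds × 3 minutes per bird = 45,000 minutes ≈ 750 hours of washing
15,000 birds × $3 per bird = $45,000

At a punishing eighty-plus hours a week, 750 hours is on the order of two months of hard work, and the dollar figure lands in the same ballpark as fifty thousand dollars.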
And if he cares that much about de-oiling birds, then how much does he actually care about factory farming, never mind hunger, or poverty, or sickness? How much does he actually care about wars that ravage nations? About neglected, deprived children? About the future of humanity? He actually cares about these things to the tune of much more money than he has, and much more time than he has.
For the first time, Daniel catches a glimpse of how much he actually cares, and how poor a state the world is in.
This has the strange effect that Daniel's reasoning goes full-circle, and he realizes that he actually can't care about oiled birds to the tune of 3 minutes or $3: not because the birds aren't worth the time and money (and, in fact, he thinks that the economy produces things priced at $3 which are worth less than the bird's survival), but because he can't spend his time or money on saving the birds. The opportunity cost suddenly seems far too high: there is too much else to do! People are sick and starving and dying! The very future of our civilization is at stake!
Daniel doesn't wind up giving $50k to the WWF, and he also doesn't donate to ALSA or NBCF. But if you ask Daniel why he's not donating all his money, he won't look at you funny or think you're rude. He's left that old mindset, the place where you don't care, far behind, and has realized that his mind was lying to him the whole time about the gravity of the real problems.
Now he realizes that he can't possibly do enough. After adjusting for his scope insensitivity (and the fact that his brain lies about the size of large numbers), even the "less important" causes like the WWF suddenly seem worthy of dedicating a life to. Wildlife destruction and ALS and breast cancer are suddenly all problems that he would move mountains to solve — except he's finally understood that there are just too many mountains, and ALS isn't the bottleneck, and AHHH HOW DID ALL THESE MOUNTAINS GET HERE?
In the original mindstate, the reason he didn't drop everything to work on ALS was because it just didn't seem… pressing enough. Or tractable enough. Or important enough. Kind of. These are sort of the reason, but the real reason is more that the concept of "dropping everything to address ALS" never even crossed his mind as a real possibility. The idea was too much of a break from the standard narrative. It wasn't his problem.
In the new mindstate, everything is his problem. The only reason he's not dropping everything to work on ALS is because there are far too many things to do first.
Alice and Bob and Christine usually aren't spending time solving all the world's problems because they lose sight of them. If you remind them — put them in a social context where they remember how much they care (hopefully without guilt or pressure) — then they'll likely donate a little money.
By contrast, Daniel and others who have undergone the mental shift aren't spending time solving all the world's problems because there are just too many problems. (Daniel hopefully goes on to discover movements like effective altruism and starts contributing towards fixing the world's most pressing problems.)
5
I'm not trying to preach here about how to be a good person. You don't need to share my viewpoint to be a good person (obviously).
Rather, I'm trying to point at a shift in perspective. Many of us go through life understanding that we should care about people suffering far away from us, but failing to. I think that this attitude is tied, at least in part, to the fact that most of us implicitly trust our internal care-o-meters.
The "care feeling" isn't usually strong enough to compel us to frantically save everyone dying. So while we acknowledge that it would be virtuous to do more for the world, we think that we can't, because we weren't gifted with that virtuous extra-caring that prominent altruists must have.
But this is an error — prominent altruists aren't the people who have a larger care-o-meter, they're the people who have learned not to trust their care-o-meters.
Our care-o-meters are broken. They don't work on large numbers. Nobody has one capable of faithfully representing the scope of the world's problems. But the fact that you can't feel the caring doesn't mean that you can't do the caring.
You don't get to feel the appropriate amount of "care", in your body. Sorry — the world's problems are just too large, and your body is not built to respond appropriately to problems of this magnitude. But if you choose to do so, you can still act like the world's problems are as big as they are. You can stop trusting the internal feelings to guide your actions and switch over to manual control.
6
This, of course, leads us to the question of "what the hell do you do, then?"
And I don't really know yet. (Though I'll plug the Giving What We Can pledge, GiveWell, MIRI, and the Future of Humanity Institute as a good start.)
I think that at least part of it comes from a certain sort of desperate perspective. It's not enough to think you should change the world — you also need the sort of desperation that comes from realizing that you would dedicate your entire life to solving the world's 100th biggest problem if you could, but you can't, because there are 99 bigger problems you have to address first.
I'm not trying to guilt you into giving more money away — becoming a philanthropist is really really hard. (If you're already a philanthropist, then you have my acclaim and my affection.) First it requires you to have money, which is uncommon, and then it requires you to throw that money at distant invisible problems, which is not an easy sell to a human brain. Akrasia is a formidable enemy. And most importantly, guilt doesn't seem like a good long-term motivator: if you want to join the ranks of people saving the world, I would rather you join them proudly. There are many trials and tribulations ahead, and we'd do better to face them with our heads held high.
7
Courage isn't about being fearless, it's about being able to do the right thing even if you're afraid.
And similarly, addressing the major problems of our time isn't about feeling a strong compulsion to do so. It's about doing it anyway, even when internal compulsion utterly fails to capture the scope of the problems we face.
It's easy to look at especially virtuous people — Gandhi, Mother Teresa, Nelson Mandela — and conclude that they must have cared more than we do. But I don't think that's the case.
Nobody gets to comprehend the scope of these problems. The closest we can get is doing the multiplication: finding something we care about, putting a number on it, and multiplying. And then trusting the numbers more than we trust our feelings.
Because our feelings lie to us.
When you do the multiplication, you realize that addressing global poverty and building a brighter future deserve more resources than currently exist. There is not enough money, time, or effort in the world to do what we need to do.
There is only you, and me, and everyone else who is trying anyway.
8
You can't actually feel the weight of the world. The human mind is not capable of that feat.
But sometimes, you can catch a glimpse.