Guns

"Do you know, I've decided I like guns. Of course it would be preferable to wave a magic wand and have all sentient life live in peace and harmony in paradise forever. But if Reality puts you in a situation where you have to kill, at least we have tools to do it quickly: a well-aimed bang and there isn't a creature there to suffer for very long. That's actually a huge improvement over the state of nature, where animals kill with nothing but teeth and claws."

The Demandingness Objection

"Well, I'm not giving up dairy, but I can probably give up meat, and milk is at the very bottom of Brian's table of suffering per kilogram demanded, so I'd be contributing to much less evil than I was before. That's good, right?

"For all the unimaginably terrible things our species does to each other and to other creatures, we're not—we're probably not any worse than the rest of nature. Gazelles suffer terribly as lions eat them alive, but we can't intervene because then the lions would starve, and the gazelles would have a population explosion and starve, too. We have this glorious idea that people need to consent before sex, but male ducks just rape the females, and there's no one to stop it—nothing else besides humans around capable of formulating the proposition, as a proposition, that the torment and horror in the world is wrong and should stop. Animals have been eating each other for hundreds of millions of years; we may be murderous, predatory apes, but we're murderous, predatory apes with Reason—well, sort of—and a care/harm moral foundation that allows some of us, with proper training, to at least wish to be something better.

"I don't actually know much history or biology, but I know enough to want it to not be real, to not have happened that way. But it couldn't have been otherwise. In the absence of an ontologically fundamental creator God, Darwinian evolution is the only way to get purpose from nowhere, design without a designer. My wish for things to have been otherwise ... probably isn't even coherent; any wish for the nature of reality to have been different can only be made from within reality.


Revisionist History I

"It is my considered opinion that Emily Dickinson was a time-traveling cryonicist."

"That is an opinion I have not previously heard advanced."

"C'mon! 'Because I could not stop for Death, / He kindly stopped for me; / The carriage held but just ourselves / And Immortality'? It's obvious!"

Huffman

Dear reader, you know what's way more fun than feeling sad about the nature of the cosmos? Data compression, that's what! Suppose you want to send a message to your friends in a nearby alternate universe, but interuniversal communication bandwidth is very expensive (different universes can't physically interact, so we and our alternate-universe analogues can only communicate by mutually inferring what the other party must be saying, which takes monstrous amounts of computing power and is not cheap), so you need to make your message as brief as possible. Note that 'brief' doesn't just have to do with how long your message is in natural language; it also has to do with how that message is represented over the transuniversal communication channel: indeed, the more efficient the encoding, the more you can afford to say on a fixed budget.

The classic ASCII encoding scheme uses seven bits to represent each character. (Seven?—you ask perplexedly, surely you mean eight? Apparently it was seven in the original version.) Can we do better? Well ... ASCII has a lot of stuff that arguably you don't need that badly. Really, upper and lower case letters? Ampersands, asterisks, backslashes? And don't get me started about those unprintable control characters! If we restrict our message to just the uncased alphabet A through Z plus space and a few punctuation marks, then we can encode our message using only a 32 (= 2⁵) character set, at five bits per character.
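A minimal sketch of the five-bit scheme (the particular 32-symbol alphabet below is my own illustrative choice—the text only specifies A through Z, space, and a few punctuation marks):

```python
# A 32-symbol alphabet: exactly 2**5 symbols, so each fits in five bits.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ .,!?'"
assert len(ALPHABET) == 32

def encode5(message):
    """Encode each character as a fixed-width five-bit string."""
    return "".join(format(ALPHABET.index(ch), "05b") for ch in message)

def decode5(bits):
    """Decode by reading the bit string back five bits at a time."""
    return "".join(ALPHABET[int(bits[i:i + 5], 2)]
                   for i in range(0, len(bits), 5))

msg = "HELLO, ALTERNATE UNIVERSE!"
bits = encode5(msg)
print(len(msg) * 7, "bits in seven-bit ASCII")  # 182
print(len(bits), "bits at five per character")  # 130
assert decode5(bits) == msg  # the encoding round-trips
```

Five bits per character instead of seven is already a savings of over 28 percent on the interuniversal bandwidth bill.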

Can we do better? Seemingly not—2⁴ = 16 isn't a big enough character set to cover the alphabet. Unless ...
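Unless, as the section title gives away, we drop the assumption that every character costs the same number of bits. Huffman coding assigns short codewords to frequent characters and long ones to rare characters, and no codeword is a prefix of another, so the bit stream decodes unambiguously. A minimal sketch (the example message is my own):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code from character frequencies: repeatedly merge
    the two lowest-frequency subtrees, prepending a distinguishing bit."""
    freq = Counter(text)
    # Heap entries: (frequency, tiebreaker, {char: code_so_far}).
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in left.items()}
        merged.update({ch: "1" + code for ch, code in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

message = ("WE CAN DO BETTER THAN FIVE BITS PER CHARACTER "
           "IF SOME CHARACTERS ARE MORE COMMON THAN OTHERS")
codes = huffman_codes(message)
encoded = "".join(codes[ch] for ch in message)
print(len(message) * 5, "bits at five bits per character")
print(len(encoded), "bits with Huffman coding")

# The prefix property makes decoding unambiguous: read bits until the
# buffer matches a codeword, emit that character, and repeat.
inverse = {code: ch for ch, code in codes.items()}
decoded, buf = [], ""
for bit in encoded:
    buf += bit
    if buf in inverse:
        decoded.append(inverse[buf])
        buf = ""
assert "".join(decoded) == message
```

Since any fixed five-bit assignment is itself a prefix code, the optimal Huffman code can never do worse than five bits per character, and on realistically skewed letter frequencies it does strictly better.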


Remembering

"I remember feeling like a person, and feeling like people were ontologically distinct from animals, and I don't know how it's possible to pick up the pieces after that illusion has gone.

"I remember caring about parochial political concerns. I cared about gender equality, and educational freedom. And, and, I cared very badly about being respected for being intelligent. But now that I see that the latter was just a standard male primate status-seeking drive gone haywire—or not gone haywire, but functioning normally—and that my less-obviously-selfish concerns were driven by idiosyncratic features of my own psychology that few others have any reason to care about—none of it seems as compelling anymore.

"Then what is compelling? Well, I'm terrified of the pain of being physically hurt, so if I don't know what's real and I don't know what's right, I can always fall back on 'pain and suffering are bad.'

"But there has to be more to morality than that. I complained about how people in institutional contexts optimize for not-being-personally-blamed and no one is actually trying to do anything. But of course passive helplessness is the result when you don't have any goals except not-being-hurt.

"I want to be Innocent and Safe with Probability One, but Probability One is an absurdity that can't exist. In a sufficiently large universe, random fluctuations in maximum entropy heat death form a Boltzmann brain Judeo-Christian God who will punish you for masturbating. But somehow I'm not worried about that.

"But I shouldn't be thinking about any of this. I have my own life to tend to, and it looks great; the rest of space and time will have to take care of itself. I seem to have memories of being in the save/destroy/take over the world business, but now it seems more convenient to be agnostic about whether any of that actually happened."

Lyrics to the Song About Matt Reeves

Dead kid gets a bench
Dead kid gets a memorial bench
So now we all know his name
Though we don't know who he is

Class of nineteen ninety two
Though he died in 'ninety one
Was he a better friend than you?
And what'd he do for fun?
What were his opinions on the issues of the day?
And what exactly took his breath away?

Now he's still and in the grave
And since the dead seem all the same
No one really cares to wonder what he was
Forgotten as we're staring at his name

Dead kid gets a bench
Dead kid gets a memorial bench
So now we all know his name
Though we don't care who he is

Dead kid gets a bench
And the inscription just screams "Rust this"
No inscription can do justice
Though we don't know who he is
And we don't care who he is

Retirement

"Rational agents should never be made worse off by more information—well, almost never. So if I can no longer contemplate the big picture without life seeming like a bad thing—the fewer needs you have, the fewer ways in which you can be hurt; if you don't exist, you can't be hurt—then maybe I could just—not contemplate it? If my will to live is something that can be destroyed by the truth, then maybe P. C. Hodgell was wrong? This needn't entail self-delusion: distraction is quite sufficient. There are plenty of things to do that won't remind me of the vastness of suffering in the multiverse.

"Daily life, exercise, practical programming skills, finding a job—pure math and compsci if I need something intellectual. But no philosophy, history, current events, futurism, social science, biology, or game theory. Not much fiction, because stories are about people's pain. I just don't want to know anymore."

Relevance

"Utilitarianism is slowly driving me mad."

"Really?"

"Okay, the part of me that talks wants to self-report that utilitarianism is slowly driving me mad, but what's actually happening is probably better described at a lower level of organization.

"I don't know how to simultaneously love life and actually believe in evolution. People mostly like being alive, and there are all sorts of wonderful things like friendship and love and pleasure and beauty—but those things only exist at the expense of enough pain and suffering and death to carve love into the genome from scratch. I don't—I'm not sure it was worth it.

"But my thoughts are better constrained by decision-theoretic relevance: since I can't make decisions about the past, asking whether it was worth it is a type error, a confusion. My life is going fine right now: I'm young and healthy and smart and rich. The local future looks great. And the deep future—doesn't need us. I am content."

Relativity

"Empathy hurts.

"I'm grateful for being fantastically, unimaginably rich by world-historical standards—and I'm terrified of it being taken away. I feel bad for all the creatures in the past—and future?—who are stuck in a miserable Malthusian equilibrium.

"I simultaneously want to extend my circle of concern out to all sentient life, while personally feeling fear and revulsion towards anything slightly different from what I'm used to.

"Anna keeps telling me I have a skewed perspective on what constitutes a life worth living. I'm inclined to think that animals and poor people have a wretched not-worth-living existence, but perhaps they don't feel so sorry for themselves?—for the same reason that hypothetical transhumans might think my life has been wretched and not worth living, even while I think it's been pretty good on balance.

"But I'm haunted. After my recent ordeal in the psych ward, the part of me that talks described it as 'hellish.' But I was physically safe the entire time. If something so gentle as losing one night of sleep and being taken away from my usual environment was enough to get me to use the h-word, then what about all the actual suffering in the world? What hope is there for transhumanism, if the slightest perturbation sends us spiraling off into madness?

"The other week I was reading Julian Simon's book on overcoming depression; he wrote that depression arises from negative self-comparisons: comparing your current state to some hypothetical more positive state. But personal identity can't actually exist; time can't actually exist the way we think it does. If pain and suffering are bad when they're implemented in my skull, then they have to be bad when implemented elsewhere.

"Anna said that evolutionarily speaking, bad experiences are more intense than good ones because you can lose all your fitness in a short time period. But if 'the brain can't multiply' is a bias—if two bad things are twice as bad as one, no matter where they are in space and time, even if no one is capable of thinking that way—then so is 'the brain can't integrate': long periods of feeling pretty okay count for something, too.

"I'm not a negative utilitarian; I'm a preference utilitarian. I'm not a preference utilitarian; I'm a talking monkey with delusions of grandeur."

Dimensionality

"So, an engineer and a mathematician are leaving a lecture. The engineer says, 'I just don't understand how you can visualize objects in seven-dimensional space.' The mathematician says, 'Oh, that's easy. You just visualize the n-dimensional case, and then set n equal to seven.'"

"I've never liked that joke. The punchline is intended to be absurd, but it's not: that's actually how you do it."

"Really?"

"Okay, fine. You visualize the three-dimensional case, and then set three equal to seven."

I Don't Understand Time

Our subjective experience would have it that time "moves forward": the past is no longer, and the future is indeterminate and "hasn't happened yet." But it can't actually work that way: special relativity tells us that there's no absolute standard of simultaneity; given two spacelike separated events, whether one happened "before" or "after" the other depends on where you are and how fast you're going. This leads us to a "block universe" view: our 3+1 dimensional universe, past, present, and future, simply exists, and the subjective arrow of time somehow arises from our perspective embedded within it.

Without knowing much in the way of physics or cognitive science myself, I can only wonder if there aren't still more confusions to be dissolved, intuitions to be unlearned in the service of a more accurate understanding. We know things about the past from our memories and by observing documents; we might then say that memories and documents are forms of probabilistic evidence about another point in spacetime. But predictions about the future are also a form of probabilistic evidence about another point in spacetime. There's a sort of symmetry there, isn't there? Could we perhaps imagine that minds constructed differently from our own wouldn't perceive the same kind of arrow of time that we do?

Second-Order Rationality for the Chronically Anxious

In your conscious verbal thoughts, take it as an axiom that "I am Safe and Innocent with Probability One," not because that's actually true, but because the Maslow Physiological/Safety levels require it. Of course, actually assigning Probability One would be a very dangerous thing to do, because it means never changing your mind, ever: P(H|E) = P(E|H)P(H)/(P(E|H)P(H) + P(E|¬H)P(¬H)), but if P(H) is unity, then P(H|E) = P(E|H)(1)/(P(E|H)(1) + P(E|¬H)(0)) = P(E|H)/P(E|H) = 1. If you were really Safe and Innocent with Probability One, there would be no harm in dropping an anvil on yourself or someone else's head. So meanwhile, have other parts of your brain secretly, nonverbally select actions to secure your innocence and safety using some other procedure.
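The Bayes's-theorem point above can be checked numerically: a prior of one is immune to any evidence. A tiny sketch (the illustrative likelihoods are my own):

```python
def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """P(H|E) by Bayes's theorem:
    P(H|E) = P(E|H)P(H) / (P(E|H)P(H) + P(E|~H)P(~H))."""
    numerator = p_e_given_h * prior_h
    denominator = numerator + p_e_given_not_h * (1 - prior_h)
    return numerator / denominator

# Damning evidence: E is 99 times likelier if H is false ...
print(posterior(0.5, 0.01, 0.99))  # a 0.5 prior collapses to 0.01 ...
print(posterior(1.0, 0.01, 0.99))  # ... but a prior of 1 stays exactly 1.0
```

The `(1 - prior_h)` term vanishes when the prior is unity, so the evidence cancels out of the ratio entirely: never changing your mind, ever.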

The Horror of Naturalism

There's this deeply uncomfortable tension between being an animal physiologically incapable of caring about anything other than what happens to me in the near future, and the knowledge of the terrifying symmetry that cannot be unseen: that my own suffering can't literally be more important, just because it's mine. You do some philosophy and decide that your sphere of moral concern should properly extend to all sentient life—whatever sentient turns out to mean—but life is built to survive at the expense of other life.

I want to say, "Why can't everyone just get along and be nice?"—but those are just English words that only make sense to other humans from my native culture, who share the cognitive machinery that generated them. The real world is made out of physics and game theory; my entire concept of "getting along and being nice" is the extremely specific, contingent result of the pattern of cooperation and conflict in my causal past: the billions of corpses on the way to Homo sapiens, the thousands of years of culture on the way to the early twenty-first century United States, the nonshared environmental noise on the way to me. Even if another animal would agree that pleasure is better than pain and peace is better than war, the real world has implementation details that we won't agree on, and the implementation details have to be settled somehow.

I console myself with the concept of decision-theoretic irrelevance: insofar as we construe the function of thought as to select actions, being upset about things that you can't affect is a waste of cognition. It doesn't help anyone for me to be upset about all the suffering in the world when I don't know how to alleviate it. Even in the face of moral and ontological uncertainty, there are still plenty of things-worth-doing. I will play positive-sum games, acquire skills, acquire resources, and use the resources to protect some of the things I care about, making the world slightly less terrible with me than without me. And if I'm left with the lingering intuition that there was supposed to be something else, some grand ideal more important than friendship and Pareto improvements ... I don't remember it anymore.

Continuum Utilitarianism

You hear people talk about positive (maximize pleasure) versus negative (minimize pain) utilitarianism, or average versus total utilitarianism, none of which seem very satisfactory. For example, average utilitarianism taken literally would suggest killing everyone but the happiest person, and total utilitarianism implies what Derek Parfit called the repugnant conclusion: that for any possible world with lots of happy people, the total utilitarian must prefer another possible world with many more people whose lives are just barely worth living.
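The tension between the two aggregation rules can be made concrete with toy numbers (mine, not the text's):

```python
# World A: a small population of very good lives.
world_a = [100.0] * 1_000
# World B: a vastly larger population of lives just barely worth living.
world_b = [0.1] * 2_000_000

def total(world):
    """Total utilitarian score: sum of everyone's utility."""
    return sum(world)

def average(world):
    """Average utilitarian score: utility per person."""
    return sum(world) / len(world)

print(total(world_b) > total(world_a))      # True: total utilitarianism
                                            # prefers the "repugnant" world B
print(average(world_a) > average(world_b))  # True: average utilitarianism
                                            # prefers the small happy world A
```

The same pair of worlds is ranked oppositely by the two rules, which is exactly why neither simple formula settles population ethics.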

But really, it shouldn't be that surprising that there's no simple, intuitively satisfying population ethics, because any actual preference ordering over possible worlds is going to have to make tradeoffs: how much pleasure and how much pain distributed across how many people's lives in what manner, what counts as a "person," &c.

Prodrome

"I'm okay—I've been through this—it's just the sort of prodrome that could develop into paranoid schizophrenia, but won't, because I've been trained not to believe my own thoughts!

"But the relationship between psychology and philosophy is funny. I've been having pretty drastic mood swings on the timescale of hours or minutes, and I've also been paying a lot of attention to modal realism, mathematical universe, "existence as an ensemble of disconnected observer-moments"-type ideas. I think the causality actually goes in that direction: from psychoticism to Tegmark IV. But the nature of reality can't actually depend on the minutiae of my particular form of mental illness ...

"I don't want to do philosophy or social science or futurism anymore; I've lost the ability to do it sanely, if I ever had it. My brain just keeps generating cosmic horror stories to be terrified of, when really it's not my business and can't be my business. Most of what happens in the future is outside of my current conceptspace. Most of what happens in the present is outside of my current conceptspace. It all adds up to normality locally: that is, that which we consider normal is an artifact of how the world has unfolded up to now.

"Better to take up an engineering mindset. Focus on solving practical problems in the only world that I can actually touch, rather than continuing to execute self-injury behaviors dwelling on the horror that must exist in the vastness of space and time.

"I'll be fine—for the near future. Only I miss how consciousness used to feel. I used to feel like I knew things, but now all I can do is make predictions."

Strategy Overhaul

"I have drastically, drastically underestimated the social costs of nonconformity—costs I was paying, and quite possibly correctly so under reflection, but which I didn't notice I was paying."

"Say more."

"Well, as discussed previously, I had been modeling other people as defective versions of my model of myself, without realizing that this was a mistake on at least two counts: one, other people are not like my model of me, and two, I'm not as much like my model of me as I had wanted to believe, both of which observations are manifestations of that horrifying fact which I'm only now starting to appreciate: that people are animals, that Darwinism isn't just a proposition to endorse, but it actually happened that way in real life."

"And how does that relate to the costs of nonconformity?"

"I had expected people, including myself, to be fairly agent-like, when actually we're far more animal-like than I would have ever guessed: we're mostly just kludges of habits and heuristics; the skill of, of ... recomputing how to behave in the service of some goal is rare, and it's justifiably rare, because it usually doesn't work; most new ideas are wrong. We're told that school is about learning, and when I noticed that the things I do outside of school are genuinely more intellectually meritorious than my official homework, I felt outraged and betrayed: why didn't anyone just tell me that knowledge is good, and skill is good, and anything you do in the service of the acquisition of knowledge and skill is good?! But it was a rhetorical question; I didn't actually try to answer it. But it's not hard to figure out: the stories we tell about ourselves aren't very good models of our behavior, that's all. Insofar as we attribute purpose to the evolved social institution of schooling, it's probably some weighted blend of learning, babysitting, signaling intelligence and conscientiousness, subordination training, and path-dependent noise. Insofar as we construe people as agents who want to learn stuff, paying for college is idiotic: that's what books are for. But as coordination technology for a civilization of crazy monkeys?—if everyone expects a Bachelor's degree, who am I to tell them that it's just a signaling game, just a bubble?"

"So, you're planning to finish your degree despite your recent, uh, setback?"

"Well ... maybe. I certainly need to learn to fit in better with the other crazy monkeys by being more empathetic and agreeable—I've had a lot of unacknowledged outgroup hostility going on that I should stop. But it should be clear now that the degree is strictly of instrumental value. That's how most people think of it, isn't it?—just a job ticket. It shouldn't be heartbreaking to do something instrumentally, just for what it buys you. And yet ..."

Apophenia

It's well-known that it shouldn't actually be that shocking to occasionally encounter seemingly shocking coincidences: the time your friend calls you just as you were about to call them might seem like compelling evidence for psychic powers, but only because you don't remember all the other occasions when an equally improbable coincidence could have happened, but didn't. We tend to see patterns even where none exist, and neglect that million-to-one events happen seven times a day in New York.

I expect this problem to actually get worse as you learn more: if you know n concepts, the number of possible connections between them is O(n²); if your ability to notice patterns grows faster than your sense of what patterns constitute a coincidence worth noticing, then you should expect to encounter more and more shocking coincidences over time.
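The quadratic growth is just the count of unordered pairs, n(n−1)/2. A quick sketch of how fast the space of possible "connections" outruns the concepts themselves:

```python
def possible_connections(n):
    """Number of unordered pairs among n concepts: n choose 2 = n(n-1)/2."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(n, "concepts:", possible_connections(n), "possible connections")
# Each doubling of what you know roughly quadruples the opportunities
# for a "shocking" coincidence to present itself.
```

So a mind that learns ten times as many concepts has roughly a hundred times as many places for an apparent pattern to appear.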