Relativity

"Empathy hurts.

"I'm grateful for being fantastically, unimaginably rich by world-historical standards—and I'm terrified of it being taken away. I feel bad for all the creatures in the past—and future?—who are stuck in a miserable Malthusian equilibrium.

"I simultaneously want to extend my circle of concern out to all sentient life, while personally feeling fear and revulsion towards anything slightly different from what I'm used to.

"Anna keeps telling me I have a skewed perspective on what constitutes a life worth living. I'm inclined to think that animals and poor people have a wretched not-worth-living existence, but perhaps they don't feel so sorry for themselves?—for the same reason that hypothetical transhumans might think my life has been wretched and not worth living, even while I think it's been pretty good on balance.

"But I'm haunted. After my recent ordeal in the psych ward, the part of me that talks described it as 'hellish.' But I was physically safe the entire time. If something so gentle as losing one night of sleep and being taken away from my usual environment was enough to get me to use the h-word, then what about all the actual suffering in the world? What hope is there for transhumanism, if the slightest perturbation sends us spiraling off into madness?

"The other week I was reading Julian Simon's book on overcoming depression; he wrote that depression arises from negative self-comparisons: comparing your current state to some hypothetical more positive state. But personal identity can't actually exist; time can't actually exist the way we think it does. If pain and suffering are bad when they're implemented in my skull, then they have to be bad when implemented elsewhere.

"Anna said that evolutionarily speaking, bad experiences are more intense than good ones because you can lose all your fitness in a short time period. But if 'the brain can't multiply' is a bias—if two bad things are twice as bad as one, no matter where they are in space and time, even if no one is capable of thinking that way—then so is 'the brain can't integrate': long periods of feeling pretty okay count for something, too.

"I'm not a negative utilitarian; I'm a preference utilitarian. I'm not a preference utilitarian; I'm a talking monkey with delusions of grandeur."

Continuum Utilitarianism

You hear people talk about positive (maximize pleasure) versus negative (minimize pain) utilitarianism, or average versus total utilitarianism, none of which seem very satisfactory. For example, average utilitarianism taken literally would suggest killing everyone but the happiest person, and total utilitarianism implies what Derek Parfit called the repugnant conclusion: that for any possible world with lots of happy people, the total utilitarian must prefer some other possible world with sufficiently many more people whose lives are just barely worth living.
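
To make both objections concrete, here's a minimal arithmetic sketch; the population sizes and utility levels are invented for illustration, and nothing in the argument depends on the particular numbers.

    def total_utility(utilities):
        return sum(utilities)

    def average_utility(utilities):
        return sum(utilities) / len(utilities)

    # The repugnant conclusion: a world of 1,000 very happy people loses
    # (by the total-utility criterion) to a world of 10 million people
    # whose lives are barely worth living.
    happy_world = [100.0] * 1_000          # total utility: 100,000
    crowded_world = [0.011] * 10_000_000   # total utility: 110,000
    assert total_utility(crowded_world) > total_utility(happy_world)

    # Average utilitarianism taken literally: deleting everyone but the
    # happiest person raises the average.
    mixed_world = [100.0] + [50.0] * 999   # average: 50.05
    lone_survivor = [max(mixed_world)]     # average: 100.0
    assert average_utility(lone_survivor) > average_utility(mixed_world)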

But really, it shouldn't be that surprising that there's no simple, intuitively satisfying population ethics, because any actual preference ordering over possible worlds is going to have to make tradeoffs: how much pleasure and how much pain distributed across how many people's lives in what manner, what counts as a "person," &c.
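
One way to make the tradeoff structure explicit is to parameterize it. Here's a minimal sketch of a hypothetical one-parameter family, in the spirit of what's sometimes called number-dampened utilitarianism, interpolating between the total and average views; the function name and the exponent form are my own illustration, not anything the argument commits to.

    # A hypothetical one-parameter family interpolating between total and
    # average utilitarianism by dampening the population-size term.

    def dampened_utility(utilities, alpha):
        # alpha = 0.0 recovers total utilitarianism (the plain sum);
        # alpha = 1.0 recovers average utilitarianism (the mean);
        # intermediate values trade population size against per-person welfare.
        n = len(utilities)
        return sum(utilities) / n ** alpha

    happy_world = [100.0] * 1_000
    crowded_world = [0.011] * 10_000_000

    # At alpha = 0.0 the crowded world ranks higher; by alpha = 0.5 the
    # ranking has already flipped in favor of the happy world.
    for alpha in (0.0, 0.5, 1.0):
        print(alpha,
              dampened_utility(happy_world, alpha),
              dampened_utility(crowded_world, alpha))

Nothing in the formalism privileges any particular alpha, and alpha is only one axis of the tradeoff space: you'd still have to decide how to weigh pain against pleasure, and what counts as a "person."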