Relevance

"Utilitarianism is slowly driving me mad."

"Really?"

"Okay, the part of me that talks wants to self-report that utilitarianism is slowly driving me mad, but what's actually happening is probably better described at a lower level of organization.

"I don't know how to simultaneously love life and actually believe in evolution. People mostly like being alive, and there are all sorts of wonderful things like friendship and love and pleasure and beauty—but those things only exist at the expense of enough pain and suffering and death to carve love into the genome from scratch. I don't—I'm not sure it was worth it.

"But my thoughts are better constrained by decision-theoretic relevance: since I can't make decisions about the past, asking whether it was worth it is a type error, a confusion. My life is going fine right now: I'm young and healthy and smart and rich. The local future looks great. And the deep future—doesn't need us. I am content."

One thought on “Relevance”

  1. I disagree with reality on the proposition that value must be created ab initio, that the funnel of complexity must flow from the shards of quantum mechanical noise to the faucet of reasoning beings. I could handle the proposition that mathematics can be united, is a connected Grothendieck universe or some other such structure which admits a singularity, or origin, that yields a causal path from oblivion to deity through the shambled rocks of pain and suffering exploited by natural selection, and thereby yields fruit of at least countably infinite cardinality, if such a notion makes sense as applied to experiences embedded in time. However, by the anthropic principle, I conclude that it is terrifyingly unlikely for the funnel to point in reverse: if a god, a champion of computation over all possible structures in Tegmark's multiverse, had made the journey through ateleological tormented battles in disconnected sentience-space--segregated like prison cells by the confines of skulls--and reshaped the extant haplessness of an idiot optimization process, then I would likely have found myself, or whatever exotic sentient process emerges from such a course of events, *happy*, as would the apparent co-player characters beside me. To notice this isn't so, despite its overwhelming probability if such an eventuality had come to pass, induces a resignation of a depth that is simply unfair for a physical animal, or any being within even a dozen orders of magnitude "more awake" (the phrase used by Fermi to describe von Neumann), to ever decently bear. As for the accompanying terror and identity dissolution, I am glad to announce I have managed to insert bugs in my wetware to crash these emotional processes before they can yield a just translation to behavior; I am not the Rybka that evaluates to -9 and just keeps chugging: I pretended the calculation was not *interesting* and fell asleep. Then again, Rybka is not stuck in the pieces...
