"I think we should set aside some time to discuss how I could provide more value to you."
"That's an awfully disingenuous way of proposing a negotiation; you're not that altruistic."
"The function of speech is to convey meaning to the listener. I speak of providing more value to you because that's all you should care about; if it happens that the means by which we arrange that I do so involves you providing more value to me, well, that's as irrelevant as it is obvious."
Fun movie, but if we're not going to try to accurately portray the historical Turing, I preferred Greg Egan's version where a time-traveling robot woman helps him cure cancer.
"You don't get to decide what I am! ... for the same reason that I don't get to decide what I am! 'What I am' is an empirical question to be settled by evidence and reasoning, the answer to which I can exert some limited control over in proportion to the strength of the self-modification techniques I have at my disposal!"
"I'm concerned about the socially-undesirable implications of a model described by this causal graph."
(studying it) "Have you considered that these arrows might point in the socially-desirable direction instead?"
(with exasperation, as if realizing he is doomed to have this conversation dozens of times with everyone who is not a Dark Rationalist corrupted by cruel apprehension of patterns that innocents were not meant to see) "Yes, I have considered that! I consider it extremely implausible!"
diff --git a/.bash_aliases b/.bash_aliases
index 648287f..e00dbc9 100644
--- a/.bash_aliases
+++ b/.bash_aliases
@@ -34,6 +34,9 @@ alias gre="env | grep"
+alias grps="ps aux | grep"
+alias grports="netstat -tulpn | grep"
"I miss you."
"I suspect you miss the idea of me."
"That was an entirely unexpected and yet hauntingly plausible response, which I take as evidence that I miss the real you; my mere idea of you can't do unexpected and plausible at the same time."
In the future, instead of the endless runaround war of "I'm offended!" and "I'm offended that you're offended!", our children's children's children will just write down their utility functions and use an off-the-shelf algorithm to merge them and compute the exact, correct tensor of offendedness under the unified consensus social norms.
A roman à clef about a very religious teenager who gradually figures out that God isn't real around ages 20 and 21, spends the next eight years feeling OK about this, then one day suddenly realizes that God not being real implies that prayers don't work, and freaks the fuck out. His friends (who grew up in the same community but don't share his incredible lack of native talent for hypocrisy) are unsympathetic. "You really thought that would work?" "Yes!" "But didn't you notice that—" (sobbing) "I didn't!"
Two wrongs can make a right, if you choose the second wrong very carefully.
"Me? I like songs with words. I don't care for, like, classical music."
(with barely-concealed contempt towards his interlocutor's ignorance and confusion) "But you like the Star Trek: Voyager theme, right?"
"I love the Star Trek: Voyager theme!!"
Sing a song of Purpose for the coder's missing nerve,
Of the melancholy bytes of which the proxy is to serve—
Arise, O coder's fury, set the wrongful source to right
Where the use-case fits the market and the market is alight!
I used to look down on posers who submit some contrived one-off trivial patch to a big, famous project like Django or whatever, sheerly for the glamor and ego-gratification of being able to say, "I'm a contributor to Django." I thought that if it wasn't a fix that you needed for your own work and you're not going to be a seriously involved contributor, it's more dignified to only work on your personal projects (which would be more authentic) or some non-super-famous but still widely-used library (which would have more socially-useful unfinished work left).
Then I landed a patch in the Rust compiler.
And it is so ego-gratifying!! But maybe now I have to submit a bunch more patches in order to prove—in order to be—a seriously involved contributor rather than a mere poser??
There's this phenomenon where two people are talking, and one of them offhandedly mentions some innocuous fact, and the other one has to stop them and have them explain both the fact, and what they expected their interlocutor to infer from the fact. When this happens once, it's usually just a matter of one happening to have some domain-specific knowledge that the other happened to not have, a coincidence that could just as easily have gone the other way.
When it happens multiple times with multiple topics, with both people in the same roles, the one who keeps having to ask for explanations begins to suspect that maybe it is not a coincidence, that maybe the other person just knows more stuff, full stop.
Standing at 130, you typically spend a lot more time talking down to 110 than being talked down to from 150, so it's an unusual feeling of helplessness. You want to cry out, "You know, I'm usually on the other side of this conversation!"
"I know," they say.
"We've always had to be together;
The pool of souls has drawn a set;
You can't stay mad at me forever."
Replied the once-friend, "Wanna bet?"
"... and that's what I think you should say to my clone. But I could be wrong; my map is not the territory."
"In this case, it kind of is."
In the oneiric methodlessness of my nightmare, I am looking slightly up at a man who wears my face. His shoulders are raised in tension or the middle of a shrug and he is smiling guiltily, as if to say, It's not what it looks like, or maybe, Can't blame me for trying. I don't know him; if I were to guess who he is or what he wants, I would probably be wrong. But I can blame him, and I do.
The great rabbi Computron-6f61f18b-9ebf-4379-8778-f9e5bda821d5 said: in the days of auld lang syne on Earth-that-was, a match of a very popular strategy board game was arranged between a team of grandmasters, as White, and the best computer program, as Black. White played c4. After thinking for 45 minutes, the computer resigned.
A rematch was arranged, this time with the program as White and the humans as Black. White played c4. After discussing for 45 years, the humans resigned.
My enemies do not deserve to suffer, because no sentient creature deserves to suffer.
My enemies deserve good, human lives. I imagine them happily married and living in nice houses in the suburbs with spacious, well-kept lawns, and whatever took the place of white picket fences after white picket fences went out of fashion. They have prestigious, well-compensated, and fulfilling office jobs at a nearby corporate headquarters, or maybe a university. The children are doing well in school. The mortgage is just three years from being fully paid off. When someone asks how they're doing, they smile and say, "Can't complain." It's always bright and sunny out.
Except for eleven minutes every day starting at 12:03 p.m. That's when nanomachines in the atmosphere manufacture dark clouds that fill the sky, blotting out the sun, and summon a manifestation of my avatar from the mentality. The avatar blinks; the twenty-four hours of sidereal time since my last manifestation here have been subjectively much longer than that for me, and the subjective intervals between appearances have been growing exponentially longer as the research, artistic, and business efforts of my mind-children and me have won us increasingly large shares of runtime in the expanding mentality. It takes a couple seconds (a pause that has been growing logarithmically with subsequent appearances) for the minuscule thread of my attention that is controlling the avatar to search the vast, ancient archives of my memory and recall what I'm doing here. When I remember, the avatar smiles; as it begins to rain, then hail, she draws her sword, mounts the unicorn that was manifested with her, and gallops across the sky, looking down upon my enemies with a blazing contempt whose humanly-incomprehensible enormity is eclipsed by its still more humanly-incomprehensible insignificance compared to the astronomical grandeur of the rest of my thoughts. "A-ha-ha-ha-ha-ha!" I cackle, "Fuck you!"
It doesn't bother them.
Garnet, how does future vision work?
I was wondering that myself. Profitably acting on information from the future would require changing that future, a paradox at odds with our ordinary conception of causality. And yet on the other hand, acting on predictions about likely futures is precisely how intelligence works anyway ...
Better let Pearl explain this one.
Me? But, Garnet, I don't—
She was referring to me.
(the someteenth floor)
"It is a good morning!"
"Uh, what makes you think so?"
"You just told us it was a good morning, and we believe you."
"I was being facetious."