Joined

(Previously on Star Trek: An Algorithmic Lucidity.)

The morning of Thursday the eighth, before heading off to see the new LCSW at the multi-specialty clinic, I was idly rereading some of the early Closetspace strips, trying to read between the lines (as it were) using the enhanced perception granted by the world-shattering insight about how everything I've cared about for the past fourteen years turns out to be related in unexpected and terrifying ways that I can't talk about because I don't want to lose my cushy psychology professorship at Northwestern University. (Victoria tells Carrie, "Not to mention you don't think like one of 'them'"; ha ha, I wonder what that means!) When I got to the part where Carrie chooses a Maj. Kira costume to wear to the sci-fi convention, it occurred to me that in addition to having exactly the right body type to cosplay Pearl from Obnoxious Bad Decision Child, I also have exactly the right body type to cosplay Jadzia Dax from Star Trek: Deep Space Nine, on account of my being tall—well, actually I'm an inch shorter than Terry Farrell—thin, white, and having a dark ponytail.

Okay, not exactly the right body type. You know what I mean.

Your Periodic Reminder I

Aumann's agreement theorem should not be naïvely misinterpreted to mean that humans should directly try to agree with each other. Your fellow rationalists are merely subsets of reality that may or may not exhibit interesting correlations with other subsets of reality; you don't need to "agree" with them any more than you need to "agree" with an encyclopædia, photograph, pinecone, or rock.
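
(For reference, the standard statement of the theorem, paraphrased rather than quoted from anyone: if agents 1 and 2 have a common prior, and at some state it is common knowledge between them that agent 1's posterior probability for an event A is q_1 and agent 2's is q_2, then

\[ q_1 = q_2. \]

The theorem is about the consequences of common knowledge under a common prior, not an instruction to copy anyone's bottom line.)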

Fighting Game Ideas I

Évariste Galois vs. Aaron Burr

particularist special-snowflake fox vs. broad-brush dimensionality-reducing hedgehog

the pain of arguing with creationists vs. the pain of being a creationist and not understanding why those damned smug evolutionists won't even talk to you

Resisting the Narrative

Culture wars are a subtle thing to wage, because they determine everything without being about anything. Explicitly political contests are at least ostensibly about some particular concrete thing: you're fighting for or against a specific law or a specific candidate. But how do you fight a narrative, when your enemy is less of a regime and more of a meme? How do you explain to anyone what you're trying to accomplish when you're not trying to get anyone to do anything different in particular, but to renounce their distorted way of thinking and speaking, after which you expect them to make better decisions, even if you can't say in advance what those decisions will be?

Picture me rushing into a room. "People, people! The standard map is wrong! Look at this way better map I found in the literature; let's use this one!"

"Our map isn't wrong. It has all the same continents yours does."

"I mean, yes, but it's a Mercator projection. Surely you don't really think Antarctica is larger than Asia?"

"Why do you care what size Antarctica is? What difference does it make? People are perfectly happy with Antarctica being the largest continent."

"But it's not true!"

"It sounds like you're assuming your beliefs are true. What is truth, anyway?"

And it being the case that no one will die if she gets the size of Antarctica wrong, what can I say to that?
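
(The distortion in question is easy to quantify; this is standard cartography on my part, not anything my interlocutor conceded. The Mercator projection stretches lengths by a factor of \( \sec\varphi \) at latitude \( \varphi \), so areas are inflated by

\[ \sec^2\varphi, \]

which is about 8.5 at 70° and about 33 at 80°. Antarctica's actual area is roughly 14 million km², versus roughly 44 million km² for Asia.)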

Prescription II

that feel eighteen months post-Obergefell when you realize you missed your chance to be pro-civil-unions-with-all-the-same-legal-privileges but anti-calling-it-marriage while that position was still in the Overton window

(in keeping with the principle that it shouldn't be so exotic to want to protect people's freedom to do beautiful new things without necessarily thereby insisting on redefining existing words that already mean something else)

Alpha Gamma Phi

In the oneiric methodlessness of my daydream, my bros at ΑΓΦ are telling me that E is the best party drug and that I have to try it.

"I don't know, guys," I say.

"Nah, bro, you've got to try it!"

"Okay," I say, "just don't expect me to mentally rotate any 3D objects tomorrow."

Hiatus I

An Algorithmic Lucidity is going on hiatus until December 1! There will be no new posts for the remainder of October or in November. Thanks for reading, and hope to see you back in eight weeks!

The Parable of the Honest Man and the Thing

"I really want to do the thing! All of my friends who are just like me are doing the thing, and they look like they're having so much fun!"

"You can totally do the thing! You just have to sign ... this loyalty oath!"

(reading it) "What? I can't sign this. It's, it's—" (rising horror) "not scientifically accurate!"

"Everyone else who is doing the thing has signed the loyalty oath."

"Could I ... do the thing, without signing the loyalty oath?"

"You could, but everyone you ever interact with for the rest of your life will assume that you've signed the loyalty oath; it would take five hours for you to explain what you actually believe, but no one will listen to you for that long because they'll decide that you're a hateful lunatic thirty seconds in."

(A beat.)

"You know, honestly, my life is fine as it is. I don't need to do the thing. I'm glad my friends are having fun."

(dies of cardiac disease fifty years later without having done the thing)

(Earth is consumed in a self-replicating nanotechnology accident)

Concerns II

(Previously.)

"I'm concerned about the socially-undesirable implications of the correlations documented in these published studies, which seem consistent with my own observations and personal experience."

(studying them) "Hey! These correlation coefficients are not equal to one! In fact, all of them are substantially less than one! How dare you try to construct predictive models about how the world works, when you yourself admit that your model won't assign literally all of its probability mass to the exact outcome?!"

(in despair, as if realizing that the nature of reasoning as an adaptation for arguing with conspecifics in imperfectly-deceptive social organisms implies that no one can ever have a serious, grown-up conversation about anything important) "Just kill meeeeeeeeeee"
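
(A footnote on the arithmetic, mine and not either speaker's: a correlation well short of one still buys real predictive power. For jointly distributed X and Y with correlation \( \rho \), the best linear predictor of Y from X leaves residual variance

\[ \operatorname{Var}(Y - \hat{Y}) = (1 - \rho^2)\,\operatorname{Var}(Y), \]

so even \( \rho = 0.5 \) shaves a quarter off the variance you would otherwise be guessing under.)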

Wicked Transcendence II

went to the genderqueer support/discussion group at the Pacific Center again; showed up early to change into my Pearl dress (it would be a waste to only wear it once) and surreptitiously slip a copy of Anne Lawrence into the library

I think I mostly enjoy being the token conservative/TERF (um, relatively speaking); I say that my pronouns are he/him "because I don't perceive myself as having a choice in the matter" and probably smashed the record for most uses of the phrase biological sex at one of these

Bayesomasochism

Physical pain is the worst thing in the world, and the work of effective altruists will not be done until the last nociceptor falls silent and not a single moment of suffering remains to be computed across our entire future light cone.

But the emotional pain of discovering that your cherished belief is false, that everything you've ever cared about is not only utterly unattainable, but may in fact not even be coherent?—yeah, I'm pretty sadomasochistic about that. That's rationality; that's what it feels like to be alive.

RustConf 2016 Travelogue

(Previously on An Algorithmic Lucidity.)

(photo: sfo_reflections)

The other weekend, excited to learn more and connect with people about what's going on at the forefront of expressive, performant, data-race-free computing—and eager for a healthy diversion from the last two months of agonizing delirium induced by the world-shattering insight about how everything I've cared about for the past fourteen years turns out to be related in unexpected and terrifying ways that I can't talk about for reasons that I also can't talk about—I took Friday off from my dayjob and caught a Thursday night flight out of SFO to exotic Portland (... I, um, don't travel much) for RustConf!

The conference itself was on Saturday, but Friday featured special training sessions run by members of the Rust core team! I was registered for Niko Matsakis's afternoon session on lifetimes, but I arrived at the venue (the Luxury Collection Nines Hotel) early to check in (I had never seen socks as conference swag before!), hang out with folks, and get a little bit of coding done: my coolest Rust project so far is a chess engine that I wrote this time last year (feel free to go ahead and give it a Star!), which I wanted the option to show off (Option<ShowOff>) to other conference attendees, but the pretty web application frontend had broken, thanks to a recent bug and a rotted JavaScript build pipeline. I fixed it just in time for the lifetimes training session to start.
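
(For readers who don't write Rust, a minimal sketch of the pun; the ShowOff type here is hypothetical and has nothing to do with the actual chess engine's code. Option<T> is the standard-library enum that is Some(value) when you have a value and None when you don't.)

    struct ShowOff {
        project: &'static str,
    }

    // Whether I get to show off depends on whether the frontend still builds.
    fn maybe_demo(frontend_works: bool) -> Option<ShowOff> {
        if frontend_works {
            Some(ShowOff { project: "chess engine" })
        } else {
            None
        }
    }

    fn main() {
        match maybe_demo(true) {
            Some(demo) => println!("Let me show you my {}!", demo.project),
            None => println!("Maybe next conference."),
        }
    }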

The Roark–Quirrell Effect

Education increases altruism up to a point (as you increasingly understand that other people are real too and have moral value for the same reasons you do even if you don't experience it from the first person), until you accumulate so many seemingly unique insights that the entire rest of the world looks so abominably stupid that you no longer want to waste a single precious dollar or minute on the concerns of these creatures that can't even see the Really Obvious Thing.

(Or, maybe this is just a form of mental illness specific to high-psychoticism males that can be cured with the appropriate drugs. We'll find out!)

The World By Gaslight

In the oneiric methodlessness of my nightmare, I am a lieutenant commander posted to the Glomar Explorer; I am pacing the deck while opining that taking the correct, minority position in a scientific controversy necessarily feels just like early-onset dementia (which I can't help but notice makes a perfect pairing with a late-onset case of the other d------ia word).

Something is wrong with the ship's computer. Before I can figure out whether it has to do with HTTP Strict Transport Security or the Accelerated Graphics Port (it has to be one or the other), we sink, and I drown.