Schroedinger’s assassins


“So it turns out there’s a nonlinearity in quantum mechanics.”
“There are microtubules in the brain that couple strongly to the square of the probability amplitude”
“But only for two level systems, for some reason”
“Yes! The theorist explained it to me – something to do with Grassmann algebras and the correspondence principle”
“Yes – so we cooked up a Schroedinger’s cat experiment”
“We should have put it through IRB – but we were afraid they’d say…”
“Exactly. But when we set it up right we can keep the cat alive for thousands of runs by concentrating on that outcome.”
“Well, it takes some learning, and when the theorists do it the cats always die, but it’s a ten sigma effect at this stage.”
“So we’ve been working on the coupling.”
“Yes, we think about it in the Bohmian framework, so we learn to change the quantum potential for a particular subsystem, where we can choose the subsystem”
“Like a single spin you mean?”
“Yes – but actually it only seems to work if the binary choice is alive/dead”
“Yes, those branches in the multiverse are the only ones where the coupling’s strong enough”
“I thought you said you were Bohmians”
“I mean, whatever works. It’s not a cult, it’s an experiment”
“Sounds like a cult. Sounds like a death cult.”
“Well, yeah, the only way to get a positive result is to choose the branch where the subject dies – otherwise, you know, you can’t write the paper saying “and the rabbit kept living”.”
“I thought you were doing cats?”
“Yes, but for the long distance effects we’ve been targeting the rabbits over in the biology department.”
“How do you know what’s happening?”
“Oh, they’re on a webcam so PETA don’t come. We just set things up and then watch the webcam feed afterwards.”
“And you can choose which bunny dies?”
“Yeah, that’s like 10 sigma too. Except there’s something else.”
“Well, the average number of deaths doesn’t change”
“What do you mean the average number of deaths? How do these bunnies usually die?”
“Oh, they’re all genetically engineered to get cancer. They die like ten a day.”
“Yes. But if we kill ten in a day, that’s it. No more. And if we kill five a day then only about five die of cancer. It’s totally consistent with there being an expected number of deaths per day and we just change which ones die.”
“The ones you focus on.”
“Exactly, the ones we couple to.”
“Well then it got a bit interesting.”
“Only then?”
“See, we didn’t know this, but Eve, one of the grad students, has a boyfriend, Fred, who’s really sick”
“Yeah, like, should have died six months ago sick.”
“And then we noticed that there were a lot of accidental deaths happening on or near campus”
“And that these were when Fred was taking a turn for the worse, and then he would pull through.”
“Yes. So we confronted Eve”
“Turns out Eve is a solipsist.”
“Yes. That’s why she won’t TA.”
“Makes sense.”
“So that day another accident happened.”
“Yes. You know that guy with the off-campus lab?”
“The climate denier?”
“Exactly. Well, he drowned.”
“In the river?”
“No, in his lab. He studies… well, it’s gross, but let’s just say he drowned in a bath of something you wouldn’t want to step in.”
“Couldn’t happen to a nicer guy.”
“That’s what Eve said. And that she didn’t understand why she imagined that guy in the first place.”
“So you’re saying the multiverse has a fixed number of accidental deaths every day.”
“And you can choose who has those accidents.”
“And if we choose Alice to die, then Bob lives, so we don’t change the total number”
“Yes - so there’s no moral problem – it’s just this accident or that accident.”
“So then we made a list”
“You can probably guess who’s at the top”
“So after that guy we kept going”
“Those brothers?”
“That media guy? The one that owns all the news?”
“You’re not really thinking about doing this.”
“Well, we made a list.”
“But what are the arguments against it?”
“Violence breeds violence.”
“No. These would just be accidents. Just these people instead of someone else.”
“What about what Gandalf said.”
“You know: many die who deserve to live, many live who deserve to die. If you can’t give life to those who die don’t give death…”
“Yeah, but we’re giving life to someone and taking it from someone else.”
“Hard to argue that someone deserves to die more than the people on our list.”
“You mean we cut off the possibility that a bunch of evil octogenarians have a road to Damascus and change the world for the better?”
“Well, when you say it like that it sounds stupid.”
“Butterfly effect?”
“You mean the world might actually be a better place leaving these guys in it?”
“Now you’re just being sarcastic.”
“What about the Hitler argument?”
“This is the Hitler argument – Hitler would definitely make the list.”
“No, you know, the fact that Hitler was a terrible strategist and killing him would have prolonged the war.”
“You mean there are worse people waiting in the wings?”
“Yeah. There aren’t. We checked. Turns out everyone on the list surrounds themselves with sycophantic incompetents.”
“What about the, you know, peril to your immortal soul?”
“Well, Eve’s gonna actually do it.”
“And she’s a solipsist…so..”
“Right – so the question is – should we do it?”
“Sounds like you decided already.”
“No. We only decided that you should decide.”
“You just have to decide: would the world be better off if these people had tragic accidents instead of some other people.”
“I see.”
“There are only two possibilities: yes or no.”
…someone is typing.

About the Author: 
Peter Love is Associate Professor of Physics at Tufts University.