Bayeswatch 4: Mousetrap
Miriam chucked the replica Salvator Mundi into the bonfire.
“We do a lot of shooting first and asking questions never,” said Vi.
“Personnel are expensive. AIs are replaceable. We have standard operating procedures. It wasn’t always this way. AIs used to be rare. Knowledge was precious in those early days. We didn’t know what the machines could and couldn’t do,” said Miriam.
“You were a founder of Bayeswatch?” said Vi.
“I am not that old,” said Miriam.
Miriam paused the holocaust for a moment to examine an oil painting of a vampire chained to a solar execution chamber. She tossed it into the fire.
“They had yet to standardize the architectures back then. There were overhangs all over the place. Using explicit error-entropy functions wasn’t even standard industry practice. Instead they just used the implicit priors of whatever architecture got results quickly,” said Miriam.
“That’s like making gunpowder without atomic theory,” said Vi.
“Or medicine without chemistry. Those early machines were…” Miriam trailed off.
Vi tossed a painting of a dodo tree into the fire.
“One of my first missions…it was my mentor’s last. We were dispatched to explore a small compound with signs of unaligned activity,” said Miriam.
“That’s suicide. What was command thinking?” said Vi.
“It was the early days. Singularity breakout could have been just around the corner,” said Miriam.
“But drones—” said Vi.
“Software back then was written by human beings. It had more security holes than features. Sending a drone to investigate a misaligned AI was like sending a set of lockpicks to investigate whether a magician has broken out of his cell,” said Miriam.
Miriam wore lots of foundation and concealer on her face. Vi wondered how many scars it covered up.
“We investigated a compound in the mountains of California. Kind of reminded me of Ex Machina,” said Miriam.
“Was it owned by a billionaire?” said Vi.
“In your dreams. You read too many romance novels. The guy wasn’t not-rich. He was an early employee of a moderately-successful software startup. I guess he built the error-entropy minimizer himself. To this day I am unsure what the thing was supposed to do. It was dead by the time we showed up,” said Miriam.
“Dead?” said Vi.
“Poetic license. My point is we weren’t dealing with an active threat. The inventor turned on his machine. It carried out its mission. It turned itself off. The end,” said Miriam.
“There’s obviously more to it. Otherwise you wouldn’t be telling me this story,” said Vi.
“We did passive scans. No sound or electrical activity. The compound had been built around a courtyard. We entered the courtyard by climbing over the compound wall,” said Miriam.
Vi had long since stopped noticing the works of art. Her hands continued on autopilot.
“The courtyard had been… Here. Let me show you a picture,” said Miriam.
Miriam handed her phone to Vi. The photo was poor quality; it had been taken with a cell phone camera.
“What is that?” said Vi.
The photo looked fake. It was like a Zen garden grown out of 3D-printed crystals. Plants had once formed part of the fractal but they had subsequently misbehaved. The original vision lingered only in the inorganic bits. Vi identified hints of higher-order patterns but most of the symmetries lurked beyond her conscious comprehension.
Vi had dated a grad student studying physics. His professor once allowed him a single index card full of handwritten equations to use on a test. He packed the index card with equations and diagrams so concise they were almost encrypted. The garden reminded Vi of that index card.
“It’s the most beautiful thing I’ve ever seen,” said Vi.
Miriam nodded.
“Why don’t we make AI art like that anymore?” said Vi. She threw a knockoff Picasso into the fire.
Miriam laughed. “It’s like the crested black macaque selfie. Humans never made it in the first place.”
“Ah. I see. After the error-entropy machine completed its task, it set about optimizing its environment to conform to its sense of beauty. But can’t we deliberately program a machine to do that? It’s not like the technology has disappeared,” said Vi.
“We can’t copy it exactly because the machine erased its own source code. But that garden came from a very particular configuration of good priors and resource constraints. Good priors are dangerous. Priors that good…we’re lucky it didn’t turn the universe into paperclips,” said Miriam.
“Great art comes from tortured people. Torturing a superintelligence is dangerous,” said Vi.
“You can’t torture an AI,” said Miriam.
“Poetic license. Besides, good artists are sadists. It’s hard to make an AI both safe and sadistic at the same time. You also can’t give it real world resource constraints in a simulation,” said Vi.
“Bayeswatch is expensive. Our tools cost money. Agents die in the line of duty. There is collateral damage. But the greatest casualty of regulation is novel machines. If that AI was built today it would be decommissioned before it got to optimize its environment,” said Miriam.
“Did you find anything else in the compound?” said Vi.
“A few deactivated robots. A server rack overwritten with random data. The corpse of the engineer. I think he died of natural causes. We left through the front door. Well, we tried to. As we opened it an M18A1 Claymore anti-personnel mine activated. My mentor took the brunt of the blast,” said Miriam.
“Why did the AI want to kill you?” said Vi.
“It didn’t care about us. It set up the booby trap to protect the garden while it was under construction. When the garden was finished it just didn’t bother to disable it,” said Miriam.
Vi’s hands stopped. There were no more paintings to destroy.