A is not bad, because torturing a person and then restoring their initial state has precisely the same consequences as forging one’s own memory of torturing a person and restoring their initial state.
Well, it seems somewhat unfair to judge the decision on information not available to the decision-maker; however, I fail to see how that is an ‘implicit premise’.
I didn’t think the Geneva Convention was that old, and actually, updating on it makes Immerwahr’s decision score worse, due to a lower expected number of lives saved (through a lower chance of chemical weapons being used).
That said, roleplaying this update made me understand that in some value systems it’s worth it. Most likely, $E(\Delta\,\text{victims of Haber's war efforts}) > 1$.
Standing against unintended pandemics, atomic warfare, and other extinction-threatening events has been quite a good idea in retrospect. Those of us working on scientific advances should indeed ponder the consequences.
But the Immerwahr-Haber episode is just an unrelated tearjerker. Really, inventing a process for producing nitrogen fertilizers is so much more useful than shooting oneself in the heart. Also, chemical warfare turned out not to kill many people after WWI, so such a sacrifice is rather irrelevant.
“The universe is too complex for it to have been created randomly.”
You’re right. Exactly.
Unless there are on the order of $2^{K(\text{Universe})}$ universes, where $K$ denotes Kolmogorov complexity, the chance of it being constructed randomly is exceedingly low.
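To spell out the step (a standard algorithmic-probability sketch, added for clarity): if candidate universes are drawn as random programs, then

$$\Pr[\text{one draw yields } U] \approx 2^{-K(U)}, \qquad \Pr[\text{some draw among } N \text{ yields } U] \lesssim N \cdot 2^{-K(U)},$$

which stops being negligible only once $N \gtrsim 2^{K(U)}$.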
Please, do continue.
Are artificial neural networks really Turing-complete? Yep, they are [Siegelmann, Sontag 91]. The network in the paper has on the order of $10^5$ neurons, with rational edge weights, so it’s quite Kolmogorov-complex. This, however, doesn’t tell us whether we can build good machines for specific purposes.
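For intuition (a toy sketch of my own, not the Siegelmann–Sontag construction): a single threshold neuron already computes NAND, which is universal for boolean circuits; full Turing-completeness additionally needs unbounded state, which the paper extracts from rational-valued activations.

```haskell
-- Toy threshold neuron: fires iff the weighted input sum reaches the threshold.
neuron :: [Double] -> Double -> [Double] -> Bool
neuron weights threshold inputs =
  sum (zipWith (*) weights inputs) >= threshold

-- NAND from a single neuron: weights (-1, -1), threshold -1.5.
-- Only the input (1, 1) sums to -2 < -1.5, so only it yields False.
nand :: Bool -> Bool -> Bool
nand a b = neuron [-1, -1] (-1.5) [toD a, toD b]
  where toD x = if x then 1 else 0

main :: IO ()
main = print [nand a b | a <- [False, True], b <- [False, True]]
-- [True,True,True,False]
```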
Let’s figure out how to sort a dozen numbers with λ-calculus and with sorting networks. It is worth noting that the lambda expression is O(1) in size, whereas the sorting network is O(n (log n)^2).
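To make “sorting network” concrete, a minimal sketch (my example; the 5-comparator network for n = 4 is the standard optimal one from Knuth, TAOCP vol. 3). The network is a fixed, data-independent list of comparators, so its description must grow with n:

```haskell
import Data.List (foldl')

-- A sorting network is a fixed list of comparators (index pairs),
-- applied in order regardless of the data; its size is the list length.
type Network = [(Int, Int)]

-- One comparator: swap positions i and j if they are out of order.
comparator :: Ord a => [a] -> (Int, Int) -> [a]
comparator xs (i, j)
  | xs !! i <= xs !! j = xs
  | otherwise          = [pick k x | (k, x) <- zip [0 ..] xs]
  where
    pick k x
      | k == i    = xs !! j
      | k == j    = xs !! i
      | otherwise = x

runNetwork :: Ord a => Network -> [a] -> [a]
runNetwork = flip (foldl' comparator)

-- The optimal 5-comparator network for n = 4.
net4 :: Network
net4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]

main :: IO ()
main = print (runNetwork net4 [3, 1, 4, 2]) -- [1,2,3,4]
```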
Batcher’s odd–even mergesort would be O((log n)^2) levels deep, and given that one neuron is used to implement a comparator, each level admits n! possible wirings (around $2^{29}$ for $n = 12$). That we need some 200 bits of insight to sort a dozen numbers with that specific method does not mean there is no cheaper way to do it, but it sets a reasonable upper bound.
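Checking the per-level figure (my arithmetic, for a dozen inputs):

$$12! = 479\,001\,600, \qquad \log_2(12!) \approx 28.8,$$

so there are roughly $2^{29}$ possible wirings of a level, i.e. about 29 bits each.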
Apparently, I cannot do good lambda calculus, but it seems we can merge-sort Church-encoded numerals in fewer than a hundred lambda terms, which is about the same number of bits as the sorting network.
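The same point in Haskell rather than raw λ-calculus (my sketch; ordinary lists stand in for the Church encoding): the whole sorter is one small fixed term, independent of how many numbers it sorts.

```haskell
-- Mergesort as a fixed-size term: unlike the network above, the same
-- definition sorts a dozen numbers or a million.
msort :: Ord a => [a] -> [a]
msort []  = []
msort [x] = [x]
msort xs  = merge (msort left) (msort right)
  where (left, right) = splitAt (length xs `div` 2) xs

-- Merge two sorted lists into one sorted list.
merge :: Ord a => [a] -> [a] -> [a]
merge [] ys = ys
merge xs [] = xs
merge (x : xs) (y : ys)
  | x <= y    = x : merge xs (y : ys)
  | otherwise = y : merge (x : xs) ys

main :: IO ()
main = print (msort [10, 3, 7, 1, 12, 5, 8, 2, 11, 4, 9, 6])
```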
On a second note: how are Bayesian networks different from perceptrons, except for having no thresholds?