Another way to think about probabilities of 0 and 1 is in terms of code length.
Shannon told us that if we know the probability distribution of a stream of symbols, then the optimal code length (in bits) for a symbol X is:
l(X) = -log2 p(X)
If you think an event has zero probability, then there’s no point in assigning a code to it (codespace is a conserved quantity: if you want short codes, you can’t afford to waste space on events that never happen). But if you think the event has zero probability and then it happens, you’ve got a problem: a system crash, or something like it.
Likewise, if you think an event has probability one, there’s no point in sending ANY bits. The receiver also knows the event is certain, so they can just insert the symbol into the stream without being told anything (this could happen in a symbol stream where three As are always followed by a fourth). But again, if you think the event is certain and it turns out not to be, you’ve got a problem: there is no code for the symbol that actually occurred, so you have no way to tell the receiver what happened.
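A quick sketch of what the formula does at those extremes (the distribution here is made up for illustration):

```python
import math

# Ideal code lengths, l(X) = -log2 p(X), for a toy distribution.
# "D" is a symbol we've decided is impossible.
probs = {"A": 0.5, "B": 0.25, "C": 0.25, "D": 0.0}

for symbol, p in probs.items():
    if p == 0.0:
        length = math.inf   # no codeword at all: nothing to send if D occurs
    elif p == 1.0:
        length = 0.0        # certain symbol: send zero bits
    else:
        length = -math.log2(p)
    print(f"{symbol}: p = {p}, ideal code length = {length} bits")

# A: 1 bit, B: 2 bits, C: 2 bits, D: inf.
# An encoder built from these probabilities literally has no output
# for D, which is the "system crash" case above.
```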
If you refuse to assign zero or unity probabilities to events, then you have a strong guarantee that you will always be able to encode the symbols that actually appear. You might not get good code lengths, but you’ll be able to send your message. So Eliezer’s stance can be interpreted as an insistence on making sure there is a code for every symbol sequence, regardless of whether that sequence appears to be impossible.
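Here’s a minimal sketch of what that insistence looks like in practice: Laplace (add-alpha) smoothing of the observed symbol counts, which keeps every estimated probability strictly between 0 and 1 (the counts and the helper name are invented for illustration):

```python
import math

def smoothed_code_lengths(counts, alpha=1.0):
    """Laplace-smoothed estimates: no symbol gets p = 0 or p = 1,
    so every symbol in the alphabet gets a finite, nonzero code length."""
    total = sum(counts.values()) + alpha * len(counts)
    return {s: -math.log2((c + alpha) / total) for s, c in counts.items()}

# "D" was never observed, yet it still gets a (long) codeword.
counts = {"A": 50, "B": 25, "C": 25, "D": 0}
for symbol, bits in smoothed_code_lengths(counts).items():
    print(f"{symbol}: {bits:.2f} bits")

# A ~1.03 bits, B and C exactly 2.00 bits, D ~6.70 bits: the codes for
# the common symbols barely grow, and the "impossible" symbol is covered.
```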
But then, do you really want to build a binary transmitter that is prepared to handle not only sequences of 0 and 1, but also the occasional “zebrafish” and “Thursday” (imagine somehow fitting these into an electrical signal, or don’t, because the whole point is that it can’t be done)? Such a transmitter is enormously more complex, all to handle signals that, well… won’t ever happen. I guess you could say the probability is low enough that the expected cost of handling it isn’t worth paying. But what about the chance that a “zebrafish” in the launch codes will wipe out humanity? Surely that expected utility cannot be ignored? (Except it can!)
Umm, it’s a real thing: ECC memory. https://en.m.wikipedia.org/wiki/ECC_memory
I’m sure it isn’t 100% foolproof (coincidentally, the point of this article), but I imagine it reduces error probability by orders of magnitude.
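For a sense of how that works, here’s a toy single-error-correcting code from the same family ECC memory uses. This is an illustrative sketch of Hamming(7,4), not what actual ECC modules run; real DIMMs use wider codes over 64-bit words:

```python
import random

def hamming74_encode(d):
    """Encode 4 data bits (list of 0/1) into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over positions 5, 6, 7
    # positions:  1   2   3   4   5   6   7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct a single flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # check over positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # check over positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # check over positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3  # 0 = no error, else 1-based error position
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1         # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
corrupted = codeword[:]
corrupted[random.randrange(7)] ^= 1          # flip one random bit in transit
assert hamming74_decode(corrupted) == data   # still recovered exactly
```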