After learning a bit more about inductive and coinductive types from Robert Harper’s book in progress (which I found more lucid on the topic than Types and Programming Languages), I don’t quite understand why it’s important for the datatype to be coinductive. You can clearly encode the integers as inductive types, so why doesn’t that suffice for ‘explaining to a computer what a “terminating computation” is’? Are computers ‘naturally coinductive’ in some sense?
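To make my confusion concrete, here’s a rough sketch in Haskell (not from the book; and laziness blurs the inductive/coinductive distinction here, so treat it as notation rather than a faithful encoding) of the same declaration read both ways:

```haskell
-- Hypothetical sketch: one declaration, two readings.
data Nat = Z | S Nat

-- Read inductively: only finite values Z, S Z, S (S Z), ... exist.
three :: Nat
three = S (S (S Z))

-- Read coinductively (the "conaturals"): a possibly-infinite value is also allowed.
-- In a lazy language this definition is accepted and productive.
omega :: Nat
omega = S omega

-- Recursion over an inductive value terminates; over omega it need not.
toInt :: Nat -> Int
toInt Z     = 0
toInt (S n) = 1 + toInt n   -- fine on `three`, diverges on `omega`
```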
Actually, since computers are Turing complete, perhaps it’s more like ‘computers are naturally recursively typed’, and there is an analogous impossibility theorem for encoding some coinductive type in the same way? Perhaps I just don’t understand enough yet.