I think that one of the difficulties inherent in monotonic logics comes from the fact that real numbers are not very good at representing continuous things. To define a single point, an infinite number of digits is needed, and thus an infinite amount of information. Mathematicians often ignore this. To them, using the symbol 2 to represent a continuous quantity is the same as using the symbol 2.000…, which seems to give rise to all kinds of weird paradoxes caused by the use of (often implied) infinite digits. For example, logicians seem unable to make a distinction between 1.999… and 2 (where they take 2 to mean 2.000…): two different definable decimal expansions end up naming the same point.
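For the record, the identification that bothers me here is just the standard geometric-series argument:

```latex
1.999\ldots \;=\; 1 + \sum_{k=1}^{\infty} \frac{9}{10^{k}}
            \;=\; 1 + 9 \cdot \frac{1/10}{1 - 1/10}
            \;=\; 1 + 1 \;=\; 2
```

So classically the two expansions are not two numbers at all, only two names for one point; the collapse happens precisely because infinitely many digits are admitted.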
When using real numbers to represent continuous values, I often wonder whether we shouldn't always be using the number of digits to represent some kind of uncertainty. Significant digits are one of the first things students learn in university; they are crucial for real-world experiments, because they let us quantify the uncertainty in the digits we write down. Yet mathematicians and logicians seem to ignore them in favor of paradoxical infinities. I wonder whether, by using uncertainty in this way (a sketch of what I mean follows below), we might not do away with Gödel's theorem and define arithmetic within a certain amount of relative uncertainty inherent to our measuring instruments and reasoning machinery.
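As a rough sketch of what "digits as uncertainty" could mean computationally, here is a small interval-arithmetic toy in Python. The `Interval` class and its names are my own illustration, not an established library: a numeral with finitely many significant digits is read as the set of reals that round to it, and arithmetic propagates the width.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A continuous quantity known only to finite precision:
    a pair of bounds rather than an (infinitely precise) point."""
    lo: float
    hi: float

    @classmethod
    def from_digits(cls, value: float, places: int) -> "Interval":
        # Writing "2.00" (two decimal places) does not name a point; it
        # names every real that rounds to 2.00, i.e. [1.995, 2.005].
        h = 0.5 * 10.0 ** (-places)
        return cls(value - h, value + h)

    def __add__(self, other: "Interval") -> "Interval":
        # Interval addition: the widths (uncertainties) accumulate.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __contains__(self, x: float) -> bool:
        return self.lo <= x <= self.hi

two = Interval.from_digits(2.00, places=2)   # "2.00" -> [1.995, 2.005]
print(2.0 in two)        # True
print(1.9999999 in two)  # True: at finite precision the two coincide
print(two + Interval.from_digits(0.1, places=1))  # roughly [2.045, 2.155]
```

On this reading, the 1.999… versus 2 puzzle dissolves rather than being resolved: any finitely written numeral denotes a whole interval, and both expansions fall inside it.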