I used to think that the conventions mathematicians use were forced to be the best possible, shaped by the requirements of doing things properly in order to advance in maths. But then the Tau Manifesto showed me I was wrong.
I think you’re right about cosine. Sine seemed simpler when it was named back in classical times, but once complex numbers were discovered and their relationship to the trigonometric functions was worked out, cosine turned out to be the simpler one.
Here’s one I come across as a programmer: which number is better for starting indexing and counting things with, zero or one? Zero is so much better for calculating with relative indexes; you get fewer off-by-one errors. In maths, the default convention is to number things starting at one. But when working with series (arithmetic series, discrete Fourier transforms, Maclaurin series, e.g. the series that sums to e) the convention is to start at zero.
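To illustrate both points, here’s a small sketch (hypothetical function names, just for demonstration): zero-based indices keep relative-index arithmetic free of correction terms, and the Maclaurin series for e naturally starts its index at zero.

```python
import math

# Flattening a 2D grid position into a 1D array offset.
# With zero-based indices the formula has no correction terms:
def offset_zero_based(row, col, width):
    return row * width + col          # rows 0..h-1, cols 0..w-1

# With one-based indices the same formula needs off-by-one fixups:
def offset_one_based(row, col, width):
    return (row - 1) * width + (col - 1) + 1   # rows 1..h, cols 1..w

# Same cell, expressed in each convention:
assert offset_zero_based(2, 3, 10) == 23
assert offset_one_based(3, 4, 10) == 24

# The Maclaurin series that sums to e starts at n = 0:
#   e = sum over n >= 0 of 1/n!
e_approx = sum(1 / math.factorial(n) for n in range(20))
assert abs(e_approx - math.e) < 1e-12
```

The extra `- 1`/`+ 1` terms in the one-based version are exactly where off-by-one errors creep in.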