Has anybody else wished that the value of the symbol pi were doubled? It would be far more intuitive that way; it might even improve the uptake of trigonometry in school. It ranks up there with the decision to declare the electron’s charge negative rather than positive.
I read an argument to that effect on the Internet, but I don’t have any strong feelings—maybe if I were writing a philosophical conlang I would make the change, but not normally. You may as well argue for base four arithmetic.
Huh. Would that actually be easier? I always figured ten fingers...
I figure each finger can be up or down, 2 states, so binary. And then base 16 is just assigning symbols to sequences of 4 binary digits, a good, manageable compression for speaking and writing.
(When I say I could count something on one hand, it means there are up to 31 of them.)
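A quick sketch of the finger-binary scheme above (the finger ordering here, thumb as the low bit, is my own arbitrary choice):

```python
# Finger binary: each of 5 fingers is one bit, so one hand spans 0..31.
def fingers_to_number(fingers):
    """fingers: list of 5 booleans, thumb first (an arbitrary convention)."""
    return sum(1 << i for i, up in enumerate(fingers) if up)

# All five fingers raised is the one-hand maximum:
assert fingers_to_number([True] * 5) == 31
# Only the thumb raised:
assert fingers_to_number([True, False, False, False, False]) == 1
```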
Fewer symbols to memorize.
Smaller multiplication table to memorize.
Direct compatibility with binary computers.
The cost in number length is not large (3*10^8 is roughly 1*4^14), and the cost in factorization likewise: divisibility by 2, 3, and 5 remains simple; only 11 becomes difficult.
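A quick numerical check of the length and divisibility claims above (the conversion helper is mine):

```python
def to_base(n, b):
    """Digits of n in base b, most significant first."""
    digits = []
    while n:
        digits.append(n % b)
        n //= b
    return digits[::-1] or [0]

d = to_base(3 * 10**8, 4)
assert d[0] == 1 and len(d) == 15   # 3*10^8 ~ 1*4^14, i.e. 15 base-4 digits

# Divisibility stays easy in base 4: by 2 via the last digit, by 3 via the
# digit sum (since 3 = 4 - 1), by 5 via the alternating sum (since 5 = 4 + 1).
assert (sum(to_base(1233, 4)) % 3 == 0) == (1233 % 3 == 0)
```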
If you want to argue from number of fingers, though, six beats ten. ;)
I could see eight, but why six?
Six works because you don’t need a figure (a digit) for the base itself: count zero to five fingers on one hand, then drop all five and raise one on the other to make six. (Plus, you get easy divisibility by seven, which beats easy divisibility by eleven.)
Edit: Binary, the logical extension of the above principle, has the problem that the ring finger and pinky have a mechanical connection, besides the obvious 132 (decimal) issue. ;)
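A sketch of that two-hand base-6 scheme (the encoding is my reading of the comment):

```python
# Two hands in base 6: each hand shows 0..5 raised fingers;
# one hand is the sixes digit, the other the units digit.
def hands_to_number(sixes, units):
    assert 0 <= sixes <= 5 and 0 <= units <= 5
    return 6 * sixes + units

assert hands_to_number(0, 5) == 5    # five fingers on one hand
assert hands_to_number(1, 0) == 6    # drop all five, raise one on the other
assert hands_to_number(5, 5) == 35   # the two-hand maximum
```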
I don’t see how eight comes in, though.
Eight would be if you counted your fingers with the thumb of the same hand.
I see—I count by raising fingers, so that method didn’t occur to me.
There are websites dedicated to making base 12 the standard. Same principle as the case for base 6.
Nature’s Numbers
Dozenal Society
Simplest explanation: it’s possible to divide 12 up into more whole fractions than the number 10.
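Counting divisors makes that concrete:

```python
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

assert divisors(12) == [1, 2, 3, 4, 6, 12]   # six even ways to split 12
assert divisors(10) == [1, 2, 5, 10]         # only four for 10
```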
I don’t see myself with ten fingers as a posthuman anyway.
e^(pi*i) = −1
Anything else: lame.
Uh, how is e^(pi*i) = 1 lame?
Maybe because e^0 = 1?
Well, making pi = 2pi would just mean the complex exponential function would repeat itself every pi radians instead of every 2pi radians. e^0 would still equal 1 in either case. Note that under the current definition, e^(j*2*pi*n) = 1 for any integer n.
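Under the current convention this is easy to check numerically:

```python
import cmath

# e^(i*2*pi*n) = 1 for any integer n; the period of e^(i*theta) is 2*pi.
for n in range(-3, 4):
    z = cmath.exp(1j * 2 * cmath.pi * n)
    assert abs(z - 1) < 1e-9

# Halfway through the period we are at -1, not back at 1:
assert abs(cmath.exp(1j * cmath.pi) + 1) < 1e-12
```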
e^(2*Pi*i) − 1 = 0. Hah. I fit in more numbers.
No. This is nowhere near like the metric vs. English units debate. (If you want to talk about changing conventions, you should put your weight behind that one instead, as it’s much more of a serious issue.) Pi is already well defined, anyway. It’s defined according to its historical, contextual meaning, regarding the diameter, for which the factor of 2 does not appear.
Pi is well-defined, yes, and that’s not going to change. But some notation is better than others. It would be better notation if we had a symbol that meant 2pi, and not necessarily any symbol that meant pi, because the number 2pi is just usually more relevant. There’s all sorts of notation we have that is perfectly well-defined, purely mathematical, not dependent on any system of units, but is not optimal for making things intuitive and easy to read, write and generally process. The gamma function is another good example.
I really fail to see why metric vs. english units is much more serious; neither metric nor english units is particularly suggestive of anything these days. Neither is more natural. The quantities being measured with them aren’t going to be nice clean numbers like pi/2, they’re going to be messy no matter what system of units you measure them with.
What about the gamma function is bad? Is it the offset relation to the factorial?
Yeah. It’s artificially introduced (why the s−1 power?) and is basically just confusing. The gamma function isn’t really something I’ve had reason to use myself, so I’m just going on the fact that I’ve heard lots of people complain about this and never heard anyone defend it, to conclude that it really is as dumb as it looks.
The t^(s−1) in the gamma function should be thought of as the product t^s · (dt/t). This is a standard part of the Mellin transform. The dt/t is invariant under multiplication, which is a sensible thing to ask for, since the domain of integration (0, infinity) is preserved by scaling but not by the translations that preserve dt.
In other words, dt/t = d(log t) and it’s telling you to change variables: the gamma function is the Laplace (or Fourier) transform of exp(-exp(u)).
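Spelled out, the substitution t = e^u turns the multiplicatively invariant measure into the ordinary one:

```latex
\Gamma(s) \;=\; \int_0^\infty t^{s}\, e^{-t}\, \frac{dt}{t}
         \;=\; \int_{-\infty}^{\infty} e^{su}\, e^{-e^{u}}\, du ,
\qquad t = e^{u},\quad \frac{dt}{t} = du .
```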
http://www.math.utah.edu/~palais/pi.pdf
One can dream. :) Pi relates to diameter; it’d be much nicer if it related to radius directly instead.
Personally, I want to replace the kg in the mks system with a new symbol and name: I want to go back to calling it the “grave” (as it was called at one time in France), having the symbol capital gamma. Then we wouldn’t have the annoying fact of a prefixed unit as a basic unit of the system.
Embarrassingly, my first reaction was to think, “how about cgs units? Those don’t use kilograms!”
Hehehe. Cgs units… it really amuses me that it seems to be astronomers who like them best.
Of course, if we were really uber-cool, we’d use natural units, but somehow I can’t see Kirstie Alley going on TV talking about how she lost 460 million Planck-masses on Jenny.
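For scale, a rough conversion (the Planck-mass value here, about 2.176×10⁻⁸ kg, is a constant I’m supplying, not from the thread):

```python
PLANCK_MASS_KG = 2.176e-8  # approximate Planck mass in kilograms

def kg_to_planck_masses(kg):
    return kg / PLANCK_MASS_KG

# A 10 kg (~22 lb) weight loss really is about 460 million Planck masses:
assert abs(kg_to_planck_masses(10) - 4.6e8) / 4.6e8 < 0.01
```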
Definitely. 2pi appears so much more often than pi.
Meh. 2 Pi shows up a lot, but so does Pi, and so does Pi/2. I think I’d rather cut it in half, actually, as fractions are more painful than integer multiples.
Think about the context here, though. Having a symbol for 2pi would be much more convenient because it would make things consistent. 2pi is the number that you typically cut into fractions. Let’s say we define rho to mean 2pi. Then we have rho, rho/2, rho/3, rho/4… whereas with pi, we have 2pi, 2pi/2, 2pi/3, 2pi/4… the problem is those even numbers. Writing 2pi/4 looks ugly, so you want to simplify, but writing pi/2 means that you no longer see the number “4” there, which is what’s important: that it’s a quarter of 2pi. You see the “2” on the bottom, so you think it’s half of 2pi. It’s a mistake everyone makes every now and then: seeing pi/n and thinking it’s 2pi/n. If we just had a symbol for 2pi, this wouldn’t occur. Other mistakes would, sure, but as commonly as this one does?
If we were to define, say, xi=pi/2, then 4xi, 2xi, 4xi/3, xi, 4xi/5… well, that’s just awful.
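As an aside, Python’s standard library now has exactly this constant (math.tau = 2*pi), which makes the quarter-circle point easy to demonstrate:

```python
import math

assert math.isclose(math.tau, 2 * math.pi)

# A quarter turn: the denominator says "quarter" directly with tau...
quarter = math.tau / 4
# ...but with pi the same angle reads as a half of something:
assert math.isclose(quarter, math.pi / 2)
assert math.isclose(math.sin(quarter), 1.0)  # sine of a quarter turn is 1
```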
What? Like who, 6th graders?
I find that unfair. I have made the mistake Sniffnoy describes many times, all of them after I was in 6th grade.
Easy solution. Pi is half a circle. Pie is the whole one. Then there is a smooth transition from grade 3 to university.
I’ve been looking for a good thing to call 2*Pi—this might cut it.
Nice one! ;)
No, like anyone who isn’t watching out for traps caused by bad notation. It’s much easier to copy down numbers than it is to alter them appropriately. If you see “e^(pi i/3)”, what stands out is the 3 in the denominator. Except oops, pi actually only means half a circle, so this is a sixth root of unity, not a third one. Part of why I like to just write zeta_n instead of e^(2pi i/n). Sure, this can be avoided with a bit of thought, but thought shouldn’t be required here; notation that forces you to think about something so trivial is not good notation.
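The trap is easy to exhibit numerically (zeta_n here means e^(2*pi*i/n), as in the comment):

```python
import cmath

z = cmath.exp(1j * cmath.pi / 3)   # the denominator makes it look like "a third"
# ...but it is a sixth root of unity, not a third:
assert abs(z**6 - 1) < 1e-12
assert abs(z**3 - 1) > 1           # z**3 is actually -1, not 1

zeta_3 = cmath.exp(2j * cmath.pi / 3)  # the genuine primitive cube root
assert abs(zeta_3**3 - 1) < 1e-12
```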
omega_n is the notation I most often run across.
Hm, I’ve generally just seen omega for zeta_3.
I’ve certainly used it for that, but I pattern it with dropping the subscript n when it is clear that there is only one particular root of unity we’re basing off of. I’ve never seen zeta used.
Pi/3 shows up a lot as well. If you halve pi, then you’d have to write that as 2*pi/3, which is more irritating still.