I’ve noticed people using formal logic/mathematical notation unnecessarily to make their arguments seem more “formal”: ∀x∈X(∃y∈Y|Q(x,y)), f:S→T, etc. Eliezer Yudkowsky even does this at some points in the original sequences. These symbols were pretty intimidating to me before I learned what they mean, and I imagine they would be confusing/intimidating to anyone without a mathematical background.
Though I’m a bit conflicted on this one: if the formal logic notation of a statement is shown alongside the English description, it could actually help people learn logic notation who wouldn’t have otherwise. But it shouldn’t be used as a replacement for the English description, especially for simple statements that are easily expressed in natural language. It often feels like people are trying to signal intellectualism at the expense of accessibility.
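As a minimal sketch of that side-by-side approach (the tidied quantifier form and the English gloss here are mine, purely to illustrate the pairing):

```latex
% A quantified statement typeset once in notation and once in
% plain English, so neither has to stand alone.
\documentclass{article}
\begin{document}

\[
  \forall x \in X \;\; \exists y \in Y \;\; Q(x, y)
\]

\noindent
In English: for every $x$ in $X$, there is at least one $y$ in $Y$
such that $Q(x, y)$ holds.

\end{document}
```

Presented like that, the symbols have a chance to teach the reader rather than filter them out.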
This is a pet peeve of mine. I remember, 20 years ago, as a wee boyston getting into the semi-mathy parts of programming, seeing all the dense notation and thinking “this must speak to the inherent complexity of the problem and must be the most natural representation!”
No!
I’ve become progressively more annoyed by it. I was reading a paper a week ago that enjoyed its notation a little too much: it took a while for me to realize what a particular equation was supposed to represent, despite the fact that I had previously implemented exactly what it represented from scratch.
To be clear, I don’t mind having a single symbol that means a very specific thing by strong convention. But sometimes you’ll see ϕ_{eθ,0} and τ_{wer} and sixteen other symbols, and then you look for a lookup table and there isn’t one, and then you scan 8 paragraphs to find the definitions of two of the symbols, and then you find the only reference to τ_{wer}, except it says “τ_{wer} is, by default, equivalent to τ_{ke},” and then you rub your temples. Or an author develops a custom notation that is maybe internally consistent, maybe not, and builds a giant edifice on it.
It’s code golf, except there’s no compiler and the author couldn’t test it and it sometimes has errors. Enjoy, reader!
I don’t think LessWrong is unusually bad about this, but I’m now sufficiently allergic to it that seeing excess notation does make me suspicious.
See also: Physics Envy.