The “best” mathematically-informed topics?
Recently, I asked LessWrong about the important math of rationality. I found the responses extremely helpful, but on reflection, I think there’s a better approach.
I come from a new-age-y background. As such, I hear a lot about “quantum physics.”
Accordingly, I have developed a heuristic that I have found broadly useful: If a field involves math, and you cannot do the math, you are not qualified to comment on that field. If you can’t solve the Schrödinger equation, I discount whatever you may say about what quantum physics reveals about reality.
Instead of asking which fields of math are “necessary” (or useful) to “rationality,” I think it’s more productive to ask, “what key questions or ideas, involving math, would I like to understand?” Instead of going out of my way to learn the math that I predict will be useful, I’ll just embark on trying to understand the problems that I’m learning the math for, working backwards to figure out what math I need for any particular problem. This has the advantage of never causing me to waste time on extraneous topics: I’ll come to understand best the concepts I need most frequently, because I’ll encounter them most frequently (for instance, I think I’ll quickly realize that I need a solid understanding of calculus, and so study calculus, but there may be parts of math that don’t crop up much, so I’ll effectively skip those). While I usually appreciate the aesthetic beauty of abstract math, I think this sort of approach will also help keep me focused and motivated. Note that at this point, I’m trying to fill in the gaps in my understanding and attain “mathematical literacy” rather than a complete and comprehensive mathematical understanding (a worthy goal that I would like to pursue, but which is of lesser priority to me).
I think even a cursory familiarity with these subjects is likely to be very useful: when someone mentions, say, an economic concept, I suspect that even just vaguely remembering having solved a basic version of the problem will give me significant insight into what the person is talking about, instead of leaving me with a hand-wavy, non-mathematical conception.
Eliezer said in The Simple Math of Everything:
It seems to me that there’s a substantial advantage in knowing the drop-dead basic fundamental embarrassingly simple mathematics in as many different subjects as you can manage. Not, necessarily, the high-falutin’ complicated damn math that appears in the latest journal articles. Not unless you plan to become a professional in the field. But for people who can read calculus, and sometimes just plain algebra, the drop-dead basic mathematics of a field may not take that long to learn. And it’s likely to change your outlook on life more than the math-free popularizations or the highly technical math.
(Does anyone with more experience than me foresee problems with this approach? Has this been tried before? How did it work?)
So, I’m asking you: what are some mathematically-founded concepts that are worth learning? Feel free to suggest things for their practical utility or their philosophical insight. Keep in mind that there is a relevant cost-benefit analysis to consider: there are some concepts that are really cool to understand, but require many levels of math to get to. (I think after people have responded here, I’ll put out another post for people to vote on a good order to study these things, starting with those topics that have the minimal required mathematical foundation and working up to the complex higher-level topics that require calculus, linear algebra, matrices, and analysis.)
These are some things that interest me:
- The math of natural selection and evolution
- The Schrödinger equation
- The math governing the dynamics of political elections
- Basic optimization problems of economics? Other things from economics? (I don’t know much about these. Are they interesting? Useful?)
- The basic math of neural networks (or “the differential equations for gradient descent in a non-recurrent multilayer network with sigmoid units”) (Eliezer says it’s simpler than it sounds, but he was also a literal child prodigy, so I don’t know how much that counts for.)
- Basic statistics
- Whatever the foundations of bayesianism are
- Information theory?
- Decision theory
- Game theory (does this even involve math?)
- Probability theory
- Things from physics? (While I like physics, I don’t think learning more of it would significantly improve my understanding of the macro-level processes that would impact my decisions. It’s not as interesting to me as some of the other things on this list, right now. Tell me if I’m wrong or what particular sub-fields of physics are most worthwhile.)
- Some common computer science algorithms (What are these?)
- The math that makes reddit work?
- Is there a math of sociology?
- Chaos theory?
- Musical math
- “Sacred geometry” (an old interest of mine)
- Whatever math is used in meta-analyses
- Epidemiology
I’m posting most of these below. Please upvote and downvote to tell me how interesting or useful you think a given topic is. Please don’t vote on how difficult they are, that’s a different metric that I want to capture separately. Please do add your own suggestions and any comments on each of the topics.
Note: looking around, I found this. If you’re interested in this post, go there. I’ll be starting with it.
Edit: Looking at the page, I fear that putting a sort of “vote” in the comments might subtly dissuade people from commenting and responding in the usual way. Please don’t be dissuaded. I want your ideas and comments and, explicitly, your own suggestions. Also, I have a karma sink post under
Edit2: If you know of the specific major equations, problems, theorems, or algorithms that relate to a given subject, please list them. For instance, I just added Price’s Equation as a comment to the listed “math of natural selection and evolution” and the Median Voter Theorem has been listed under “the math of politics.”
+1 to this post.
Learn about first and second derivatives and finding a maximum of a function. Then think about how you might find a maximum if you can only make little hops at a time.
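A minimal sketch of that “little hops” idea, i.e., gradient ascent on a one-variable function (the function and step size here are just illustrative):

```python
# Find the maximum of f(x) = -(x - 3)**2 + 5 by taking small hops
# in the direction the first derivative points.
def f(x):
    return -(x - 3) ** 2 + 5

def f_prime(x):              # first derivative of f
    return -2 * (x - 3)

x = 0.0                      # starting guess
step = 0.1                   # size of each little hop
for _ in range(100):
    x = x + step * f_prime(x)    # hop uphill

print(x, f(x))               # x ends up at (essentially) 3, the true maximum
```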
Learn a little linear algebra (what a matrix inverse, determinant, etc. is). Understand the relationship between solving a system of linear equations and matrix inverse. Then think about what you might want to do if you have more equations than unknowns (can’t invert exactly but can find something that’s “as close to an inverse as possible” in some sense). A huge chunk of stuff that falls under the heading of “statistics/machine learning/neural networks/etc” is basically variations of that idea.
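A small sketch of that “as close to an inverse as possible” idea, using NumPy on a toy overdetermined system (the numbers are arbitrary):

```python
import numpy as np

# Three equations, two unknowns: usually no exact solution, so ask for
# the x that makes Ax as close to b as possible (least squares).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.1, 1.9, 3.2])

x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x)   # intercept and slope of the best-fit line through the three points

# The same answer via the "pseudo-inverse" / normal equations:
print(np.linalg.inv(A.T @ A) @ A.T @ b)
```

This is exactly the sense in which ordinary linear regression is “an inverse that isn’t quite an inverse.”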
Read Structure and Interpretation of Computer Programs: it has one of the highest concept-per-page densities of any computer science book.
Important algorithmic ideas are, in my opinion: hashing, dynamic programming/memoization, divide and conquer by recursion, splitting up tasks to be done in parallel, and locality (things you want at a particular point are often close in space and time).
Locality is sort of like “a smoothness assumption on access.” The reason your laptop is fast even though your hard disk is slow is due to locality being generally true.
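To make one of those ideas concrete, here is a minimal sketch of dynamic programming/memoization on the usual toy example (Fibonacci numbers):

```python
from functools import lru_cache

def fib_slow(n):
    # Naive recursion: the same subproblems are recomputed over and over,
    # so the number of calls grows exponentially with n.
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    # Same recursion, but each subproblem's answer is cached (memoized),
    # so every value is computed only once.
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

print(fib_fast(100))   # instant; fib_slow(100) would effectively never finish
```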
“I will always link to my ingroup”, says Scott. So it is with me: I always recommend learning about association vs causation. If you are into learning by doing, try to find some media articles that make claims of the form “scientists report that to [Y], do [X],” and look up the original study and think about if the media claim actually follows (it generally does not). This will also give you practice reading empirical papers, which is a good skill to have. Stuff the authors do in such papers isn’t magic, after all: the set of statistical ideas that come up over and over again in them is fairly small.
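If you want a learn-by-doing demonstration of why association is not causation, here is a hedged sketch: a simulated confounder drives both variables, so they are correlated even though neither causes the other (all numbers are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

z = rng.normal(size=n)               # confounder (some hidden common cause)
x = z + 0.5 * rng.normal(size=n)     # "exposure": driven by z, not by y
y = z + 0.5 * rng.normal(size=n)     # "outcome": driven by z, not by x

print(np.corrcoef(x, y)[0, 1])       # strongly correlated (about 0.8)

# Hold the confounder roughly fixed and the association largely disappears:
mask = np.abs(z) < 0.1
print(np.corrcoef(x[mask], y[mask])[0, 1])   # close to 0
```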
Don’t think like that. There are no wizards, just people doing sensible things.
+1 for Structure and Interpretation of Computer Programs (aka SICP, aka “the wizard book”) - this is a legendary programming book. Here is an interactive version: https://xuanji.appspot.com/isicp/.
I also agree on the important algorithmic ideas, with one addition: algorithmic analysis. Just as you can describe the movement of the planets with a few simple equations, and that’s beautiful, you can describe any sequence of steps to finish a task as an algorithm. And you can mathematically analyze the efficiency of that sequence: as the task gets larger, does the number of steps required to finish it grow linearly, quadratically, logarithmically (we hope)? This is a broadly applicable and powerful idea, since pretty much everything (even learning) involves a sequence of steps or process.
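As a tiny illustration of that kind of analysis, here is a sketch comparing how step counts grow for linear search versus binary search (the step-counting functions are purely for illustration):

```python
def linear_search_steps(sorted_list, target):
    # Worst case: the step count grows linearly with the size of the list.
    steps = 0
    for item in sorted_list:
        steps += 1
        if item == target:
            break
    return steps

def binary_search_steps(sorted_list, target):
    # Worst case: the step count grows logarithmically; doubling the list
    # adds roughly one more step.
    steps, lo, hi = 0, 0, len(sorted_list)
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return steps

for n in (1_000, 1_000_000):
    data = list(range(n))
    print(n, linear_search_steps(data, n - 1), binary_search_steps(data, n - 1))
```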
Will you make top level comments for both of these, so that people can vote on them? (I can do it, but I figure you should get any karma from the upvotes)
How flattering; I’ve now done so. Also, I very much like your approach to learning math by grounding it in concrete subjects. Many people say they learned calculus best by learning it alongside physics, since calculus appears much more concrete when you look at the velocity and arc of, say, a fired cannonball.
Finally, here’s an excellent article from Barbara Oakley, who learned math starting about age 26 after getting out of the Army. She’s now an engineering prof, and teaches a MOOC called “Learning How to Learn” (I have not taken it, but I have reviewed the topics, and it appears to hit all the correct points): http://nautil.us/issue/17/big-bangs/how-i-rewired-my-brain-to-become-fluent-in-math-rd
Thank you!
Basic statistics
Be careful here. Statistical intuition does not come naturally to humans—Kahneman and others have written extensively about this. Learning some mathematical facts (relatively simple to do) without learning the correct statistical intuitions (hard to do) may well have negative utility. Unjustified self confidence is an obvious outcome.
Can you elaborate? What is the difference between “mathematical facts” and “statistical intuitions”? Can you give an example of each?
If you take the average introductory statistics textbook, it tells you things that are true for normally distributed data.
If you are faced with a real-world problem that doesn’t follow the normal distribution and try to apply statistical techniques proven to work for normally distributed data, you will make mistakes.
Being good at statistical modelling means that you have an idea of what assumptions you can make about a certain data set and the kind of errors you will get when your assumptions don’t match reality.
Example of a mathematical fact: a formula for calculating correlation coefficient. Example of a statistical intuition: knowing when to conclude that close-to-zero correlation implies independence. (To see the problem, see this picture for some datasets in which variables are uncorrelated, but not independent.)
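A tiny sketch of that picture in code: below, y is completely determined by x, yet the Pearson correlation is essentially zero (the uniform distribution is just an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100_000)
y = x ** 2                       # y is completely determined by x...

print(np.corrcoef(x, y)[0, 1])   # ...but the (linear) correlation is ~0
```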
Not sure why you are calling this “intuition”. Understanding that Pearson correlation attempts to measure a linear relationship and many relationships are not linear is just statistical knowledge, only a bit higher level than knowing the formula.
Something to consider: what’s a good field in which to learn basic statistics (sticking with the “learn by doing, when possible” theme) ?
If you want to learn statistics by doing, try to do what gwern does.
Or do the complete opposite.
The impression I get of gwern is that he reads widely, thinks creatively, and experiments frequently, so he is constantly confronted with hypotheses that he has encountered or has generated. His use of statistics is generally confirmatory, in that he’s using data to filter out unjustified hypotheses so he can further research or explore or theorize about the remaining ones.
Another thing you can do with data is exploratory data analysis, using statistics to pull out interesting patterns for further consideration. The workflow for this might look more like:
- Acquire (often multivariate) data from another researcher, source, or experiment.
- Look at its marginal distributions to check your understanding of the system and catch really obvious outliers.
- Maybe use tools like mixture modeling or Box-Cox transformation to clarify marginal distributions.
- Use statistical tools like (linear, logistic, support vector, etc.) regression, PCA, etc., to find patterns in the data.
- Do stuff with the resulting patterns: think up mechanisms, do confirmatory analysis, check literature, show them to other people, etc.
A lot of what you get out of this process will be spurious, but seeing hypotheses that the data seemed to support go down in flames is a good way to convince yourself of the value of confirmatory analysis, and of tools for dealing with this multiple testing problem.
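For flavor, here is a bare-bones sketch of a couple of those workflow steps on made-up data (marginal summaries, then PCA via the singular value decomposition); it is only meant to show the shape of the process:

```python
import numpy as np

rng = np.random.default_rng(0)
# Fake "multivariate data": 500 samples of 3 measurements driven by one shared factor.
latent = rng.normal(size=(500, 1))
data = np.hstack([latent + rng.normal(scale=0.3, size=(500, 1)) for _ in range(3)])

# Look at marginal distributions (here just crude summary statistics).
print(data.mean(axis=0), data.std(axis=0))

# PCA via the SVD: how much of the variance does one direction explain?
centered = data - data.mean(axis=0)
_, singular_values, _ = np.linalg.svd(centered, full_matrices=False)
explained = singular_values ** 2 / np.sum(singular_values ** 2)
print(explained)   # the first component dominates, hinting at a single shared factor
```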
I remember Gelman saying useful stuff like this, but it’s been a while since I read that post so I might be mischaracterizing it.
(Ilya, you know all of this, surely at a deeper level than I do. I’m just rhetorically talking to you as a means to dialogue at Capla. Gwern, hopefully my model of you is not too terrible.)
I want to do that. Tell me how. I think I already read widely (at least compared to my meat-space peers and possibly compared to the typical LW reader), but I can do better. I am frequently complimented for asking creative questions, coming up with unusual ideas and solutions (again, in comparison to non-rationalists), but if there are ways to do this better, I want to hear them. However, I want to make regular experimentation a part of my life and don’t really know how. I’m interning with a psych lab, and hope to work with some behavioral economists who run field-experiments.
How do I gain proficiency with experimental methods and build the habit of running simple experiments regularly? I suppose that there’s a certain kind of phenomenon that to the educated mind is automatically flagged as ripe for experimentation (I’m thinking of Feynman’s curiosity about the ants in his room or Harry James Potter-Evans-Verres testing to find out what the optimal way to fight is, prior to the first battle), but I don’t have that intuition, yet.
Suggestions?
That’s usually called “data mining” and is a popular activity. Unfortunately many people think that’s all they need and stop before the confirmatory phase.
What does gwern do?
This.
Political Science! Since you’re interested in election dynamics, 538′s description of its model is a good place to get a punches-pulled look at how a statistical model is constructed.
I’ll side-step the “field” part of the question and instead point to an undergraduate lecture course on data analysis which has some online notes and a series of exercises.
It’s worth pointing to something specific one could immediately start working on, because I think people underrate the trivial inconvenience of not knowing which specific book or course to consult. The course linked is not that basic, admittedly, but even if it’s too advanced it should help highlight specific keywords & terms to look up on Google, Wikipedia, or textbooks.
Probability theory
Game theory
Definitely involves math (since you asked).
Decision theory
What do you mean by decision theory?
One of the options I would call ‘decision analysis,’ and doesn’t require much more than algebra and a basic understanding of probabilities. I wrote an introduction to it, and my current book recommendation on the subject is Decisive by the Heath brothers, and I remember it being much more about the psychology of decision-making (and practical heuristics you can apply) than math.
A second option is “what’s the difference between CDT and EDT?”, but that’s something you shouldn’t really approach until you understand causal graphs, which you shouldn’t really touch until you understand probabilities and graphical networks (like Bayes nets).
A third option is “what are those exotic things they talk about on LW like TDT and UDT?”, and I don’t feel qualified to tell you the right way to approach that.
I don’t know what any of those acronyms are.
I don’t know much about decision theory, actually, except that at least 5 people on lesswrong think its worth learning.
Alright; I’d recommend focusing your attention on decision analysis. The acronyms refer to ‘philosophical positions’ about the right way to make decisions; you are probably better off focusing on practical improvements you can make in your decision-making ability and what formal decision-making looks like (which gets called stuff like cost/benefit analysis more than it does decision theory).
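To give a sense of what that formal decision-making looks like at its most basic, here is a minimal expected-value sketch; every probability and payoff below is invented for illustration:

```python
# Compare two options by expected value. The probabilities and payoffs
# here are invented illustrations, not advice.
options = {
    "take the risky project": [(0.6, 20_000), (0.4, -5_000)],   # (probability, payoff)
    "stick with the safe one": [(1.0, 4_000)],
}

for name, outcomes in options.items():
    expected_value = sum(p * payoff for p, payoff in outcomes)
    print(f"{name}: expected value = {expected_value:,.0f}")
# A fuller analysis would also weigh risk tolerance, not just the expectation.
```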
Computer Science: recursion
Information theory
I also agree with Ilya on the important algorithmic ideas, with one addition: algorithmic analysis. Just as you can describe the movement of the planets with a few simple equations, and that’s beautiful, you can describe any sequence of steps to finish a task as an algorithm. And you can mathematically analyze the efficiency of that sequence: as the task gets larger, does the number of steps required to finish it grow linearly, quadratically, logarithmically (we hope)?
This is a broadly applicable and powerful idea, since pretty much everything (even learning) involves a sequence of steps or process.
I am currently enjoying Tim Roughgarden’s course on algorithms: https://www.coursera.org/course/algo. Luay Nakhleh’s course on Algorithmic Thinking is also excellent: https://www.coursera.org/course/algorithmicthink.
Epidemiology
As the token epidemiologist in the Less Wrong community, I should probably comment on this.
The utility of learning epidemiology will depend critically on what you mean by the word:
If you interpret “epidemiology” as the modern theory of causal inference and causal reasoning applied to health and medicine, then learning epidemiology is very useful, so much so that I believe that a course on causal reasoning should be required in high school. If you are interested in learning this material, my advisor is writing a book on Causal Inference in Epidemiology, part of which is freely available at http://www.hsph.harvard.edu/miguel-hernan/causal-inference-book/ . For more mathematically oriented readers, Pearl’s book is also great.
If you interpret “epidemiology” to mean the material you will learn when taking a course called “Epidemiology”, or to mean the methods used in most papers published in epidemiologic journals (i.e. endless Cox models, p-hacking, model selection algorithms, and incoherent reasoning about confounding), then what you will get is a broken epistemology with negative utility. Stay far away from this—people who don’t have the time to learn proper causal reasoning are better off with the heuristic “if it is not randomized, don’t trust it”. This happens to be the mindset of most clinicians, and appropriately so.
[Hey, I thought I was the token epidemiologist! ;) ]
I largely agree with Anders’ comment (leave Pearl be for now; it’s a difficult book), but there are some interesting non-causal mathy epidemiology topics that might suit your needs.
Concretely: study networks. Specifically, pick up the book Networks, Crowds, and Markets: Reasoning about a Highly Connected World (or download the free pdf, or take the free MOOC).
It presents a smooth slope of increasing mathematical sophistication (assuming only basic high school math at the outset), and is endlessly interesting as it gently builds and extends concepts. It eventually touches many of the topics you’ve indicated interest in (game theory, voting, epidemic dynamics, etc), giving you some powerful mathematical tools to reason with. Advanced sections are clearly marked as such, and can be passed over without losing coherence.
And hey, if the math in the advanced sections frustrates your understanding… that’s basically what you’ve said you want!
If I was once employed by a Dept. of Epidemiology does that also make me the token epidemiologist? :)
Epidemiology is defined to be things done by people in Departments of Epidemiology, correct?
That makes you an expert on epidemiology, duh :-)
I foresee (minor) problems. Nothing too serious, but it might be useful to be aware of the existence of problems with this approach. Most notably:
Many (sub)fields use only a single model out of a larger, overarching theory. Most of the times you want to skip the grand theory to get immediate results from a single model (so that’s a plus for your approach), but sometimes having someone show you the similarities between different theories explicitly can be very useful. By going for depth rather than breadth it might be hard to compare, contrast and most importantly merge pieces of knowledge from different fields. For clarity: I’m talking about really, really general mathematics here (for example learning linear algebra rather than the algorithm for the Fast Fourier Transform).
I personally found that in general it is quite hard to figure out which math you need to fully understand a certain result if you don’t already know the math. If you start with an overly ambitious goal (for example if you start with the Einstein equations of General Relativity and say to yourself: ‘Let’s backtrack to see which math I need’) I suspect that you will have trouble figuring out which math to learn.
All in all these points are only minor—most useful math is relatively simple (it’s called the simple math for a reason), and you already seem to plan to start with pretty general mathematics (e.g. learning statistics rather than just the linear least squares algorithm). But sometimes learning the math before you have a goal can be useful.
Noted, but I do have the advantage of being able to ask. The next post will ask, “for each topic, formula, or problem on this list, what math do I need to know to understand and solve it?”
Computer Science: iteration
Economics optimization problems
Microeconomics contains several useful tools for thinking about the world, but those tools are only somewhat driven by math. For example, let’s consider “supply and demand.”
In a basic econ course, you’d focus on the math (which is basically just algebra). You’d have an increasing linear function for ‘supply,’ and a decreasing linear function for ‘demand.’ (Both represent the quantity supplied or demanded at a particular price.) You would then find the intersection of those two lines, and this is the “market price.” You would then consider increasing or decreasing the supply or demand curve, and notice how this changes the intersection.
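A minimal version of that exercise in code, with arbitrary example lines for supply and demand:

```python
import numpy as np

# Arbitrary example lines. Supply: q = -2 + 3p (rises with price).
# Demand: q = 10 - p (falls with price). Rewrite both as a*p + b*q = c
# and solve the 2x2 linear system for the equilibrium (p, q).
A = np.array([[ 3.0, -1.0],    #  3p - q =  2   (supply)
              [-1.0, -1.0]])   # -p - q = -10   (demand)
c = np.array([2.0, -10.0])

p, q = np.linalg.solve(A, c)
print(p, q)                    # equilibrium price 3, quantity 7

# Shift demand outward (q = 13 - p) and the equilibrium price rises:
print(np.linalg.solve(A, np.array([2.0, -13.0])))   # price 3.75, quantity 9.25
```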
In a basic econ philosophy book, you’d focus on the feedback mechanism that generates the lines discussed above. “Suppose there are more people who want to buy the good than there are sellers of the good at a particular price,” the armchair philosopher would say, “and one of the buyers will craftily offer to pay more than the listed price in order to secure a spot in line.” This would give you a sense of why the supply line is increasing and the demand line is decreasing.
The idea of what an equilibrium means, and the math underpinning their study, seems useful. But for the math you’re potentially better off looking at control theory / controls engineering / feedback systems / signal processing / there might be more names I’m forgetting.
I don’t think macroeconomic tools are personally relevant to you, but you can learn about the models in any standard undergraduate text, and it probably won’t even require calculus.
Numerical optimization is only kind of in economics. This is the class of problems where you have an exact statement of what you want (‘this is my objective function, these are my constraints’) and it just takes a lot of pushing numbers around to figure out what the best option is. Once you know linear algebra, you can figure out the basic approaches here, but I don’t think this is particularly useful unless you’re employed in a context where you need to solve these sorts of logistical problems.
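For a sense of what such an exactly-stated problem looks like, here is a toy linear program solved with SciPy’s linprog; the objective and constraints are invented:

```python
from scipy.optimize import linprog

# Maximize profit 3x + 5y subject to resource limits:
#   2x + 1y <= 10   (machine hours)
#   1x + 3y <= 15   (labor hours)
#   x, y >= 0
# linprog minimizes, so the objective is negated.
result = linprog(c=[-3, -5],
                 A_ub=[[2, 1], [1, 3]],
                 b_ub=[10, 15],
                 bounds=[(0, None), (0, None)])

print(result.x)        # optimal amounts of each product: (3, 4)
print(-result.fun)     # maximized profit: 29
```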
I have what I think are very strong economic intuitions: it just makes sense to me and always has. I frequently think in terms of supply and demand. Do you think I would gain a great deal of value from the explicit mathematical formulations?
Are your “very strong economic intuitions” correct and how do you know whether they are or not?
They are informed by and consistent with the books I’ve read.
I should note that I don’t know much about macroeconomics. Keynesian economics (like Marxism) baffles me. I need a Keynesian to explain to me why there should be “leakages” in the banking system.
Are they consistent with reality?
I believe so, but I don’t know how to check.
Well, on which basis do you believe so?
What I am getting at is that you were informed by some books and, unsurprisingly, found your knowledge consistent with these books. But that’s no guarantee of correctness. There are a lot of books which teach and advocate views that range from not quite true to quite not true. The arbiter of correctness is reality.
You said that you have correct economic intuitions. So let me ask again—why do you believe they are correct, is that only because a book told you so?
Let me clarify:
When I was very young, my dad taught me the basics of economic concepts. They made sense to me. Later, when I took economics classes, the instruction seemed obvious. I found myself having a feeling for the principles, in contrast with others that I spoke to. For instance, my peers were astonished to learn about the huge mark-up on “fancy” water. A high school friend commented that the water is obviously not worth that much. I knew that “value” is subjective, rather than being an intrinsic property of an object, and that the price, rather than reflecting that supposed real value, was the result of the interaction of supply and demand. The water was “worth” exactly as much as it cost, by the simple fact that people would pay for it at that price.
I read other economists. They sometimes expressed ideas from new perspectives, but the fundamental ideas were the same. I took college level classes.
My confidence in my economic understanding parallels my confidence in evolution: the ideas make sense to me, and they are, to my knowledge, endorsed by the professionals who work in the field, yet I don’t know of a way of verifying those concepts for myself, without relying on the testimony of those experts. I discount my economic knowledge somewhat, since economics in general seems more contentious than biology, and especially the foundation of biology.
Does that clarify? What should I be asking?
Well, it’s pretty easy—try to make some real-life forecasts. There is a lot of economic activity happening all around the world and enough of it is visible and documented.
Take something you have an interest in, look at the current situation, explicitly apply the economic concepts you have, create a forecast (which you will be able to verify). Then see if your forecast worked out. Make a bunch of forecasts to get some diversification.
Hmm...
My first thought is “prediction is a b---- and macroeconomics is less predictable than anything else. Expecting to make predictions, and have them be right, is crazy.” Now, I know there’s a problem with that. (Although, subjectively, I’m not doubting my knowledge any more than I was. I was going to type “Which makes me hugely suspect of my supposed knowledge”, but then I realized that’s not true and I was just signaling how “rational” I am (which, to go another level into meta-space, is, I think, what I’m doing right now, by telling you that I was going to falsely signal, then realized it and restrained myself))
Making predictions. That’s a good idea. I don’t have anything to lose except some false beliefs or some unfounded confidence in some beliefs.
What are some simple questions of which I can predict answers?
Not necessarily macroeconomics, which is hard because if you can successfully predict things in that sphere, there is usually a simple way to monetize that ability.
Look at your economic intuitions. What kind of claims do they make? What do they think they can predict?
That if something is hard to get or make, it will cost more than if it’s easy to get or make.
That some consumption is conspicuous consumption and provides a signalling benefit which swamps the direct material benefit.
Higher-order goods have a longer delay before they pay off in the form of first-order goods, but the progression to higher and higher order goods, and the specialization that drives it, is economic growth.
People like variety in their consumption because of LMD in their utility functions.
If the price of a good rises, the prices of its complement goods will fall slightly and the prices of its substitute goods will rise slightly (since people will buy more of the substitutes and less of the complements, and supply and demand do the rest).
Economies of scale.
There’s no such thing as a free lunch.
(If you know that diminishing marginal returns are pervasive and allow for trade between actors, pretty much all of classical economics follows by implication.)
The math of natural selection and evolution
Depends whether you want to go genetics, or a more theoretical approach. But complete expert-level knowledge of the subject would probably include fitness landscapes, network theory, a basic understanding of what’s fashionably called ‘big data’ (for bioinformatics), linear algebra (including a familiarity with the dynamics of formally unsolvable systems and chaos theory), and a LOT of probability theory.
Price’s Equation
That is an important destination but maybe too subtle a starting point.
Start with ecological models for inter-species interactions (predation, competition, mutualism, etc.) where there are more examples and the patterns are simpler, starker, and more intuitive. Roughly, death processes may depend on all involved populations but birth processes depend on each species separately. Then move to natural selection and evolution, intra-species interactions, where the birth processes for each genotype may depend on populations of all the different genotypes, and death processes depend on the phenotypes of all the different populations.
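As one concrete entry point, here is a crude discrete-time simulation of the classic Lotka-Volterra predator-prey model, an example of those inter-species models; the rates and starting populations are arbitrary, and a real treatment would use a proper ODE solver:

```python
# Crude Euler simulation of Lotka-Volterra predator-prey dynamics:
# prey grow on their own and get eaten; predators starve on their own
# and grow by eating prey. All rates are arbitrary illustrative values.
alpha, beta, delta, gamma = 1.0, 0.5, 0.2, 0.6
prey, predators = 5.0, 2.0
dt = 0.01

history = []
for step in range(5000):
    d_prey = (alpha * prey - beta * prey * predators) * dt
    d_pred = (delta * prey * predators - gamma * predators) * dt
    prey += d_prey
    predators += d_pred
    history.append((prey, predators))

# Both populations oscillate, out of phase with each other:
for i in range(0, 5000, 1000):
    print(i, round(history[i][0], 2), round(history[i][1], 2))
```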
Do you have a curriculum that works through these?
It can either be an already existing textbook or class or just a list of concepts.
It’s been a while since I’ve thought about how to learn ecology, but maybe check out Ben Bolker’s Ecological Models and Data in R? It would also be a decent way to start to learn how to do statistics with R.
As Ilya recommended, a great choice for programming in general is the legendary Structure and Interpretation of Computer Programs (aka SICP, aka “the wizard book”). Here is an interactive version: https://xuanji.appspot.com/isicp/. (You can find solutions to the problems here, but of course use sparingly if at all: http://community.schemewiki.org/?sicp-solutions)
If you benefit from more instruction than a solo journey through SICP, I cannot recommend highly enough MIT’s Introduction to Computer Programming course, which remains one of the best educational experiences I have ever had: https://www.edx.org/course/mitx/mitx-6-00-1x-introduction-computer-5626
What other sorts of math do economists use?
The Schrödinger equation
I don’t think that trying to solve the Schrödinger equation itself is particularly useful. The SE is a partial differential equation, and there’s a whole logic of differential equations and boundary conditions, etc. that provides context for the SE. If you’re serious about trying to understand quantum mechanics, I think the concept of Hilbert space/abstract vector spaces/linear algebra in general is a bigger conceptual shift than just being able to solve the particle in a box in function space. It’s also just a really useful set of concepts that makes learning things like optimization, coordinate/Fourier transforms, etc. easier/more intuitive.
Until I had the wave function explained to me as some vector in a high-dimensional space that we could map into x-space or p-space or Lz-space, I don’t think I really had a good grasp on quantum mechanics. This is anecdote, not data; your mileage may vary.
I’m just now having a similar experience.
Try the wave equation first? If you want to think of particles like waves it might be useful to know what a wave is. Note that you already need to have heard of a respectable chunk of calculus to solve this equation.
The Schrödinger equation itself can be understood as simply a particular instance of integral calculus. If I recall my undergrad days correctly, you didn’t even need linear algebra. Once you know the calculus, quantum waveforms don’t require a whole lot of additional mathematical insight.
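If you want to see the particle in a box without doing the calculus symbolically, here is a rough numerical sketch: discretize the time-independent Schrödinger equation on a grid and diagonalize the resulting matrix (units with ħ = m = 1; purely illustrative):

```python
import numpy as np

# Time-independent Schrodinger equation, -(1/2) psi'' = E psi, on [0, L]
# with psi = 0 at the walls (infinite square well), discretized on a grid.
L, n = 1.0, 500
dx = L / (n + 1)

# Finite-difference second derivative, then the Hamiltonian matrix.
laplacian = (np.diag(np.full(n, -2.0))
             + np.diag(np.ones(n - 1), 1)
             + np.diag(np.ones(n - 1), -1)) / dx**2
H = -0.5 * laplacian

energies, wavefunctions = np.linalg.eigh(H)
print(energies[:3])
# Compare with the exact result E_k = (k * pi)**2 / (2 * L**2):
print([(k * np.pi) ** 2 / (2 * L ** 2) for k in (1, 2, 3)])
```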
This is an interesting approach, but I am wondering if reversing the process wouldn’t be more helpful. The problem is that you can’t just jump from propositional logic to chaos theory, since you must first learn algebra, calculus, and linear algebra. So an unordered list of mathematical subjects has some pitfalls.
If there existed some graph of dependencies in mathematical knowledge (basically a Diablo skill tree, or Khan Academy with lower resolution), we could note on each node any applications like quantum theory, music theory, and so on. This would help Capla go about their studies in a more systematic way, and also help show the rather elegant, unified structure underlying many fields of scientific inquiry.
Where did you get the Khan Academy image? Could I see the full version?
A dependencies graph would be of great philosophical and practical interest.
After compiling a list of topics, the next step is to figure out what math underpins each one, and order them according to increasing assumed mathematical knowledge.
Sadly, you can no longer see the full version on Khan Academy.
https://khanacademy.zendesk.com/hc/en-us/articles/203353750-Where-is-the-Knowledge-Map-Star-Map-math-overview-
The Exercise Dashboard is not as helpful for highlighting dependencies: https://www.khanacademy.org/exercisedashboard
You may be able to find other knowledge maps; Khan wasn’t the first to have the idea. I like Kaj’s idea as well. I compared the curricula of several majors at MIT to come up with a core curriculum, useful across engineering, computer science, and biology.
You could assemble a partial dependency graph by looking at the course pages of different math departments and noting which courses are listed as prerequisites for more advanced courses.
Yes, although that wouldn’t include the applied math that I’m looking for.
The main problem with the Khan version is that it got huge; they were subdividing things down to about the level of an individual half-hour lesson. For someone interested in wider strategic planning, something like this would be a bit more reasonable, as long as you added in the annotations. This book is also reviewed as a good way to conceptualize the macrostructure of mathematical reasoning, although I can’t vouch for it personally.
Consider giving people an optional Karma balance sink post, so they can downvote to even out your Karma after voting on an option if they feel like it. Or not, just a suggestion.
Thanks for the suggestion.
Can this be my karma sink post?
Should I add an edit to the document that says that I have a karma sink here (since it’ll get down-voted out of view)?
Yep, sounds good.
The math of music
The book Music and Mathematics is a bit of a jumble of essays from various people. It’s not a coherent whole or anything, BUT, if you are the least bit interested in this subject I would strongly recommend you read the first essay (i.e. chapter). A lot of “music theory” mumbles around circles of fifths and chord progressions and things with vague pretenses of being math, but that chapter includes the only fundamental mathematical idea in the entire book.
Conveniently that chapter, and some others, are here.
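For a taste of the kind of math involved (not necessarily what that chapter covers), here is the basic arithmetic of equal temperament versus simple harmonic ratios:

```python
# Equal temperament: each semitone multiplies frequency by 2**(1/12).
# Simple whole-number ratios (3/2 for a fifth, 5/4 for a major third) are
# what harmonics give you; equal temperament approximates them so that
# every key sounds equally (slightly) out of tune.
A4 = 440.0
semitone = 2 ** (1 / 12)

fifth_equal = A4 * semitone ** 7     # 7 semitones above A4
fifth_just = A4 * 3 / 2              # the "pure" perfect fifth
third_equal = A4 * semitone ** 4
third_just = A4 * 5 / 4

print(fifth_equal, fifth_just)   # ~659.26 vs 660.00 Hz: very close
print(third_equal, third_just)   # ~554.37 vs 550.00 Hz: noticeably apart
```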
The math that makes reddit work
I am not sure exactly what you mean, but if you mean the math of its article rankings, then it’s important to note that the number of very early upvotes, the subreddit a post is posted to, the name and preview thumbnail, and the time of day it is posted all have tremendous impact on its likelihood of hitting the front page.
http://cseweb.ucsd.edu/~jmcauley/pdfs/icwsm13.pdf
THAT is what I mean. What is that algorithm?
The algorithm combines recency and total upvotes. It also takes subreddit origin into account so that the totally massive subreddits don’t dominate completely.
All the code is open-source if you want to go really deep, but basically upvotes + time penalty.
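A simplified sketch of that “upvotes plus time penalty” shape (this is not reddit’s actual code, just an illustration of the idea; the decay constant is an assumption):

```python
import math
import time

def hot_score(upvotes, downvotes, posted_at, decay_seconds=45_000):
    # Net votes enter logarithmically: going from 10 to 100 net votes is
    # worth about the same as going from 100 to 1,000. Newer posts get a
    # bonus that older posts can only overcome with many more votes.
    net = upvotes - downvotes
    order = math.log10(max(abs(net), 1))
    sign = 1 if net > 0 else -1 if net < 0 else 0
    return sign * order + posted_at / decay_seconds

now = time.time()
print(hot_score(500, 20, posted_at=now - 86_400))  # heavily upvoted, but a day old
print(hot_score(60, 5, posted_at=now))             # modest votes, posted just now
```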
The math of neural networks
Have you read Margaret Boden’s The creative mind: myths and mechanisms? It might make studying relevant math more rewarding.
Nope. What can you tell me about it?
I haven’t read it all—got sidetracked by exams and then stuff—but the first half is pretty good. Substantial and understandable (yum-yum, like a rich stew) even to people from different fields (I’m a botanist, with maybe only a bit more than the common knowledge of brain chemistry and very vague ideas about computers.) It can be downloaded as a .pdf from http://cs.oswego.edu/~malloy/Courses/Files/COG468/mythsmechanisms.pdf Try the first chapter (In a Nutshell) and see if it goes for you.
I found this.
http://abstrusegoose.com/275
I want this kind of insight, or at least the low hanging fruit.
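For what it’s worth, the “gradient descent in a non-recurrent multilayer network with sigmoid units” from the original list really does fit in a page. Here is a bare-bones sketch on a toy problem (XOR); the layer sizes, learning rate, and iteration count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR, which a single-layer network cannot represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # hidden layer of 8 sigmoid units
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # output layer
lr = 0.5                                        # learning rate (arbitrary)

for _ in range(10_000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: the chain rule, layer by layer (squared-error loss).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent step.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2).ravel())   # typically close to [0, 1, 1, 0]
```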
What are the foundations of bayesianism?
Bayes’ theorem, of course.
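And the theorem itself is one line of arithmetic. A worked toy example, with an invented base rate and invented test accuracies:

```python
# Bayes' theorem on the classic screening-test example.
p_disease = 0.01              # prior: 1% of people have the condition
p_pos_given_disease = 0.95    # test sensitivity
p_pos_given_healthy = 0.05    # false-positive rate

p_positive = (p_pos_given_disease * p_disease
              + p_pos_given_healthy * (1 - p_disease))

p_disease_given_positive = p_pos_given_disease * p_disease / p_positive
print(p_disease_given_positive)   # ~0.16: a positive test is far from a sure thing
```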
Deriving probabilities from causal diagrams?
The problem with focusing on the math of elections is that it often makes the person ignore the politics of elections. In this video, for example, there are claims that first-past-the-post voting always results in a two-party system. In the UK you have a third party with the Liberal Party. Canada also has multiple parties in its parliament despite first-past-the-post voting.
“If a field involves math, and you cannot do the math, you are not qualified to comment on that field.”
The “politics of elections” are explicable mathematically using formal methods like game theory. That said, it’s social science, which is naturally inexact, which makes it difficult. If you want to criticize it, how about learning the math yourself?
I trust people who are actually successful at politics over people sitting in academia when it comes to explaining to me how politics works.
As far as the ability to do the math of the field goes, when I was a kid I did the D’Hondt calculation to get the number of seats particular election results would produce while at an election party, and most of the people with public offices at the party had never run D’Hondt.
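For the curious, the D’Hondt calculation itself is only a few lines; the party names and vote counts below are made up:

```python
def dhondt(votes, total_seats):
    # Repeatedly award the next seat to the party with the highest
    # quotient: votes / (seats already won + 1).
    seats = {party: 0 for party in votes}
    for _ in range(total_seats):
        winner = max(votes, key=lambda p: votes[p] / (seats[p] + 1))
        seats[winner] += 1
    return seats

print(dhondt({"Red": 340_000, "Blue": 280_000, "Green": 160_000, "Yellow": 60_000}, 8))
```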
OK, props to you for working through a mathematical model of an election. But I find your criticism based on the abilities of elected officials to be wanting: do you expect your plumber to be well-versed in the mathematics of hydrodynamics?
The point is that elected officials are the experts at politics. If they have no use for knowing the math, then the math is not central.
So you’re criticizing Capla’s interest in models of electoral dynamics because politicians don’t do it. That misses the point: electoral dynamics are mathematically explicable. They don’t cause electoral outcomes, so an in-depth understanding of them is of limited value to candidates themselves. But trusting the explanations of the politicians as to why they won an election is a genuinely terrible approach to understanding anything but the politician’s own beliefs.
The campaigns of candidates employ political consultants many of whom are statisticians of the Nate Silver kind. All high-level campaigns do a lot of data mining and analysis.
I don’t deny that there’s polling and trying to predict the effect of political messaging based on statistical models.
On the other hand running a campaign and doing public policy are two different things.
I would also note that I don’t live in the US but in Germany, and we don’t have exactly the same political system.
Isn’t that like saying psychology is useless since humans have “free will”? It may not be perfectly predictive, but it’s still interesting and useful to know what the underlying math and incentives tend toward.
In any case, if there are major exceptions that deviate from the mathematical political optima, I want to know why that is.
The problem isn’t uselessness; it’s that people think they understand more than they do and make a lot of silly mistakes because they are overconfident that their models matter.
In particular people it makes people underrate the value of the public debate and complex coalition building and focus to much on elections as if they are the only way that public policy get’s decided.
Whether or not humans have free will is also arguable.
Is your disagreement with Capla’s interest in electoral dynamics, or with Political Science writ large?
I think plenty of people in political science departments misunderstand politics because they are in their ivory tower. On the other hand that doesn’t mean that everybody in political science doesn’t know what they are talking about.
“In particular people it makes people underrate the value of the public debate and complex coalition building and focus to[sic] much on elections as if they are the only way that public policy get’s[sic] decided.”
There’s a lot more to political science than non-causal models predicting elections. Coalition-building, to borrow your example, is a particularly rich topic of study.
Here my core concern isn’t so much political science but people from a STEM mindset trying to understand politics and then focusing their energies on easily modeled processes, thereby misunderstanding the complexity of politics.
If you want to know more about my position see the discussion on http://lesswrong.com/lw/krp/three_methods_of_attaining_change/ .
Ah, the sweet smell of common ground! I definitely agree with this.
It’s in quotes for a reason.
“Sacred Geometry”
I’m surprised that this is the first thing to get a point. [My surprise is flagged for consideration with regards to what it means about my model] Who else is it that even knows what sacred geometry is?
It looks like the fault with my model was inferring too much from a single data point.
Chaos theory
The math of politics
Median Voter Theorem is a good one.
If, in a two-person race, you will vote for whoever is closest to your position, even if they’re only a tiny deviation from the median towards you and their opponent is a tiny deviation away from you, then candidates are incentivized to cater as closely as possible to the median voter.
Many factors complicate this in real life elections, but it’s still a good concrete start.
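A quick simulation of that incentive (the voter distribution and candidate positions are arbitrary): a candidate standing at the median wins a majority against any rival position, which is what pushes both candidates toward the median.

```python
import numpy as np

rng = np.random.default_rng(0)
voters = rng.normal(size=10_001)     # voter positions on a left-right axis
median = np.median(voters)

def share(candidate, opponent):
    # Each voter votes for whichever candidate is closest to them.
    return np.mean(np.abs(voters - candidate) < np.abs(voters - opponent))

# The median-positioned candidate gets more than half the votes every time:
for rival in (-2.0, -0.5, 0.3, 1.5):
    print(rival, share(median, rival))
```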
Applied Game Theory!
This is mostly just applied Game Theory for theoretical political science and Basic Statistics for empirical political science, so I upvoted those instead.