Epistemologist specialized in the difficulties of alignment and how to solve AI X-Risks. Currently at Conjecture.
Blogging at For Methods.
If the point you’re trying to make is: “the way we go from preparadigmatic to paradigmatic is by solving some hard problems, not by communicating initial frames and ideas”, then I think this is indeed an important point.
Still, two caveats:
First, Kuhn’s concept of paradigm is quite an oversimplification of what actually happens in the history of science (and the history of most fields). More recent works that go through the history in much more detail show that at any point in a field there are often many different pieces of paradigms, or a strong paradigm for a key “solved” part of the field and then a lot of debated alternatives for the more concrete specific details.
Generally, I think the discourse on the history and philosophy of science on LW would improve a lot if it didn’t mostly rely on one (influential) book published in the 60s, before much of the serious effort to really understand the history of science and scientific practice.
Second, to steelman John’s point, I don’t think he means that you should only communicate your frame. He’s the first to actively try to apply his frames to some concrete problems, and to argue for their impressiveness. Instead, I read him as pointing to a bunch of different needs in a preparadigmatic field (which maybe he could separate better ¯\_(ツ)_/¯)
That in a preparadigmatic field, there is no accepted way of tackling the problems/phenomena. So if you want anyone else to understand you, you need to bridge a bigger inferential distance than in a paradigmatic field (or even a partially paradigmatic field), because you don’t even see the problem in the same way, at a fundamental level.
That if your goal is to create a paradigm, almost by definition you need to explain and communicate your paradigm. There is an element of propaganda in defending any proposed paradigm, especially when the initial frame is alien to most people, and even the impressiveness requires some level of interpretation.
That one way (not the only way) by which a paradigm emerges is by taking different insights from different clunky frames, and unifying them (for a classic example, Newton relied on many previous basic frames, from Kepler’s laws to Galileo’s interpretation of force as causing acceleration). But this requires that the clunky frames are at least communicated clearly.
Curated. I’ve heard this book suggested a few times over the years, and it feels like a sort of unofficial canon among people studying how preparadigmatic science happens. This review finally compelled me to get the book.
There’s something quite funny in that I discovered this book in January 2022, during the couple of days I spent at Lightcone offices. It was in someone’s office, and I was curious about it. Now, we’re back full circle. ^^
I do think this review would be a lot better if it actually distilled the messy-bits-that-you-need-to-experientially-stew-over into something that was (probably) much longer than this post, but much shorter than the book. But that does seem legitimately hard.
Agreed.
But as I said in the post, I think it’s much more important to get the feel from this book than just the big ideas. I believe that there’s a way to write a really good blog post that shares that feel and compresses it, but that was not what I had the intention or energy (or mastery) to write.
It sounds cool, though also intuitively temperature seems like one of the easiest attributes to measure, because literally everything is kind of a thermometer in the sense that everything equilibrates in temperature.
Can’t guarantee that you would benefit from it, but this sentence makes me think you have a much cleaner and more simplified idea of how one develops even a simple measuring device than what the history shows (especially when you don’t have any good theory of temperature or thermodynamics).
So I would say you might benefit from reading it. ;)
If you enjoyed Inventing Temperature, Is Water H2O? is pretty much the same genre from the same author.
Yeah, I am a big fan of Is Water H2O? (and the other Chang books). It’s just that I find Is Water H2O? both less accessible (a bit more focused on theory) and more controversial (notably in its treatment of phlogiston, which I agree with, but most people, including here, have only heard of phlogiston from fake histories written by scientists embellishing the histories of their fields (and Lavoisierian propaganda of course)). So that’s why I find Inventing Temperature easier to recommend as a first book.
Another favorite of mine is The Emergence of Probability by Ian Hacking. It gives you a feeling of how unimaginably difficult it was for the early pioneers of probability theory to make any advance whatsoever, as well as how powerful even small advances actually are, for example by enabling annuities.
It’s in my anti-library, but I haven’t read it yet.
It is my pet peeve that people don’t (maybe can’t) appreciate what a great intellectual achievement first-order logic really is, being the end result of so much frustrating effort. Because learning to use first-order logic is kind of trivial, compared to inventing it.
I haven’t read it in a while, but I remember The Great Formal Machinery Works being quite good on this topic.
It’s rare that books describe such processes well, I suspect partly because it’s so wildly harder to generate scientific ideas than to understand them, that they tend to strike people as almost blindingly obvious in retrospect.
Completely agreed!
I think this is also what makes great history of science so hard: you need to unlearn most of the modern insights and intuitions that didn’t exist at the time, and see as close as possible to what the historical actors saw.
This makes me think of a great quote from World of Flows, a history of hydrodynamics:
There is, however, a puzzling contrast between the conciseness and ease of the modern treatment of [wave equations], and the long, difficult struggles of nineteenth-century physicists with them. For example, a modern reader of Poisson’s old memoir on waves finds a bewildering accumulation of complex calculations where he would expect some rather elementary analysis. The reason for this difference is not any weakness of early nineteenth-century mathematicians, but our overestimation of the physico-mathematical tools that were available in their times. It would seem, for instance, that all that Poisson needed to solve his particular wave problem was Fourier analysis, which Joseph Fourier had introduced a few years earlier. In reality, Poisson only knew a raw, algebraic version of Fourier analysis, whereas modern physicists have unconsciously assimilated a physically ‘dressed’ Fourier analysis, replete with metaphors and intuitions borrowed from the concrete wave phenomena of optics, acoustics, and hydrodynamics.
(Also, thanks for the recommendations, will look at them! The response to this post makes me want to write a post about my favorite books on epistemology and science beyond Inventing Temperature ^^)
Thanks for the links!
But yeah, I’m more interested in detailed descriptions of how things actually work, rather than models of ideal governance.
Thanks!
After checking them, it feels like most of your links are focused on an economic lens to politics and governance, or at least an economic bent. Does that seem correct?
And of course just reading the rule books for the various governments or parts of the government; for the US that would be looking at the Constitution and the rules governing internal processes for both the House and Senate. Parliamentary systems will have similar rules of governance.
Looking at the organizational charts likely also helps: what are the committee structures, and how does legislation flow through them?
Yeah, ideally I would prefer to read an overview and model of these, but I agree that if it doesn’t exist, then reading the docs and charts is probably the simplest alternative.
That said, I’m not sure I would view political governance as truly having any gears. I think all the rules tend to become more like the Pirate’s Code in Pirates of the Caribbean: more like guidelines than hard and fast rules.
I would guess that there are probably gears-level models of how governments actually work. Whether these are exactly the models provided in the rules and guidelines, I’m not sure, but I assume not.
The true deep philosophical answer was… I wanted to separate cakes from bread (in French we have patisserie and boulangerie), but couldn’t find any obvious equivalent in English (it seems like English-speaking countries indeed use baking for both). So I adapted the French verb “patisser”, hoping that I would get away with a neologism given that English is so fit for constructing them.
My bad. Thanks for the correction, edited the post.
Unfortunately all the positives of these books come paired with a critical flaw: Caro only manages to cover two people, and hasn’t even finished the second one!
In my view, Caro is actually less guilty of this than most biographers.
Fundamentally, this is because he cares much more about power, its sources, and its effects on its wielders, beneficiaries, and victims. So even though the throughline is the lives of Moses and Johnson, he spends a considerable amount of time on other topics which provide additional mechanistic models with which to understand power.
Among others, I can think of:
The deep model of the geology and psychology of the trap of the hill country that I mention in the post
What is considered the best description of what it was like for women especially to do all their chores by hand in the hill country before Johnson brought them electricity
Detailed models of various forms of political campaigning and the impact of the press
Detailed models of various forms of election stealing
What is considered the best history of the Senate: what it was built for, with which mechanisms, how these became perverted, and how Johnson changed it and made it work
Detailed model of the process that led to the civil rights movement and the passage of the civil rights bills
Detailed model of the hidden power and control of the utilities
In general, many of the schemes Moses used to force the legislature and the mayor to give him more funding and power
He even has one chapter in the last book that is considered on par with many of the best Kennedy biographies.
Still, you do have a point that even if we extend the range beyond the two men, Caro’s books are quite bound to a highly specific period: mid-20th-century America.
Have you found other biographers who’ve reached a similar level? Maybe the closest I’ve found was “The Last Lion” by William Manchester, but it doesn’t really compare given how much the author fawns over Churchill.
I think it’s kind of a general consensus that finding something of a similar level is really hard. But in terms of mechanistic models, I did find Waging A Good War quite good. It explores the civil rights movement successes and failures through the lens of military theory and strategy. (It does focus on the same period and locations as the Caro books though...)
I do find thinking on paper (a bit more intentional than freewriting, but the same vibe) to be particularly helpful, I agree. Just like walks.
The reasons I don’t find them enough are that:
They generally happen after the fact, which means that some build-up has already happened
Personally, I’m rarely able to release all the build-up just through thinking on paper (it happens, but rarely)
Still, I find it’s a good way to build up emotional potential energy much more slowly, and to notice when you really need to have a full break/sabbatical.
Oh, I like the neural annealing connection, I have read the post but didn’t relate it to emotional potential energy, but it makes sense!
Hope you take some time to anneal away some of that potential energy soon. People consistently underestimate the negative ripples on the social web from being overstretched, as opposed to the obvious and tangible “but this thing right in front of me needs doing”.
Thanks. That’s the plan. ;)
No worries. ;)
However, when it comes to more inchoate domains like research skill, such writing does very little to help the inexperienced researcher. It is more likely that they’d simply miss the point you are trying to tell them, because they haven’t yet failed both by, say, being too trusting (a common phenomenon) and by being too wary of ‘trusting’ (a somewhat rare phenomenon for someone who gets to the big leagues as a researcher). What would actually help is either concrete case studies, or a tight feedback loop that involves a researcher trying to do something, perhaps failing, and getting specific feedback from an experienced researcher mentoring them. The latter has the advantage that one doesn’t need to explicitly elicit and make clear distinctions between the skills involved, and can still learn them. The former is useful because it is scalable (you write it once, and many people can read it), and the concreteness is extremely relevant to allowing people to evaluate the abstract claims you make, and to pattern-match them to their own past, current, or potential future experiences.
I wholeheartedly agree.
The reason why I didn’t go for this more grounded and practical and teachable approach is that at the moment, I’m optimizing for consistently writing and publishing posts.
Historically the way I fail at that is by trying too hard to write really good posts and make all the arguments super clean and concrete and detailed—this leads to me dropping the piece after like a week of attempts.
So instead, I’m going for “write what comes naturally, edit a bit to check typos and general coherence, and publish”, which leads to much more abstract pieces (because that’s how I naturally think).
But reexploring this topic in an in-depth and detailed piece in the future, along the lines of what you describe, feels like an interesting challenge. Will keep it in mind. Thanks for the thoughtful comment!
Just sharing some vibe I’ve got from your… framing!
Minimalism ~ path ~ inside-focused ~ the signal/reward
Maximalism ~ destination ~ outside-focused ~ the world
These two opposing aesthetics are a well-known confusing bit within agent-foundations-style research. The classical way to model an agent is to think of it as maximizing outside-world variables. Conversely, we can think of minimization ~ inside-focused (reward-hacking-type errors) as a drug addict accomplishing “nothing”.
Feels like there is also something to say about dopamine vs serotonin/homeostasis, even deontology vs consequentialism, and I guess these two clumsy clusters mirror each other in some way (feels isomorphic via a reversed sign function). Will think more about it.
I see what you’re pointing out, but in my head, the minimalism and maximalism that I’ve discussed both allow you quick feedback loops, which is generally the way to go for complex stuff. The tradeoff lies more in some fuzzy notion of usability:
With the minimalist approach, you can more easily iterate in your head, but you need to do more work to lift the basic concepts to the potentially more tricky abstractions you’re trying to express
With the maximalist approach, you get affordances that are eminently practical, so that many of your needs are solved almost instantly; but you need to spend much more expertise and mental effort to simulate what’s going to happen in your head during edge-cases.
As an aside: I’m French too, and was surprised that I’m supposed to yuck the maximalist aesthetic, but indeed it’s consistent with my reaction reading you on TypeScript, and also with my K-type brain… Anecdotally, not with my love for spicy/rich foods ^^'
I’m obviously memeing a bit, but the real pattern I’m pointing at is more “French engineering school education”, which you also have, rather than mere Frenchness.
Interestingly, the Lean theorem prover is sometimes considered a bit of a mess type-theoretically (an illustrative thread), but it is perhaps the most popular theorem prover among mathematicians. I would say it’s more on the “maximalist” side.
Didn’t know this about Lean, but the fact that a maximalist option is the most popular with mathematicians makes sense to me. As someone who has worked both with mathematicians and with formal methods researchers (who are much more like meta-mathematicians), the latter are much closer to programmers, in the sense that they want to build things and solve their own abstract problems, rather than necessarily wanting the most compositional machinery possible (although I still expect compositionality to be baked into the intuitions of many mathematicians).
Last I read about Rust’s type system, it basically didn’t have a theoretical basis, and seemed like it was just based around Graydon figuring out algorithms for getting the properties he wanted. Rust is much more popular than SML (or Haskell, though I’m not sure Haskell should really count as a ‘minimalist’ type system with all of its language extensions).
Rust is an interesting point in the design space. If I had to describe it quickly according to the framing above, it feels like a really pleasant fractal tradeoff between different type systems:
It basically has affine types, but with more practical usage through borrowing (see this survey for more details)
It has an ML type system with algebraic datatypes (and even traits which are close to typeclasses in Haskell)
So it definitely feels more maximalist than some ML or some pure linear type system, but that’s more from the combination and UX work than from a crazy “let’s add this super advanced feature” rush à la TypeScript imo.
It is definitely one of the minimalist vs maximalist dimensions ^^.
Oh, I didn’t see that it actually mentioned your package. 😂
Units / dimensional analysis in physics is really a kind of type system. I was very big into using that for error checking when I used to do physics and engineering calculations professionally.
Definitely!
Dimensional analysis was the first place this analogy jumped out at me when reading Fly By Night Physics, because it truly used dimensions not only to check results, but also to infer the general shape of the answer (which is also something you can do in type systems: for example, a function with a fully generic type a -> a can only be inhabited by the identity function, because it cannot do anything else than return its input).
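To make that concrete, here’s a minimal sketch in TypeScript (my own example, not one from the book), ignoring escape hatches like `any`, exceptions, or non-termination:

```typescript
// A function that must work uniformly for every possible type T.
// Since we know nothing about T, the only total, side-effect-free way
// to produce a value of type T is to hand back the one we were given.
function mystery<T>(x: T): T {
  return x; // the identity function is the only inhabitant of this type
}
```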
Although in physics you need more tricks and feel to do it correctly. Like the derivation of the law of the pendulum just from dimensional analysis requires you to have the understanding of forces as accelerations to know that you can use g here.
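For concreteness, here is a sketch of that standard argument (assuming the period $T$ depends only on the length $l$, the mass $m$, and $g$): requiring $[l^a m^b g^c] = \mathrm{m}^{a+c}\,\mathrm{kg}^{b}\,\mathrm{s}^{-2c} = \mathrm{s}$ forces $b = 0$, $c = -\tfrac{1}{2}$, $a = \tfrac{1}{2}$, so $T \propto \sqrt{l/g}$ and the mass drops out. The part dimensional analysis alone can’t give you is precisely the physical judgment that an acceleration, $g$, belongs on the list of relevant quantities in the first place.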
Dimensions are also a perennial candidate for things that should be added to type systems, with people working quite hard at implementing it (just found this survey from 2018).
I invented my own weird way to do it that would allow them to be used in places where actual proper types & type-checking systems weren’t supported—like most numerical calculation packages, or C, or Microsoft Excel, etc.
I looked at the repo, and was quite confused about how you did it, until I read:
A complete set of independent base units (meters, kilograms, seconds, coulombs, kelvins) are defined as randomly-chosen positive floating-point numbers. All other units and constants are defined in terms of those. In a dimensionally-correct calculation, the units all cancel out, so the final answer is deterministic, not random. In a dimensionally-incorrect calculations, there will be random factors causing a randomly-varying final answer.
That’s a really smart trick! I’m sure there are some super advanced cases where the units might wrongly cancel out, but in practice they shouldn’t, and this lets you interface with all the random software that exists! (Modulo running it twice, as you said, because the two runs correspond to two different draws of the constants.)
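For anyone else trying to picture it, here’s a minimal sketch of the idea as I understand it (illustrative names, not the actual package’s API):

```typescript
// Base units as random positive floats: any dimensionally incorrect
// expression keeps leftover random factors and changes between runs.
const meter = 0.5 + Math.random();
const kilogram = 0.5 + Math.random();
const second = 0.5 + Math.random();

// Derived units defined in terms of the base units.
const newton = (kilogram * meter) / second ** 2;
const joule = newton * meter;

// Dimensionally correct: kinetic energy of 2 kg moving at 3 m/s.
const ke = 0.5 * (2 * kilogram) * (3 * (meter / second)) ** 2;
console.log(ke / joule); // always 9: the random factors cancel out

// Dimensionally incorrect: forgot to square the velocity.
const oops = 0.5 * (2 * kilogram) * (3 * (meter / second));
console.log(oops / joule); // varies from run to run
```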
Yeah, a case where this came up for me is angles (radians, degrees, steradians, etc.). If you treat radians as a “unit” subject to dimensional analysis, you wind up needing to manually insert and remove the radian unit in a bunch of places, which is somewhat confusing and annoying.
Another interesting thing with radians is that when you write a literal expression, it will look quite different than in degrees (involving many more instances of π), so inspection can fix many errors without paying the full price of different types.
Apparently people want some clarification on what I mean by anti-library. It’s a Nassim Taleb term which refers to books you own but haven’t read, whose main value is to remind you of (and keep in mind) what you don’t know, and where to find it if you want to expand that knowledge.