I don’t think he has literally one trick but you’re right that a lot of his recent public work has been exploring his ideas instead of falsifying them.
I’d describe his ‘main trick’ as trying to find a simple computable system that mostly mimics the ‘dynamics’ of some other system.
And – or so I think – exploring, at considerable length, the idea that ‘everything is space’ and ‘(maybe) space is a hypergraph evolving according to a simple rule’ is an extremely interesting endeavor. It doesn’t seem particularly crazy compared to other niche ‘theories of everything’ for one.
And, yes, he talks and writes about ‘universal computation’, his own phrase, instead of ‘Turing-complete’ – that’s a somewhat lamentable phenomenon, but pretty understandable. We all – as individuals and groups – do that too tho, so I don’t really ‘ding’ him for those ‘excesses’. This is an extremely common complaint about him and his work, but it’s mostly irrelevant to determining whether his ideas are interesting, let alone true.
(Arguably we – the LessWrong users – have done the same thing repeatedly!)
I think the bigger thing that he has – not demonstrated exactly – but accumulated tantalizing evidence for, is that Turing-completeness (‘universal computation’) is both easy and, surprisingly, common. I still think that’s an under-appreciated point.
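To make the ‘Turing-completeness is easy’ claim concrete: Rule 110, one of the elementary cellular automata Wolfram catalogued, was proven Turing-complete (by Matthew Cook), and its entire update rule fits in a single byte. A minimal sketch of a simulator:

```python
# Rule 110: a one-dimensional cellular automaton proven Turing-complete
# (Cook, 2004). Each cell's next state depends only on itself and its two
# neighbors, looked up in the 8-entry table encoded by the number 110.
def rule110_step(cells):
    n = len(cells)
    # 110 in binary is 01101110: bit k of 110 gives the next state for
    # neighborhood pattern k (left*4 + center*2 + right), with wraparound.
    return [
        (110 >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and print a few generations.
row = [0] * 15
row[7] = 1
for _ in range(5):
    print("".join("#" if c else "." for c in row))
    row = rule110_step(row)
```

The point isn’t this particular rule; it’s that universality keeps turning up in systems with essentially no moving parts, which is the pattern Wolfram keeps accumulating evidence for.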
His recent ‘hypergraph’ work seems promising to me – it seems like a (very mildly or weakly) plausible (tho rough) idea of how one might formulate everything else in terms of ‘space quanta’, and his ideas about what ‘time’ and ‘causality’ could mean, based on an example formulation, seem very interesting. I certainly don’t begrudge him, or anyone else, spending their time doing this. And I definitely don’t think him, or anyone else, owes me a falsifiable theory! (I might feel a little differently if I was involuntarily supporting his efforts, e.g. via taxes, like I am with string theory.)
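For anyone unsure what ‘a hypergraph evolving according to a simple rule’ even looks like mechanically, here’s a toy sketch (my own illustrative rule, not one of Wolfram’s published ones): represent the graph as a list of edges, and on each step replace every edge (x, y) with (x, y) plus a new edge (y, z) to a freshly created node z, so the structure grows:

```python
# Toy hypergraph-rewriting sketch, in the spirit of Wolfram's project.
# The rewrite rule here is illustrative only: each edge (x, y) is replaced
# by (x, y) and (y, z), where z is a brand-new node id. Every step doubles
# the number of edges, growing the 'space' out of pure rewriting.
def rewrite_step(edges, next_node):
    new_edges = []
    for (x, y) in edges:
        z = next_node          # allocate a fresh node id
        next_node += 1
        new_edges.append((x, y))
        new_edges.append((y, z))
    return new_edges, next_node

edges, fresh = [(0, 1)], 2
for _ in range(3):
    edges, fresh = rewrite_step(edges, fresh)
print(len(edges))  # 8 edges after 3 doubling steps
```

Wolfram’s actual rules rewrite tuples of relations rather than single edges, but the flavor is the same: all the structure, including what looks like geometry, is supposed to emerge from iterating something this simple.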
The practical obstacles to actually testing how well his ideas or theories work seem insurmountable, but that’s still true of string theory as well – and maybe you feel similarly about it!
I think the bigger thing that he has – not demonstrated exactly – but accumulated tantalizing evidence for, is that Turing-completeness (‘universal computation’) is both easy and, surprisingly, common. I still think that’s an under-appreciated point.
Yes.
I’d describe his ‘main trick’ as trying to find a simple computable system that mostly mimics the ‘dynamics’ of some other system.
The analogies seem very superficial to me. I mean, I would be impressed if he could derive the equation for Lorentz contraction, but I am unimpressed if he merely shows that “something can get shorter”. Etc. Could you give me the best example of mimicking some specific law of physics?
He in fact did derive (approximately) both special and general relativity for the ‘hypergraph physics’ project – I think. I’ll look for a link but it should be on the same site as the link for this post.
Have you read his previous book “A New Kind of Science”? It’s available for free online here. I think the “analogies” he presents are surprisingly good given how simple they are, e.g. the fluid dynamics stuff seems ‘right’, even if it’s not (nearly) as accurate as standard numerical approximations/simulations based on the standard differential equations.
And here is a review, written by an expert on physics and computation. It does not address all claims in the book, specifically not the ones you mentioned. But I think it explains why people who know a lot about these things are not necessarily impressed by “the new kind of science”.
In the second paragraph of the introduction in the review by Aaronson:
As a popularization, A New Kind of Science is an impressive accomplishment.
With regard to Aaronson’s criticisms with respect to the content in NKS about quantum mechanics, I’m pretty sure Wolfram has addressed some of them in his newer work, e.g. (previously) ignoring ‘multiway systems’.
One thing that jumps out at me, in Aaronson’s ‘not compatible with both special relativity and Bell inequality violations’ argument against Wolfram’s (earlier version of his) ‘hypergraph physics’:
A technicality is that we need to be able to identify which vertices correspond to x_a, y_a, and so on, even as G evolves over time.
Funnily enough, it’s Aaronson’s ‘computational complexity for philosophers’ paper that now makes me think such an ‘identification’ routine is possibly (vastly) far from “a technicality”, especially given that the nodes in the graph G are expected to represent something like a Planck length (or smaller) and x_a and y_a are “input bits”, i.e. some two-level quantum mechanical system (?). Identifying ‘the same’ x_a and y_a as G evolves doesn’t seem obvious or trivial from a computational complexity perspective.
Tho, immediately following what I quoted above, Aaronson writes:
We could do this by stipulating that (say) “the x_a vertices are the ones that are roots of complete binary trees of depth 3”, and then choosing the rule set to guarantee that, throughout the protocol, exactly two vertices have this property.
That doesn’t make sense to me as even a reasonable example of how to identify ‘the same’ qubits as G evolves. Aaronson seems to be equating a vertex in G with a qubit, but Wolfram’s idea is that a qubit is something much, much bigger inside G.
I can’t follow the rest of that particular argument with any comprehensive understanding.
I wonder how much ‘criticism’ of Wolfram is a result of ‘independent discovery’. Aaronson points out that a lot of Wolfram’s ‘hypergraph physics’ is covered in work on loop quantum gravity. While Wolfram was a ‘professional physicist’ at one point, he hasn’t been a full-time academic in decades so it’s understandable that he isn’t familiar with all of the possibly relevant literature.
It’s also (still) possible that Wolfram’s ideas will revolutionize other sciences as he claims. I’m skeptical of this too tho!
Thanks! I just read another Aaronson paper recently – his ‘computational complexity for philosophers’ – and thought it was fantastic. (I’ve been following his blog for a while now.)
I definitely appreciate, not even having (yet) read the paper to which you linked, that Wolfram might not be entirely up-to-date with the frontier of computational complexity. (I’m pretty sure he knows some, if not a lot, of the major less-recent results.)
Wolfram’s also something of a ‘quantum computing’ skeptic, which I think explains why he doesn’t discuss it much in NKS or elsewhere. (He does explain that skepticism somewhat in NKS, IIRC.)
I can also understand and sympathize with experts not being impressed with the book, or his work generally. But Robin Hanson has expressed similar complaints about the reception of his own work, and interdisciplinary work more widely, and I think those complaints are valid and (sadly) true.
I don’t personally model academia as (effectively) producing truth or even insight as a particularly high priority.
Ahh – I can understand and sympathize with that!