For the Feynman quote: I don’t think Feynman was a programmer! And chaos theory came late in his life. Incomprehensibly complex behaviour from simple rules is not surprising to me.
Feynman certainly wasn’t a professional programmer and I don’t believe he spent much time writing software, but he worked for a while as a consultant to Thinking Machines Corporation and gave a (published, rather good) set of Lectures on Computation.
So if he doesn’t sound like a programmer, it may be for reasons other than not being familiar with the insights that come from spending time making software.
(I don’t think Feynman had any more difficulty than you in believing that simple rules produce very complex behaviour. I think he was trying to express how most people feel about it.)

And Feynman was very good at making people understand him. His lectures are well known for that.
I thought I was reasonably comfortable with basic parallel programming… then I spent a few hours talking with a very smart woman who specialized in low-level parallel programming (close to the silicon) about the kinds of things that can go wrong.
It’s like clicking into an article expecting some light reading and finding yourself staring into a Lovecraftian madness-inducing abyss filled with 5-D monsters.
Something like dealing with synchronization and locking instead of just “and then we send this command to the mapper, and a lot of executors work in parallel”?
No, I was comfortable with locking and the sort of stuff you’ll routinely see in high-level languages, where the complexity is locked away behind abstractions.
I’m talking about the “fun” of physical cores which few sane programmers touch.
Though most programmers outside of chip companies can’t touch those levels nowadays, since even assembly is treated like a high-level language.

http://blog.erratasec.com/2015/03/x86-is-high-level-language.html#.VvQhiOaAnSg
I had something of a similar experience writing software for a company that designed its own chips. I was writing in C, and occasionally, the programs had errors that I couldn’t debug. At that point I’d dive into the Verilog for the chip and see how it was all wired up. And often that would be the problem.
It’s a commonplace that the thing that determines the meaning of a computer program is itself a computer program. But it’s still weird to get bitten by that fact when you’re writing C.
The chip is a program, which determines the meaning of the machine code, which is a program produced by the C compiler, which is a program interpreting the program I’m writing. And whether you look at the chip as a compiled version of the Verilog (compiled by another program, with silicon as the target), or as a logical program which you’re proving things about in your head, or as a program being interpreted by a simulator, that’s a lot of turtles…
Perhaps ‘laws’ would have been a better word than ‘rules’.
I was thinking of it more in terms of complexity. When things are looked at in isolation, it is much easier to see how the simple laws apply. But as things get more complex, we also need to figure out how the different systems interact and influence each other. This makes the simple laws harder to discern.
Simple systems have few components and their behavior is in all respects fully understandable and predictable. An example would be a solid ball falling under the action of gravity through air. This simple system consists of the ball, the air, and the gravitational force. Here we usually assume a single ball, constant acceleration of gravity, a viscous drag on the ball, and Newton’s laws. When making these assumptions, we arrive at very useful answers. We did, however, neglect many aspects. If, for example, we would ask how the behavior changes when we go from one ball, to two, to three, or even more balls that fall close to each other, our “Simple System” assumption fails. It is not sufficient to generalize from one ball’s behavior to many. Instead we need to consider the interaction of the balls through their self-generated vortices.