Also interested in helping on this—if there’s modelling you’d want to outsource.
Here’s one fairly-standalone project which I probably won’t get to soon. It would be a fair bit of work, but also potentially very impressive in terms of both showing off technical skills and producing cool results.
Short somewhat-oversimplified version: take a finite-element model of some realistic objects. Backpropagate to compute the jacobian of final state variables with respect to initial state variables. Take a singular value decomposition of the jacobian. Hypothesis: the singular vectors will roughly map to human-recognizable high-level objects in the simulation (i.e. the nonzero elements of any given singular vector should be the positions and momenta of each of the finite elements comprising one object).
Longer version: conceptually, we imagine that there’s some small independent Gaussian noise in each of the variables defining the initial conditions of the simulation (i.e. positions and momenta of each finite element). Assuming the dynamics are such that the uncertainty remains small throughout the simulation—i.e. the system is not chaotic—our uncertainty in the final positions is then also Gaussian, found by multiplying the initial distribution by the jacobian matrix. The hypothesis that information-at-a-distance (in this case “distance” = later time) is low-dimensional then basically says that the final distribution (and therefore the jacobian) is approximately low-rank.
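To make the low-rank claim concrete, here's a toy numpy sketch. The jacobian J below is made up for illustration (explicitly constructed as rank r); the point is just that linearized propagation sends the initial covariance Σ₀ to J Σ₀ Jᵀ, so a low-rank jacobian forces the final Gaussian to be degenerate, concentrated in only r directions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 30, 3  # n state variables, hypothetical jacobian rank r

# Made-up rank-r jacobian of final state w.r.t. initial state
J = rng.normal(size=(n, r)) @ rng.normal(size=(r, n))

# Small independent Gaussian noise on each initial variable
cov0 = 0.01**2 * np.eye(n)

# Linearized propagation of the Gaussian: final covariance = J @ cov0 @ J.T
cov_t = J @ cov0 @ J.T

print(np.linalg.matrix_rank(cov_t))  # → 3: final uncertainty lives in only r directions
```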
In order for this to both work and be interesting, there are some constraints on both the system and on how the simulation is set up. First, “not chaotic” is a pretty big limitation. Second, we want the things-simulated to not just be pure rigid-body objects, since in that case it’s pretty obvious that the method will work and it’s not particularly interesting. Two potentially-interesting cases to try:
Simulation of an elastic object with multiple human-recognizable components, with substantial local damping to avoid small-scale chaos. Cloth or jello or a sticky hand or something along those lines could work well.
Simulation of waves. Again, probably want at least some damping. Full-blown fluid dynamics could maybe be viable in a non-turbulent regime, although it would have to be parameterized right—i.e. Eulerian coordinates rather than Lagrangian—so I’m not sure how well it would play with APIC simulations and the like.
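For a feel of the first case: a minimal damped elastic object is just a mass-spring chain with velocity damping. Everything below (constants, integration scheme, chain size) is a made-up sketch, not a recommendation:

```python
import numpy as np

def step(x, v, dt=1e-3, k=100.0, c=5.0):
    # Free-free chain of unit masses: net spring force on each mass is
    # k * (discrete second difference of positions); the -c*v term is the
    # local damping meant to suppress small-scale chaos.
    f = k * np.diff(x, 2, prepend=x[:1], append=x[-1:]) - c * v
    return x + dt * v, v + dt * f  # explicit Euler, fine for a sketch

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20) + 0.01 * rng.normal(size=20)  # jittered rest state
v = np.zeros(20)
for _ in range(20_000):
    x, v = step(x, v)
# with the damping term, the residual motion dies out instead of ringing forever
```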
If you wanted to produce a really cool visual result, then I’d recommend setting up the simulation in Houdini, then attempting to make it play well with backpropagation. That would be a whole project in itself, but if viable the results would be very flashy.
Important implementation note: you’d probably want to avoid explicitly calculating the jacobian. Code it as a linear operator—i.e. a function which takes in a vector, and returns the product of the jacobian times that vector—and then use a sparse SVD method to find the largest singular values and corresponding singular vectors. (Unless you know how to work efficiently with jacobian matrices without doing that, but that’s a pretty unusual thing to know.)
Been a while, but I thought the idea was interesting and had a go at implementing it. Houdini was too much for my laptop, let alone my programming skills, but I found a simple particle simulation in pygame which shows the basics; see below.
Planned next step is to work on the run-time speed (even this took a couple of minutes to run; calculating the frame-to-frame Jacobian is a pain, probably more than necessary) and then add some utilities for creating larger, densely connected objects. I’ll write it up as a fuller post once done.
Curious if you’ve got any other uses for a set-up like this.
Nice!

A couple notes:

Make sure to check that the values in the jacobian aren’t exploding—i.e. there are no values like 1e30 or 1e200 or anything like that. Exponentially large values in the jacobian probably mean the system is chaotic.
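As a sanity check of what "exploding" looks like, the one-variable logistic map already shows it: the time-0-to-time-T derivative is a product of per-step derivatives, which grows exponentially in the chaotic regime and shrinks toward zero in the stable one (the parameter values below are just the standard textbook ones):

```python
def deriv_T(r, x0, T):
    # d x_T / d x_0 for the logistic map x -> r*x*(1-x):
    # by the chain rule, it's the product of the per-step derivatives r*(1-2x).
    x, d = x0, 1.0
    for _ in range(T):
        d *= r * (1 - 2 * x)
        x = r * x * (1 - x)
    return d

print(abs(deriv_T(3.9, 0.2, 200)))  # chaotic regime: astronomically large
print(abs(deriv_T(2.5, 0.2, 200)))  # stable fixed point: essentially zero
```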
If you want to avoid explicitly computing the jacobian, write a method which takes in a (constant) vector u and uses backpropagation to return ∇_{x₀}(x_t · u). This is the transpose of the time-0-to-time-t jacobian applied to u (a vector-jacobian product), but it operates on size-n vectors rather than the n-by-n jacobian matrix, so it should be a lot faster. Then just wrap that method in a LinearOperator (or the equivalent in your favorite numerical library), and you’ll be able to pass it directly to an SVD method.
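Concretely, here's a numpy/scipy sketch. The "simulation" is just repeated multiplication by a fixed matrix A, standing in for one differentiable timestep: the forward sweep gives J v, the reverse sweep (the thing backprop hands you) gives Jᵀ u, and svds only ever sees those two functions:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, svds

rng = np.random.default_rng(0)
n, T = 40, 8
A = 0.9 * rng.normal(size=(n, n)) / np.sqrt(n)  # stand-in for one timestep

def matvec(v):
    # jacobian-vector product J @ v: push v forward through all T steps
    for _ in range(T):
        v = A @ v
    return v

def rmatvec(u):
    # vector-jacobian product J.T @ u: what backprop computes, stepping in reverse
    for _ in range(T):
        u = A.T @ u
    return u

J_op = LinearOperator((n, n), matvec=matvec, rmatvec=rmatvec)

# top singular values/vectors without ever forming the n-by-n jacobian
U, s, Vt = svds(J_op, k=5)
```

In the real project, matvec/rmatvec would call the simulator's autodiff (its JVP and VJP) instead of multiplying by A.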
In terms of other uses… you could e.g. put some “sensors” and “actuators” in the simulation, then train some controller to control the simulated system, and see whether the data structures learned by the controller correspond to singular vectors of the jacobian. That could make for an interesting set of experiments, looking at different sensor/actuator setups and different controller architectures/training schemes to see which ones do/don’t end up using the singular-value structure of the system.
Another little update: the speed issue is solved for now by adding SymPy’s Fortran wrappers to the derivative calculations—calculating the SVD isn’t (yet?) the bottleneck. I can now quickly get results from 1,000+ step simulations of hundreds of particles.
Unfortunately, even for the pretty stable configuration below, the values are indeed exploding. I need to go back through the program and double-check the logic, but I don’t think it should be chaotic; if anything I would expect the values to go to zero.
It might be that there’s some kind of quasi-chaotic behaviour where the residual motion of the particles is extremely sensitive to the initial conditions, even as the macro state is very stable, with a well-defined derivative wrt initial conditions. Not yet sure how to deal with this.
If the wheels are bouncing off each other, then that could be chaotic in the same way as billiard balls. But at least macroscopically, there’s a crapton of damping in that simulation, so I find it more likely that the chaos is microscopic. But also my intuition agrees with yours, this system doesn’t seem like it should be chaotic...