I am interested in the GRT from an agent foundations point of view, not because I want to make better thermostats.
An agent with a goal needs to use the means available to it in whatever way will achieve that goal. That is practically the definition of a control system. So you do actually want to build better thermostats, even if you haven’t realised it.
I’m sure that GRT is pretty useless for most practical applications of control theory!
I’m sure that GRT is pretty useless, period.
Reducing entropy is often a necessary (but not sufficient) condition for achieving goals. A thermostat can achieve an average temperature of 25C by ensuring that the room temperature comes from a uniform distribution over all temperatures between −25C and 75C. But a better thermostat will ensure that the temperature is distributed over a narrower (lower-entropy) distribution around 25C.
A worse thermostat will achieve an equally low-entropy distribution around 40C. Reaching the goal is what matters, not precisely hitting the wrong target.
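A minimal sketch of this point, using the differential entropy of uniform distributions (the interval widths here are illustrative, not from the original discussion):

```python
import math

def uniform_entropy(width):
    """Differential entropy (in nats) of a uniform distribution of the given width."""
    return math.log(width)

wide   = uniform_entropy(100.0)  # uniform over [-25, 75] C: mean 25 C, high entropy
narrow = uniform_entropy(2.0)    # uniform over [24, 26] C:  mean 25 C, low entropy
wrong  = uniform_entropy(2.0)    # uniform over [39, 41] C:  mean 40 C, equally low entropy

# The narrow distribution has far lower entropy than the wide one, so the
# better thermostat does reduce entropy. But entropy alone cannot tell the
# 25 C thermostat apart from the 40 C one: their distributions have
# identical entropy, and only the goal distinguishes them.
```

The last two values being equal is exactly the point: minimizing entropy is indifferent to *where* the narrow distribution sits, so it cannot by itself capture goal achievement.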