Deep Blue is not conscious either—yet it still predicts possible future chess positions, and makes moves based on its expectation of their future payoff.
Yes indeed, which is why I think it’s much easier to consider it a utility maximiser than to consider organisms ones. It explicitly “thinks about” the value of its position and tries to improve it. Organisms don’t. They just carry out whatever adaptations evolution has given them.
Take the term [expected fitness maximiser] as a behaviourist would. Organisms have sensors, actuators, and processing that mediates between the two. If they behave in roughly the same way as an expected fitness maximiser would if given their inputs, then the name fits.
But I don’t know how a behaviourist would take it. It’s not a term I’m familiar with.
From looking through Google hits, it seems that “expected fitness” is analogous to the “expected value” of a bet, and means “fitness averaged across possible futures”—but organisms don’t maximise that, because they often find themselves in situations where their strategies are sub-optimal. They often make bad bets.
(Deep Blue isn’t a perfect utility maximiser either, of course, since it can’t look far enough ahead. Only a perfect player would be a true maximiser.)
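A minimal sketch, assuming the “expected value of a bet” reading above: for each available action, average the fitness payoff over possible futures weighted by their probabilities, then pick the action with the highest average. All the names and numbers here (Outcome, choose_action, the foraging example) are purely illustrative, not anyone’s actual model of an organism.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    probability: float   # chance this future occurs if the action is taken
    fitness: float       # fitness payoff in that future

def expected_fitness(outcomes: list[Outcome]) -> float:
    """Fitness averaged across possible futures, like the expected value of a bet."""
    return sum(o.probability * o.fitness for o in outcomes)

def choose_action(actions: dict[str, list[Outcome]]) -> str:
    """Pick the action whose expected fitness is highest."""
    return max(actions, key=lambda a: expected_fitness(actions[a]))

# A made-up example: a "safe" forage near cover versus a "risky" forage in the open.
actions = {
    "forage_near_burrow": [Outcome(0.9, 2.0), Outcome(0.1, 0.0)],   # expected fitness = 1.8
    "forage_in_open":     [Outcome(0.5, 5.0), Outcome(0.5, -4.0)],  # expected fitness = 0.5
}

if __name__ == "__main__":
    print(choose_action(actions))  # -> "forage_near_burrow"
```

The behaviourist test from the earlier turn would then be: feed an organism’s inputs into something like choose_action and ask whether the organism’s actual behaviour roughly matches its output—the point above being that it often doesn’t.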