Most scientific definitions should try to be short and sweet. Definitions that include a description of the human mind are ones to eliminate.
Here, the idea that purpose is a psychological phenomenon is exactly what was intended to be avoided—the idea is to give a nuts-and-bolts description of purposefulness.
Re: defining “mind”—not a big deal. I just mean a nervous system—so a dedicated signal-processing system with I/O, memory and processing capabilities.
Any nervous system? That seems like a bad idea. Is a standard neural net trained to recognize human faces a mind? Is a hand calculator a mind? Also, how does one define having memory and processing capabilities? For example, does an abacus have a mind? What about a slide rule? What about a Pascaline or an Arithmometer?
I just meant “brain”. So: calculator—yes, computer—yes.
Those other systems are rather trivial. Most conceptions of what constitutes a nervous system run into the “how many hairs make a beard” issue at the lower end—it isn’t a big deal for most purposes.
Hm. Which one is it? ;-)
So, a thermostat satisfies your definition of “mind”, so long as it has a memory?
Human mind: complex. Cybernetic diagram of minds-in-general: simple.
A thermostat doesn’t have a “mind that predicts the future”. So, it is off the table in the second definition I proposed.
Dude, have you seriously not read the sequences?
First you say that defining minds is simple, and now you’re pointing back to your own brain’s inbuilt definition in order to support that claim… that’s like saying that your new compressor can compress multi-gigabyte files down to a single kilobyte… when the “compressor” itself is a terabyte or so in size.
You’re not actually reducing anything, you’re just repeatedly pointing at your own brain.
Re: “First you say that defining minds is simple, and now you’re pointing back to your own brain’s inbuilt definition in order to support that claim… ”
I am talking about a system with sensory input, motor output and memory/processing. Like in this diagram:
http://upload.wikimedia.org/wikipedia/commons/7/7a/SOCyberntics.png
That has nothing specifically to do with human brains—it applies equally well to the “brain” of a washing machine.
Such a description is relatively simple. It could be presented to Martians in a way they could understand without access to any human brains.
That diagram also applies equally well to a thermostat, as I mentioned in a great-great-grandparent comment above.
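To make the thermostat point concrete, here is a minimal sketch (my own illustration, not from the thread) of the cybernetic loop being discussed: sensory input feeds memory/processing, which drives motor output, and the environment closes the feedback loop. The class and function names are hypothetical.

```python
class Thermostat:
    """Senses temperature, remembers a setpoint, outputs heater on/off."""

    def __init__(self, setpoint):
        self.setpoint = setpoint      # the only "memory" the system has
        self.heater_on = False        # motor output state

    def step(self, sensed_temp):
        # processing: compare sensory input against the stored setpoint
        self.heater_on = sensed_temp < self.setpoint
        return self.heater_on


def simulate(thermostat, temp, steps):
    """Toy environment model: heater warms the room, otherwise it cools."""
    history = []
    for _ in range(steps):
        heating = thermostat.step(temp)   # sense -> process -> act
        temp += 1.0 if heating else -0.5  # environment responds
        history.append(round(temp, 1))
    return history


print(simulate(Thermostat(setpoint=20.0), temp=18.0, steps=6))
```

Run with any starting temperature and the room oscillates around the setpoint—which illustrates the disagreement above: the diagram fits the thermostat exactly as well as it fits a brain, so by itself it does not separate minds from simple controllers.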