You're matching “first AGI will [probably] have internal structure analogous to that of a human” against “first AGI [will probably have] many interacting specialized modules” in a literal (cough uncharitable cough) manner, as evidenced by your “heavier-than-air flying-machines had feathers and beaks” quip. Your phrasing hints at an anthropocentric architectural bias, analogous to the one you specifically distance yourself from regarding values.
Maybe you should clarify that part; it's crucial to the current misunderstanding. It's not clear whether by “interacting specialized modules” you would also include “Java classes not corresponding to anything ‘human’ in particular”, or whether you'd expect something like a “thalamus-module”.
I think people should make more of an effort to pay attention to the nuances of others' statements rather than relying on simple pattern matching.
There’s a great deal to write about this, and I’ll do so at a later date.
To give you a small taste of what I have in mind: suppose you ask, “How likely is it that the final digit of the Dow Jones will be 2 in two weeks?” I've never thought about this question before, so a priori I have no Bayesian prior. What my brain does is amalgamate several pieces:
The Dow Jones index varies in a somewhat unpredictable way.
The last digit is especially unpredictable.
Two weeks is a long time for unpredictable things to happen in this context.
The last digit must be one of the ten values 0 through 9.
The probability that a uniformly random digit from 0 to 9 equals 2 is 10%.
Different parts of my brain generate the different pieces, and another part of my brain combines them. I'm not using a single well-defined Bayesian prior, nor am I maximizing a well-defined utility function.
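The amalgamation above can be checked with a toy simulation. This is only a sketch: it assumes a simple Gaussian random-walk model for the index, and the starting value, step size, and horizon are arbitrary illustrative parameters, not calibrated to real market data. The point is just that with enough unpredictable movement over two weeks, the final digit comes out essentially uniform, so the combined estimate lands near 10%.

```python
import random

def final_digit_distribution(start=35000.0, days=10, step=50.0, trials=100_000):
    """Simulate a toy random walk for an index over `days` trading days
    and tally how often the final value ends in each digit 0-9."""
    counts = [0] * 10
    for _ in range(trials):
        value = start
        for _ in range(days):
            value += random.gauss(0, step)  # daily move; scale is arbitrary
        counts[int(abs(value)) % 10] += 1
    return [c / trials for c in counts]

probs = final_digit_distribution()
# Because the two-week spread dwarfs the digit cycle of 10, every last
# digit is roughly equally likely, so P(final digit == 2) is close to 0.1.
```

Each heuristic piece shows up in the model: the Gaussian steps capture the unpredictability, the ten-day horizon captures “two weeks is a long time”, and the mod-10 tally captures the ten possible digits.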