Do you think that other sources give a different impression?
I was reading Wiener’s own writings, here and here
which give the impression of urgency (a sense that he viewed it as a high-priority and time-sensitive issue).
Wiener’s own writings do not seem to give such an impression of urgency, and I note that he doesn’t appear to have done anything beyond contacting a few union leaders — he did not, for example, lobby politicians directly. Here’s how he described his contact with union leaders:
To arrive at this society, we need a good deal of planning and a good deal of struggle, which, if the best comes to the best, may be on the plane of ideas, and otherwise – who knows? I thus felt it my duty to pass on my information and understanding of the position to those who have an active interest in the conditions and the future of labor, that is, to the labor unions. I did manage to make contact with one or two persons high up in the CIO, and from them I received a very intelligent and sympathetic hearing. Further than these individuals, neither I nor any of them was able to go.
Quoting you again:
Capable of what? Some tasks that previously required the labor of humans of average intelligence have been automated, and others haven’t. There’s still an abundance of jobs for people of average intelligence that pay above minimum wage.
Capable of any job that a human of average intelligence could perform. I thought that’s pretty clear from “However, taking the second revolution as accomplished, the average human being of mediocre attainments or less has nothing to sell that it is worth anyone’s money to buy.”
The first paragraph that you quote gives the impression that he may have (mistakenly) thought that humans were on the brink of developing robotics that are sufficiently sophisticated to replace physical labor.
It seems clear, at least in his later writings (1960, second link above), that he really was thinking of AGI, not just robotics:
Complete subservience and complete intelligence do not go together. How often in ancient times the clever Greek philosopher slave of a less intelligent Roman slaveholder must have dominated the actions of his master rather than obeyed his wishes! Similarly, if the machines become more and more efficient and operate at a higher and higher psychological level, the catastrophe foreseen by Butler of the dominance of the machine comes nearer and nearer.
I was reading Wiener’s own writings, here and here
Thanks.
Wiener’s own writings do not seem to give such an impression of urgency, and I note that he doesn’t appear to have done anything beyond contacting a few union leaders — he did not, for example, lobby politicians directly. Here’s how he described his contact with union leaders:
Based on your quotation, I agree. I was reporting what I had read and didn’t dig deeper into the situation, because I had concluded that the case of Wiener and automation doesn’t have high relevance.
Capable of any job that a human of average intelligence could perform. I thought that’s pretty clear from “However, taking the second revolution as accomplished, the average human being of mediocre attainments or less has nothing to sell that it is worth anyone’s money to buy.”
We have a difference of interpretation. I thought he wasn’t talking about AGI, because AGI could probably replace high-intelligence people too, and he suggests that high-intelligence people wouldn’t be replaced.
It seems clear, at least in his later writings (1960, second link above), that he really was thinking of AGI, not just robotics:
I think that he was writing about narrow AI in his earlier writings, and AGI in his later writings.