The definitions of words are a pragmatic matter; we choose them to make concepts easy to talk about. If the definition of AI were broadened to cover all software, then we would immediately need a new word for “software which is autonomous and general-purpose in a vaguely human-like way”, because that’s a thing which people want to talk about.
When you start thinking about “chauvinism”, with regard to software that exists now, it’s… kind of like if someone were to talk about how people were being mean to granite boulders. I’m just scratching my head about how you came to believe that.
Why not in an AI-like way? Turing’s child-processes are so much closer to us than a rock. Would you care to rephrase?
Able to solve problems in a wide variety of environments.
Software tends to be able to solve only a small set of problems that it was designed for, and even then it needs to be told the problem in a very specific way.
And? This is about not expecting AIs to be like humans, but to be like, well, AIs. Artificial deciders.
I don’t understand your question. Are you saying that my comment wasn’t about AIs being like humans, or are you saying that it doesn’t matter that software is only able to solve the set of problems it was designed for?
I am suggesting that your comment implied, to me, that you still compare AIs with humans a bit too much. We work to make software able to solve the set of problems it was designed for. This applies to Hello World, and to Singleton.
Single-use software has its place, but it’s not exactly singularity-inducing. Each piece of software can only do one thing. If you had a piece of software that could do anything, then you would program that one piece of software, and you and everyone else would be set until the heat death of the universe.
Also, why bother with the word AI? Even if AGI isn’t its own cluster in thingspace, we already have the word “software”. Why replace it?
The more subjects, venues, and experiences in the world you open your eyes to, the more you will see that we are in a smooth, soft takeoff. Right now.
We’ve been using tools to build on tools to get exponential progress for some time. This was happening before computers were invented, and software isn’t the only thing driving it.
I’m not denying the existence of a smooth, soft takeoff. I’m just saying that a fast one would be awesome.
Turns out it was extremely slow, then fast, then suddenly asymptotic. I’m sure you mathletes must know the name for that type of curve.
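If “slow, then fast, then suddenly asymptotic” means levelling off toward a ceiling, the usual name for that shape is a logistic (sigmoid) curve; if it instead means blowing up at a finite time, hyperbolic growth would be the better fit. Here is a minimal sketch of the logistic form, assuming that is the curve being alluded to; the symbols L, k, and t_0 are just the standard parameters, not anything taken from this thread:

% Logistic (sigmoid) curve: slow start, rapid middle, then flattening toward the ceiling L.
% k sets the growth rate and t_0 the midpoint of the rise.
\[
  f(t) = \frac{L}{1 + e^{-k(t - t_0)}}
\]
% For t well below t_0 the curve grows roughly exponentially; for t well above t_0, f(t) approaches L.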
Y / N / Cancel?