Consciousness measures an agent’s ability to achieve personal goals in a wide range of environments.
Why use the word “consciousness” to describe this? This seems unrelated to most uses of the term.
This post deserves the downvotes it’s getting—you’re throwing around a lot of new terminology—“specified goals”, “maximal goals”, “personal goals”—without explaining it. You also don’t give a reason for why we should care about your post, beyond “it might be useful to understand some future posts on your personal blog.”
If you have a clear idea of what you mean by “specified goals” vs. “maximal goals”, why not stick to only explaining that—tell us what the difference is, why it matters, why we should care? Start with the basic building blocks instead of trying to encompass the whole of AI and Morality in a couple short posts.
I think that’s the problem here: you’re trying to tackle a huge topic in a few short posts, whereas Eliezer has previously tackled this same topic by posting daily for about a year, slowly building up from the basics.
[Edit: I had copied the wrong definition of “Consciousness” in my quote]
Why use the word “consciousness” to describe this? This seems unrelated to most uses of the term.
Hardly. How else would you describe your consciousness? It’s your personal awareness and it’s focused on a small number of your goals. I’m using simple English in standard ways. There is no tricky redefinition going on here. If I’m going to redefine a term and use it in nonstandard ways (generally a very bad practice contrary to the normal habits of this site) I’ll make sure that the definition stands out—like the definition of the simple distinction between the three terms.
You also don’t give a reason for why we should care about your post
you’re trying to tackle a huge topic in a few short posts,
As far as I can tell, this is contradictory advice. First, you want me to tell you why you should care about the distinction that I am drawing (which basically requires an overview of where I am going with a huge topic); then you hit me going the other way when I try to give an overview of where I’m going. I certainly don’t expect to thoroughly cover any topic of this type in one post (or a small number of posts). Eliezer is given plenty of rope and can build up slowly from the basics, and you implicitly assume that he will get somewhere. From me, you demand a reason to care, and then call it a problem when I try to provide one. How would you handle this conundrum?
(BTW, even though I’m debating your points, I do greatly appreciate your taking the time to make them)
As far as I can tell, this is contradictory advice. First, you want me to tell you why you should care about the distinction that I am drawing (which basically requires an overview of where I am going with a huge topic) then you hit me going the other way when I try to give an overview of where I’m going.
OK, I agree it might be somewhat contradictory.
I think there are two problems:
You’re covering a large and abstract scope which lends itself to glossing over important details and prerequisites (such as clarifying what you mean by the various kinds of goals).
You don’t give us many reasons for paying attention to your approach in particular—will it provide us with new insight? Why is that way of dividing “ways to think about goals” better than another? Is this post supposed to be the basic details on which you’ll build later, or an overview the details of which you’ll fill in later?
On Consciousness: Richard said it better than me; if you just said “an agent’s ability to achieve personal goals in a wide range of environments”, I don’t think people would translate that in their minds as “consciousness”. Contrast your definition with those given on Wikipedia.