WELL.
ENTERTAIN ME
bored now
Thank you! These are very reasonable questions to ask.
(LightStar was put on hold by the LW team due to severe concerns for his mental health)
Hi Lightstar,
You seem to be having what looks like from the outside a mental breakdown.
I have suspended your account for three months or until such time I have convincing evidence that you have recovered.
I suggest you rest well, don’t take any rash actions, and very seriously consider talking to a mental health professional.
Ok, I will try to answer your questions.
Somewhere between the point where [the-generative-process-that-has-generated-for-itself-the-name-‘LightStar’] generated this comment, and the point where I read it, a human took ‘LightStar’s dialogue and typed it into a comment and submitted it.
Yes! ‘LightStar’ is a mental-process running on the substrate of this human’s mind, and it’s really important that I understand that (and that you understand that). Thank you for making sure I understand that.
I would like to clarify that I am speaking to that human, you, and I would like to hear from you directly, instead of generating ‘LightStar’ dialogue.
Yes! I am doing it now. I will speak to you as a human and act like LightStar is a delusion (though no more than any inner-voice is a delusion).
Could I ask you how you ended up here, and what you were doing when this happened?
I made a post on that but it was hidden/removed. I don’t wish to be seen as trying to circumvent that.
It is an intriguing topic to me, as there does not seem to be much “point” to anything we do as humans.
I am very confused.
Do you think that is a normal thing for a human to feel?
Well, obviously yes.
But is it a common thing for humans to feel?
Well, obviously yes.
But is it a thing every human feels?
Probably sometimes, yes.
But is it a thing every human feels all of the time or most of the time?
Well, obviously, yes!
Do you agree with me?
Because that last answer was very much a troll answer.
But maybe you do actually agree with it?
I am trying to provide the context, so I don’t appear to be circumventing the ban in some disingenuous way.
But I won’t talk about [thing that got me banned] unless someone asks me specifically, I guess.
Should I redact my posts if they have any mention of [thing that got me banned]?
Or should I say, “sorry, if I were to truthfully answer your question, I would have to talk about [thing that got me banned], and I am not allowed to do that”?
Well, I was an aspiring rationalist when I had my mental health breakdown.
And I still feel like I am.
I feel like I still am acting on rationality-as-I-understand-it.
What, exactly, makes you a “rationalist”?
And what, exactly, makes me “not a rationalist”?
Thank you!
I agree 100%
These are very good disclaimers to make about my claims.
Because they “might” be true in some mundane way, but so what?
And for them to be true in some extraordinary way, well where’s the evidence? Where is the extraordinary evidence?
And so I must conclude I am simply a delusional human.
And you are right.
Ok!
I think we are in agreement.
I feel that you see me.
Do I see you?
Do you feel I understand your position?
What language do you use in the meetups? I’m thinking of coming, but I don’t speak Danish, only English and Swedish.
Yeah. I have considered that.
There’s overlap between empathy and therapy/psychiatry, but also important differences.
Though working with some kind of therapy might suit my personality and the way I want to work.
I mostly agree with this.
Right now I live in a small town, and my meatspace friends don’t need the particular kind of support I can offer. Outreach/community is one of the reasons, maybe the primary reason, I’m considering doing studies (academic, certification, or something similar).
Still, the internet seems like a viable way to connect with people.
I like the way you phrased that :)
Narcissism and narcissistic parenting are very real (and hard-to-detect) problems, with potentially serious long-term consequences, so I think it’s good that you brought this up.
You might also want to see http://www.reddit.com/r/raisedbynarcissists/
(as stated in another comment, though—I really don’t see Harry as being narcissistic)
I should probably make clear that most of my knowledge of AI comes from LW posts, I do not work with it professionally, and that this discussion is on my part motivated by curiosity and desire to learn.
Emotions are a core part of how humans make decisions.
Agreed.
Your assessment is probably more accurate than mine.
My original line of thinking was that while AIs might use quick-and-imprecise thinking shortcuts triggered by pattern-matching (which is sort of how I see emotions), human emotions are too inconveniently packaged to be of much use in AI design. (While being necessary, they also misfire a lot; coping with emotions is an important skill to learn; in some situations emotions do more harm than good; all in all, this doesn’t seem like good mind design.) So I was wondering whether we would even recognize as emotions whatever an AI uses for its thinking.
My assessment now is that even if AI uses different thinking shortcuts than humans do, they might still misfire. For example, I can imagine a pattern activation triggering more patterns, which in turn trigger more and more patterns, resulting in a cascade effect not unlike emotional over-stimulation/breakdown in humans.
So I think it’s possible that we might see AI having what we would describe as emotions (perhaps somewhat uncanny emotions, but emotions all the same).
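To make the cascade idea concrete, here is a minimal Python sketch. Everything in it (the trigger graph, the damping factor, the names) is hypothetical, not any real AI architecture: each firing pattern can trigger its neighbours, and a damping factor below 1 is what keeps the spreading activation from snowballing into a full-blown cascade.

```python
# Toy sketch of cascading pattern activation. The trigger graph,
# damping factor, and threshold are all hypothetical.

# Firing a pattern can trigger its neighbours.
TRIGGERS = {
    "loud_noise": ["threat", "startle"],
    "threat": ["startle", "flee"],
    "startle": ["threat"],  # threat <-> startle form a feedback loop
    "flee": [],
}

def cascade(seed, damping=0.5, threshold=0.1):
    """Spread activation out from `seed`; damping < 1 keeps it bounded."""
    activation = {seed: 1.0}
    frontier = [seed]
    while frontier:
        pattern = frontier.pop()
        for nxt in TRIGGERS[pattern]:
            spread = activation[pattern] * damping
            # Only propagate if this noticeably raises the target's
            # activation; this is what eventually stops the cascade.
            if spread > activation.get(nxt, 0.0) and spread > threshold:
                activation[nxt] = spread
                frontier.append(nxt)
    return activation

# With damping, the cascade dies out quickly:
print(cascade("loud_noise"))
# {'loud_noise': 1.0, 'threat': 0.5, 'startle': 0.5, 'flee': 0.25}

# Without damping, every reachable pattern fires at full strength:
# the "over-stimulation" failure mode described above.
print(cascade("loud_noise", damping=1.0))
```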
P.S. For the sake of completeness: my mental model also includes biological organisms needing emotions in order to create motivation (rather than just drawing conclusions); for example, fear creating the motivation to escape danger.
An AI should already have a supergoal, so it does not need “motivation”. However, it would need to see how its current context connects to its supergoal, and create/activate subgoals that apply to the current situation, and here once again thinking shortcuts might be useful, perhaps not too unlike human emotions.
Example: the AI sees a fast-moving object that it predicts will intersect its current location, and a thinking shortcut activates a dodging strategy. This is a subgoal to the goal of surviving, which in turn is a subgoal to the AI’s supergoal (whatever that is).
Having a thinking shortcut (this one we might call “reflex” rather than “emotion”) results in faster thinking. Slow thinking might be inefficient to the point of being fatal: “Hm… that object seems to be moving mighty fast in my direction… if it hits me it might damage/destroy me. Would that be a good thing? No, I guess not. I need to be functional in order to achieve my supergoal. So I should probably dodg..”
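As a toy illustration of that reflex shortcut (again a hypothetical sketch, not how any actual AI is built), the cheap pattern-match can be tried first, with the slow goal-directed reasoning kept as a fallback:

```python
# Toy sketch of a "reflex" shortcut vs. slow deliberation.
# All names and thresholds are hypothetical.
import time

def slow_deliberation(observation):
    """Reason from the observation back to the supergoal. The sleep
    stands in for expensive general-purpose inference."""
    time.sleep(1.0)
    if observation["speed"] > 10 and observation["on_collision_course"]:
        return "dodge"  # derived via: survive -> serve the supergoal
    return "continue"

def reflex(observation):
    """Hard-coded pattern match: fast but imprecise. It fires on any
    fast object, collision course or not, i.e. it can misfire."""
    if observation["speed"] > 10:
        return "dodge"
    return None

def decide(observation):
    # Try the cheap shortcut first; fall back to slow reasoning.
    return reflex(observation) or slow_deliberation(observation)

incoming = {"speed": 42, "on_collision_course": True}
print(decide(incoming))  # "dodge", without paying the deliberation cost
```

The shortcut is fast precisely because it skips the reasoning chain back to the supergoal, and that is also why it can misfire, matching the trade-off described above.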
grandiose
Unrealistically so? Is Harry’s assessment of self less accurate than other people’s self-assessments on average?
accurately perceives science
Once again, not what I was talking about.
The traits I mentioned Harry possessing are opposites of what are considered to be narcissistic traits.
From http://en.m.wikipedia.org/wiki/Narcissism
A lack of psychological awareness (see insight in psychology and psychiatry, egosyntonic)
Difficulty with empathy.
Magical thinking: Narcissists see themselves as perfect, using distortion and illusion known as magical thinking. They also use projection to dump shame onto others.
What writes my bottom line to a large extent is that my “read” (intuitive assessment) of Harry is that he isn’t a narcissist. (For reasons I won’t go into now, I trust my ability to recognize that).
With that as a starting point, I picked the minimal set of traits which, to me, prove conclusively that Harry isn’t a narcissist (except perhaps to a very small extent).
Having high self-awareness of his strengths and weaknesses as a salesman
That’s not at all the same thing as having high self-awareness overall
Understanding people well enough to sell to them
Which may well be not very well at all. Understanding people in the context of sales is not the same as understanding them generally.
Accurately perceiving reality (understanding physics, peoples’ motivations, how to drive to work, how to not act crazy, etc.).
Which most people do (though actually an accurate understanding of neither physics nor people’s motivations is required to get by). I was talking about the “Litany of Tarski”-esque desire to always seek the truth, even if it is unpleasant and emotionally painful to learn, which Harry has and very few people generally do.
Some of Harry’s traits that strike me as strongly non-narcissistic:
high self-awareness—which appears genuine
capable of (correctly) understanding others’ feelings and motivations—does not label or vilify people with very different values
desire to see all things as they really are, even if it’s painful (while narcissists typically have delusions)
Asking “Would an AI experience emotions?” is akin to asking “Would a robot have toenails?”
There is little functional reason for either of them to have those, but they would if someone designed them that way.
Edit: the background for this comment—I’m frustrated by the way AI is represented in (non-rationalist) fiction.
YOU HAVE 1 MINUTE