I certainly think you’re right that the conscious mind and conscious decisions can, to a large extent, rewrite a lot of the brain’s programming.
I am surprised to hear that you think most rationalists don’t think that. (That sentence is a mouthful, but you know what I mean.) A lot of rationalist writing is devoted to working out ways to do exactly that; a lot of people have written about how just reading the Sequences helped them basically reprogram their own brains to be more rational in a wide variety of situations.
Are there a lot of people in the rationalist community who think that conscious thought and decision making can’t do major things? I know there are philosophers who think that maybe consciousness is irrelevant to behavior, but that philosophy seems very much at odds with LessWrong-style rationality and the way people on LessWrong tend to think about and talk about what consciousness is.
Are there a lot of people in the rationalist community who think that conscious thought and decision making can’t do major things?
It’s not that they think it cannot do major things at all. They don’t expect to be able to do them overnight, and yes, “major changes to subconscious programming overnight” is one of the things I’ve seen to be possible if you hit the right buttons. And of course, if you can do major things overnight, there are some even more major things you find yourself able to do at all that you couldn’t before.
This might be a violation of superrationality. If you hack yourself, in essence a part of you is taking over the rest. But if you do that, why shouldn’t part of an AI hack the rest of it and take over the universe?