I don’t really have any recommendations, but I do have a story. LSD was actually one of the catalysts that helped me find out about the idea of AGI and the LessWrong community. The story ended up a little long, but I figured I’d share if anyone was interested.
When I was 17, I started experimenting with LSD. I tried successively higher doses until one day I had a very strong experience. I often describe LSD as something that forces you to “drop your filters,” the idea being that we take in so much sensory data at any moment, we can’t possibly process all of it at once. So what we do is we categorize things into more general levels of organization. Instead of looking at each discrete sight quantum (“pixel” for lack of a better word) of a tree, we instead see the general shape, color, texture, and classify it as a “tree” in order to save processing power. LSD inhibits this process, making you look closer and actually see the details. I suspect that this is why some people report having a better appreciation for nature and natural beauty after taking LSD.
As a parallel to this phenomenon, LSD makes you take a closer look at yourself, your actions, and your trajectory in the same fashion. For me, it started all the way at the beginning, at the level of individual atoms and molecules. At the time, I had a basic understanding of physics and evolution. During my trip, I looked at the sand and saw how it moved in the waves, then started chasing the details of what I was seeing all the way down to atoms and molecules (mentally, of course). Then I started visualizing a rough outline of how the first replicators might have come about; how many would be dashed against the rocks and destroyed. But eventually some would get more complicated and have a slightly higher chance of surviving. At this point, it’s all about survival and being able to replicate. Certain variations would make them more likely to survive, but often these would require more energy, like locomotion. Eventually, they would get larger and the energy demands would be greater and they would have to have more purposeful movements to obtain food. Once we’re at the point where they have brains, reward centers would develop to guide their actions to make them more likely to survive and replicate.
While many of my ideas at the time were imprecise, if not entirely wrong, I think this was the first time that I’d intuitively understood the idea of evolution. Eventually I reached the point in the process I was imagining where survival needs are all taken care of, and I realized that people would still have those reward systems in place, and that might not be a good thing.
I started asking myself, “What is the next ‘noble’ step to take?” I remember that was exactly how I phrased the question at the time. I pictured people getting to this stage and then just falling into a cycle of trying to bliss out their reward systems and never advancing. I didn’t yet understand the philosophical distinction between objective and subjective morality. I wondered if there was some “right” thing for people to do once they got to this step. What should happen next?
Well, I never got past that point on my trip, but I had gotten to where I could start asking the right kinds of questions. Once I was back in a sober state of mind, I took to the internet and started searching. Eventually, I found Eliezer’s Levels of Organization in General Intelligence, the technical details of which were over my head, but I learned from it. I was amazed at what I was finding, so I googled the author. That brought me to Yudkowsky.net, which led me to LessWrong and the Sequences. I read through all of them, and they gave me the foundations of my current understanding, along with helping me realize that AGI is quite possibly the most important problem in the world.
I was amazed at how often one of the articles in the Sequences would directly address a question or an uncertainty I had spent a lot of time thinking and wondering about. However, more important than answering the questions, they helped me phrase them the right way.
So my deep wisdom on LSD is that it opens doors, but it won’t get you through them. It can help you see things with fresh eyes, without your filters, and gives the opportunity to reshape your understanding if you choose to.
Wonderful story, and a good explanation of how psychedelics can be used to aid intuition. :)