16 years old. I’m interested in AI alignment, rationality and philosophy, economics, and politics.
Crazy philosopher
I realized something important about psychology that is not yet publicly known, or that is very little known relative to its importance (60%). I don’t want to publish this as a regular post, because it may greatly help the development of AGI (40% that it helps, 15% that it helps greatly), and I would like to help only those who are trying to create an aligned AGI. What should I do?
For a joke to be funny, you need a “wow effect,” where the reader quickly connects a few pieces of evidence. But go on! I’m sure you can do it!
This is a good philosophical exercise: can you define “humor” well enough to make a good joke?
The probability of the existence of the whole universe is much lower than that of a single brain, so most likely we are Eliezer’s dream.
Guessing the Teacher’s Password: Eliezer?
To model the actions of the evil genius in the book, Eliezer imagines that he is evil.
ok thanks
Moloch’s Army
We worship what brings success. Therefore, crime bosses worship power, philosophers from LessWrong worship intelligence, and middle managers worship Moloch. And just as we are ready to be curious even when it is not optimal in a particular case, and to persuade others to be curious, middle managers will spread the “cult of Moloch.” It is the same psychological mechanism.
The fascist project was an attempt to turn national politics into a maze. Fascism consists of creating the most competitive state possible, and of making sure the individual parts of the nation do not fight each other, draining resources that could instead be directed at fighting other nations. That is literally all fascism is. And indeed, at first the fascist states, which directed most of their economies toward the army (competitiveness), won, and began to form alliances with each other in order to fight together against those who did not worship Moloch. Many countries even began to turn fascist only because the fascist states of that epoch were strong and an alliance with them was beneficial. But, thanks to coordination, we were able to reverse this process. This happened because we had all the information about the politics and the wars (in a corporation, the boss may not even notice conflicts between subordinates, or a maze winning at some level).
Moral: to resist mazes, we need to have as much information as possible about the psychology of the other members of the company, and to coordinate with other opponents of mazes against the Moloch cultists.
Professional sport is a maze in the sense that competition there is enormous, and if you want to reach the professional level, you will have to sacrifice all the health and personal time that it requires.
Let me reformulate this essay in one paragraph:
Glomarization is good, but sometimes we can’t use it, because others don’t understand the principle of Glomarization, or because you have too many counterfactual selves and some of them won’t like simply telling the truth. Therefore, when you are asked about Jews in the attic, it is acceptable to lie; but when you are asked whether you would lie about Jews in the attic, you must ALWAYS tell the truth. So meta-honesty is just a way to use Glomarization as often as you want.
So it shouldn’t be surprising if acting like you have more status than I assign to you triggers a negative emotion, a slapdown response.
I think there’s a different mechanism here. I don’t like it when Mr. A can’t do X, doesn’t know it, publicly announces that he’s going to do X, and gets a lot of prestige up front. At the same time, I understand that he will not succeed and should not get that prestige. And after A fails, it lowers my opinion of everyone who claims they can do X without any experience.
Imagine that some philosopher announces that he is going to create an aligned AGI in a month, after which everyone begins to admire him. That’s exactly the feeling.
In other words, the problem is not that Mr. A doesn’t have enough prestige, but that he doesn’t have enough chances to succeed.
… but even if Mr. A decides to create an aligned AGI in a month without announcing it publicly, you will wisely say, “This is impossible. I once also thought I could do it in a month, but it isn’t so.” Wait: this is the “juggling 3 balls is impossible” reaction!
What I understood: most of the exclamations “you don’t have enough experience / look at yourself from the outside / it’s not possible” from experts in the domain are true. I mean, if you decide to do X, but all the experts in the domain say that you will not succeed, this is quite strong Bayesian evidence that you will not succeed. You can’t dismiss it by deciding that they’re just protecting their status.
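A toy application of Bayes’ rule (the numbers below are invented purely for illustration, not taken from this comment) shows how large this kind of update can be:

```python
# Hypothetical numbers, chosen only to illustrate the update.
prior_success = 0.30  # your prior that you'll succeed at X

# How often would all the experts object, in each world?
p_objection_given_success = 0.20  # experts rarely all object when you'd succeed
p_objection_given_failure = 0.90  # they usually object when you'd fail

# Bayes' rule: P(success | experts object)
numerator = p_objection_given_success * prior_success
denominator = numerator + p_objection_given_failure * (1 - prior_success)
posterior_success = numerator / denominator

print(round(posterior_success, 3))  # → 0.087
```

Under these made-up numbers, unanimous expert objection drags a 30% prior down below 9%, which is why it can’t simply be waved away as status protection.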
But otherwise I agree with Eliezer.
Sometimes, maybe you don’t have time for friends to let you know. You’re living an hour away from a wildfire that’s spreading fast. And the difference between escaping alive and asphyxiating is having trained to notice and act on the small note of discord as the thoughts flicker by:
“Huh, weird.”
Our civilization lives an hour away from a dozen metaphorical fires, some of which no living person has seriously thought about.
We have a lot of people showing up, saying “I want to help.” And the problem is, the thing we most need help with is figuring out what to do. We need people with breadth and depth of understanding, who can look at the big picture and figure out what needs doing.
Figure out how best to spread rationality, or at least ideas about X-risks. This is quite possible even with our resources at essentially zero, and if we can spread these ideas to, for example, 20% of the population, it will greatly help us in the fight against X-risks. In addition, we will have more people who will help us… to think about what we should do, lol.
1) “I think we call this ‘taxes’.”
So I invented taxes for charitable donations.
2) The second option is better for most participants, but not for everyone, you are right
Joint mandatory donation as a way to increase the number of donations
This is a very useful article that helped me understand many things about myself and society. Thanks!
Okay, I’ll rewrite the post. Thanks for your answers!
That’s true, but Program B will still be worse than a human-written program, so we aim to avoid spaghetti towers.
Spaghetti towers work especially poorly in changing environments: if evolution were reasonable, it would make us directly try to maximize the number of our genes in the next generation. But instead, it created several heuristics like hunger and the desire for groin friction. So when people invented civilization, we started eating fast food and having sex with condoms.
People at simulacra level 4 can praise their political allies.
I understand. My question is: can I publish an article about this so that only the MIRI guys can read it, or send it to Eliezer by e-mail, or something?