This is exactly the reason why I asked the initial question. There is a reading of tailcalled’s statement which makes it correct and a reading which makes it wrong. I was curious which meaning was implied, and whether the difference between the two is even understood.
When talking about top performance in highly specific domains, one should indeed use lots of domain-specific tricks. But in the grand scheme of things, the rule of “coherence + contact with the world” is extremely helpful; among other things, it allows one to derive all the specific tricks for all the different domains.
Likewise, there is a sense in which the rationalist-empiricist project didn’t deliver to the fullest of our expectations when solving multiple specific technical problems. On the other hand, it has definitely succeeded in the sense that philosophies based on this approach were so triumphant and delivered so many fruits that we put them in a league of their own called “science”.
This assumes you have contact with all the different domains, which you don’t, rather than just some of them.
Can you give an example of a domain I have no contact with, such that the “coherence + contact with the world” methodology won’t help me figure out the corresponding domain-specific tricks for succeeding in it, even though such tricks exist in principle?
Farming, law enforcement, war, legislation, chip fabbing, space colonization, cargo trucking, …
Space colonization obviously includes cargo trucking, farming, legislation, chip fabbing, law enforcement, and, for appreciators, war.
But I don’t think you are doing space colonization. I’d guess you are doing reading/writing on social media, programming, grocery shopping, cooking, … . And I think recursive self-improvement is supposed to work with no experience in space colonization.
The meaning of my comment was “your examples are very weak at proving the absence of cross-domain generalization”.
And if we are talking about me, right now I’m doing statistics, physics and signal processing, which seem awfully generalizable.
I can buy that there’s a sort of “trajectory of history” that makes use of all domains at once, I just think this is the opposite of what rationalist-empiricists are likely to focus on.
This is precisely the position that I am referring to when I say “the assumption was that the world is mostly homogeneous”. Like physics is generalizable if you think the nature of the world is matter. And you can use energy from the sun to decompose anything into matter, allowing you to command universal assent that everything is matter. But does that mean matter is everything? Does your physics knowledge tell you how to run a company? If not, why say it is “awfully generalizable”?
I don’t see how it makes sense in the context we are talking about.
Let’s take farming. Clearly, it’s not some separate magisterium that I have no connection to. Farming happens in the same reality. I can see how people farm things, do it myself, learn about different methods, run experiments myself, and so on. “Coherence + contact with the world” seems to be very helpful here.
I think of “the rationalist project” as “having succeeded” in a very limited and relative sense that is still quite valuable.
For example, back when the US and Chinese governments managed to accidentally make a half-cocked bioweapon and let it escape from a lab, then did no adequate public health at all, nor held the people who caused the megadeath event even slightly accountable, and the institutions of basically every civilization on Earth failed to do their fucking jobs, the “rationalists” (i.e., the people on LW and so on) were neck and neck with anonymous anime catgirls on twitter (who overlap a lot with rationalists in practice) in terms of being actually sane and reasonable voices in the chaos… and it turns out that having some sane and reasonable voices is useful!
Eliezer says “Rationalists should win”, but Yvain said “it’s really not that great”, and Yvain got more upvotes (90 vs 247 currently), so Yvain is prolly right, right? But either way it means rationality is probably at least a little bit great <3