nazgulnarsil, just because you wouldn’t have to call it a belief doesn’t mean it wouldn’t be one; I believe in the Atlantic Ocean even though I wouldn’t usually say so in those words.
It was rather tiresome the way that Lanier answered so many things with (I paraphrase here) “ha ha, you guys are so hilariously, stupidly naive” without actually offering any justification. (Apparently because the idea that you should have justification for your beliefs, or that truth is what matters, is so terribly terribly out of date.) And his central argument, if you can call it that, seems to amount to “it’s pragmatically better to reject strong AI, because I think people who have believed in it have written bad software and are likely to continue doing so”. Lanier shows many signs of being a smart guy, but ugh.