One of my all-time favorite articles is “The Curse of Xanadu,” by Gary Wolf, which ran in WIRED Magazine in 1995. On the surface, it’s a piece of tech history, a story of a dramatic failure. But look closer, and you can find deep philosophical insight.
Xanadu was a grand vision of a hypertext system, conceived long before the Web, that at the time of this article had been “under development” for three decades without launching. The visionary behind it was Ted Nelson, one of the originators of the concept of hypertext. Here’s how the article describes him and the project:
Nelson’s life is so full of unfinished projects that it might fairly be said to be built from them, much as lace is built from holes or Philip Johnson’s glass house from windows. He has written an unfinished autobiography and produced an unfinished film. His houseboat in the San Francisco Bay is full of incomplete notes and unsigned letters. He founded a video-editing business, but has not yet seen it through to profitability. He has been at work on an overarching philosophy of everything called General Schematics, but the text remains in thousands of pieces, scattered on sheets of paper, file cards, and sticky notes.
All the children of Nelson’s imagination do not have equal stature. Each is derived from the one, great, unfinished project for which he has finally achieved the fame he has pursued since his boyhood. During one of our many conversations, Nelson explained that he never succeeded as a filmmaker or businessman because “the first step to anything I ever wanted to do was Xanadu.”
Xanadu, a global hypertext publishing system, is the longest-running vaporware story in the history of the computer industry. It has been in development for more than 30 years. This long gestation period may not put it in the same category as the Great Wall of China, which was under construction for most of the 16th century and still failed to foil invaders, but, given the relative youth of commercial computing, Xanadu has set a record of futility that will be difficult for other companies to surpass.
The project had many of the earmarks of other failed or long-overdue efforts. As a product, it was over-designed:
Xanadu was meant to be a universal library, a worldwide hypertext publishing tool, a system to resolve copyright disputes, and a meritocratic forum for discussion and debate. By putting all information within reach of all people, Xanadu was meant to eliminate scientific ignorance and cure political misunderstandings. And, on the very hackerish assumption that global catastrophes are caused by ignorance, stupidity, and communication failures, Xanadu was supposed to save the world.
In contrast to the later Web, links in Xanadu did not point to entire documents, but to any arbitrary range of characters within any document. Links were to be bi-directional, so they could not be broken. And there was an advanced feature in which “parts of documents could be quoted in other documents without copying”:
The idea of quoting without copying was called transclusion, and it was the heart of Xanadu’s most innovative commercial feature—a royalty and copyright scheme. Whenever an author wished to quote, he or she would use transclusion to “virtually include” the passage in his or her own document.…
The key to the Xanadu copyright and royalty scheme was that literal copying was forbidden in the Xanadu system. When a user wanted to quote a portion of a document, that portion was transcluded, with a fee charged for every reading.
Transclusion was extremely challenging to the programmers, for it meant that there could be no redundancy in the grand Xanadu library. Every text could exist only as an original. Every user in the world would have to have instant access to the same underlying collection of documents.
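To make the mechanics concrete, here is a minimal sketch of the transclusion idea, assuming a single shared store of originals; the Span and Document classes are my own illustrative inventions, not Xanadu’s actual data model:

```python
# Sketch of transclusion: documents never copy text. Every original
# exists exactly once in a shared store, and a "quoting" document
# holds only pointers (spans) into that store.
from dataclasses import dataclass

STORE: dict[str, str] = {}  # doc_id -> original text, stored exactly once


@dataclass(frozen=True)
class Span:
    doc_id: str  # which original this span points into
    start: int   # character range within that original
    end: int


@dataclass
class Document:
    spans: list[Span]

    def render(self) -> str:
        # Reading resolves each span against the one shared store; a
        # real system would also accrue a royalty for each span read.
        return "".join(STORE[s.doc_id][s.start:s.end] for s in self.spans)


# An author publishes an original; it is stored once, never duplicated.
STORE["nelson-essay"] = "Everything is deeply intertwingled."

# Another author "quotes" a passage by transcluding a character range.
quoting_doc = Document(spans=[Span("nelson-essay", 14, 34)])
print(quoting_doc.render())  # -> "deeply intertwingled"
```

Even this toy version exposes where the difficulty lay: render() only works if every reader, everywhere, can instantly resolve every span against the same universal store.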
The vision for the application of this technology was nothing short of utopian, based on delusions of technological solutions to social and epistemic problems:
… the Xanadu architects became obsessed with developing the widest possible applications of hypertext technology. A universal democratic library, they decided, was only the beginning. Xanadu could also provide a tool for rational discussion and decision making among very large groups. In the Xanadu docuverse, an assertion could always be followed back to its original source. An idea would never become detached from its author. Public discussion on important issues would move forward logically, rather than merely swirling ineffectively through eddies of rhetoric. In fact, any reader could, by creating and following links, freeze the chaotic flow of knowledge and grasp the lines of connection and influence.
The design also blithely ignored the realities of computer performance, given that the team was developing on minicomputers and early workstations:
The Onyx also had 128 Kbytes of RAM, which they later doubled to a screaming 256 Kbytes. Looking back at the specifics of the endeavor, the approach of the Xanadu programmers seems quixotic. [Xanadu collaborator Roger] Gregory and his colleagues were trying to build a universal library on machines that could barely manage to edit and search a book’s worth of text.
The project suffered from infighting and a lack of good management:
“It was not rapid prototyping—it was rabid prototyping,” said one of [Xanadu programmer Michael] McClary’s friends who watched the project closely.
There was never a realistic schedule: the team perpetually believed they were six months away from completion. The project was so badly conceived and managed that it couldn’t ship even after being acquired by Autodesk and given a full budget:
[Autodesk founder] John Walker, Xanadu’s most powerful protector, later wrote that during the Autodesk years, the Xanadu team had “hyper-warped into the techno-hubris zone.” Walker marveled at the programmers’ apparent belief that they could create “in its entirety, a system that can store all the information in every form, present and future, for quadrillions of individuals over billions of years.” Rather than push their product into the marketplace quickly, where it could compete, adapt, or die, the Xanadu programmers intended to produce their revolution ab initio.
“When this process fails,” wrote Walker in his collection of documents from and about Autodesk, “and it always does, that doesn’t seem to weaken the belief in a design process which, in reality, is as bogus as astrology. It’s always a bad manager, problems with tools, etc.—precisely the unpredictable factors which make a priori design impossible in the first place.”
There are too many good quotes in the article to include them all here—read the whole thing.
What struck me most deeply, however, was the response of some of the Xanadu team to the rise of the World Wide Web. You would think that the web would be an object lesson for them—a slap in the face hard enough to wake them from their pie-in-the-sky reverie and bring them back to Earth.
Indeed, one junior programmer on the later team, Rob Jellinghaus—who was born after the Xanadu project had begun (!)—did have such an awakening:
While the Xanaduers paid lip service to libertarian ideals, they imagined a more traditional revolution in which all users would be linked to a single, large, utopian system. But in their quest for a 21st-century model, they created a Byzantine maze.
“There were links, you could do versions, you could compare versions, all that was true,” Jellinghaus reports, “provided you were a rocket scientist. I mean, just the code to get a piece of text out of the Xanadu back end was something like 20 lines of very, very hairy C++, and it was not easy to use in any sense of the word. Not only was it not easy to use, it wasn’t anything even remotely resembling fast. The more I worked at it, the more pessimistic I got.”
The young programmer’s doubts were magnified by his dawning realization that a grand, centralized system was no longer the solution to anything. He had grown up with the Internet—a redundant, ever-multiplying and increasingly chaotic mass of documents. He had observed that users wanted and needed ever more clever interfaces to deal with the wealth of information, but they showed little inclination to obey the dictates of a single company.…
Although he sympathized with the fanaticism of his colleagues, Jellinghaus also began to question whether a hypertext revolution required the perfect preservation of all knowledge. He saw the beauty of the Xanadu dream—“How do you codify all the information in the world in a way that is infinitely scalable?”—but he suspected that human society might not benefit from a perfect technological memory. Thinking is based on selection and weeding out; remembering everything is strangely similar to forgetting everything. “Maybe most things that people do shouldn’t be remembered,” Jellinghaus says. “Maybe forgetting is good.” …
After a couple of months, he began to come to his senses. “What was I doing?” he remembers saying to himself. “This is silly. This was silly all along.”
But here was the reaction of Mark Miller, one of the original developers:
I asked Miller if the Internet was accomplishing his dreams for hypertext. “What the Web is doing is easy,” Miller answered. He pointed out that the Web still lacks nearly every one of the advanced features he and his colleagues were trying to realize. There is no transclusion. There is no way to create links inside other writers’ documents. There is no way to follow all the references to a specific document. Most importantly, the World Wide Web is no friend to logic. Rather, it permits infinite redundancy and encourages maximum confusion. With Xanadu—that is, with transclusion and freedom to link—users would have had a consistent, easily navigable forum for universal debate.
“This is really hard,” Miller said.
And what about Nelson himself?
Nelson’s response to the Web was “nice try.” He said it was a trivial simplification of his hypertext ideas, though cleverly implemented. And he has not entirely given up hope for the old Xanadu code. “I’d like to stress that everyone involved in Xanadu believes that the software is valid and can be finished,” he asserted.
“It will be finished,” Nelson added. “The only question is which decade.”
Miller, Nelson, and the rest of the Xanadu team might have benefitted from reading another one of my all-time favorite articles: Clay Shirky’s “In Praise of Evolvable Systems”.
Shirky begins by pointing out several ways in which the fundamental standards of web technology have seemingly absurd limitations and inefficiencies: HTTP, as Shirky found it, incurred the entire overhead of a new connection for each file transferred; web servers have no built-in load balancing; HTML uses one-directional hypertext links that are easily broken; and so on:
HTTP and HTML are the Whoopee Cushion and Joy Buzzer of Internet protocols, only comprehensible as elaborate practical jokes. For anyone who has tried to accomplish anything serious on the Web, it’s pretty obvious that of the various implementations of a worldwide hypertext protocol, we have the worst one possible.
Except, of course, for all the others.…
The problem with that list of deficiencies is that it is also a list of necessities—the Web has flourished in a way that no other networking protocol has except e-mail, not despite many of these qualities but because of them. The very weaknesses that make the Web so infuriating to serious practitioners also make it possible in the first place. In fact, had the Web been a strong and well-designed entity from its inception, it would have gone nowhere.
Contrasting the “evolvable system” of the web with centrally designed protocols such as Gopher and WAIS, he concludes:
Centrally designed protocols start out strong and improve logarithmically. Evolvable protocols start out weak and improve exponentially. It’s dinosaurs vs. mammals, and the mammals win every time.
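Shirky’s first complaint is easy to see at the wire level. Here is a minimal sketch, with example.com standing in as a placeholder host: under the original HTTP/1.0, the server closes the connection after every response, so each file a page needs pays for its own TCP setup and teardown.

```python
# Sketch of the "new session per file" overhead Shirky describes: in
# HTTP/1.0 the server closes the connection after each response, so
# fetching N files costs N separate TCP connections.
import socket


def fetch(host: str, path: str) -> bytes:
    # One TCP connection per request: connect, ask, read until close.
    with socket.create_connection((host, 80)) as sock:
        sock.sendall(f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
        return b"".join(chunks)  # connection is gone; the next file starts over


# A page with three images costs four connections.
for path in ["/", "/a.png", "/b.png", "/c.png"]:
    response = fetch("example.com", path)
```

Persistent connections were bolted on later, in HTTP/1.1, which is exactly Shirky’s point: the “worse” protocol shipped first and then evolved.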
The Xanadu project still exists. I was able to quickly learn its current status, because it has a homepage on a global hypertext-based information system: https://xanadu.com.
“With ideas which are still radical, WE FIGHT ON. We hope for vindication, the last laugh, and recognition as an additional standard…” It complains that “everyone is hypnotized by the Web browser,” which is “basically crippled.”
The project is no longer entirely vaporware. There are two demo viewers of Xanadocs: “XanaduSpace is our best-looking viewer, our flagship demo—but alas, it’s a stuck demo and can’t go further.” The “new working viewer,” xanaviewer3, makes it “possible (but not easy) for anyone who is determined enough to create a xanadoc, and send it to others, who may open and use it.” One supposes that, in order to view it, the recipients of the document must be equally determined.
“With our limited resources,” they explain, “we can only go slowly, unlike today’s Red Bull–fueled young teams.”
The WIRED article describing Xanadu as running for over 30 years is now 27 years old, meaning Xanadu itself is nearing 60. If its “record of futility” was difficult to surpass back then, it is doubly so now.
The lessons of Xanadu can be learned at multiple levels.
On one level, the lesson is to scope projects realistically and to strive for simplicity of design.
On a deeper level, the lesson is to ship continually. Doing so keeps the schedule honest, forces difficult scope decisions, and allows for feedback from real users.
On a still deeper level, the lesson is to learn from failure, which the vast majority of the Xanadu team does not appear to have done: thirty years of missed deadlines did not cause them to fundamentally question their schedule, project management, or design scope.
But the deepest lesson, I think, is to value real-world results. Nelson and Miller didn’t fail to notice the Web; they failed to care about its success or even to recognize it as a success. Its epic, world-changing status in the history of technology is meaningless to them beside the fantasy system they had dreamed up.
In the end, despite the title of the WIRED article, Xanadu was not, in fact, cursed. It achieved exactly what its originators wanted: theoretical perfection in a Platonic realm of forms so idealized that it can never quite be brought to Earth.
The Xanadu story is indeed a classic, full of interesting lessons. In particular, “ship early” is now a platitude in web software development.
But I also have some sympathy for the developers who still dream of finishing Xanadu: from a purely academic perspective, it would be interesting to see what such a system would look like and whether it could work at all. It appears to have anticipated some form of DRM (digital rights management), which is now routinely used for things like Netflix.
Xanadu reminds me of “Google Wave” from 2009: a quite ambitious (though much less ambitious than Xanadu) project that tried to replace email with something much more dynamic and flexible. Wave didn’t succeed, but I think not because it was too rigid and overengineered compared to email (it arguably wasn’t), but because email was already firmly established. Any alternative didn’t have the network effect on its side.
This leads me to wonder: are there any exceptions to the Xanadu lessons?
One case is perhaps the computer game “Star Citizen.” It’s a completely overambitious crowdfunding project that has been in development for more than a decade. Apparently it is still nowhere near finished, and many think it never will be. But unlike Xanadu, Star Citizen did “evolve” with its users pretty much from the start: they got access to the unfinished game very early on. So evolving dynamically with the user base is apparently not sufficient to keep a project from getting hung up on unrealistic Platonic ideals.
Another case is, perhaps, the invention of the computer. I don’t mean the actual invention(s) during the Second World War, but Charles Babbage’s Analytical Engine a hundred years earlier. He mostly just produced elaborate plans, since he didn’t have the financial means to implement them. So his visionary ambitions didn’t get anywhere; he was mostly forgotten, and computers were set back by a century. But it seems plausible today that his project could well have succeeded at the time if he had had access to a substantial engineering and research team. Some projects, it seems, can only succeed with a ton of antecedent effort, because they simply have no small analogue (Xanadu, by contrast, had the WWW).
But I generally agree: for most projects the Xanadu lessons seem applicable.
I don’t think so. Several computing machines were prototyped in the early 20th century, and none of them really took off until they were made fully electronic (ENIAC). Any system with moving parts is just too slow to be of practical interest.
Back then there was already significant demand for “human computers”: https://en.wikipedia.org/wiki/Computer_(occupation)#Origins_in_sciences I think it is plausible that one of Babbage’s steam-powered machines could have been faster than quite a few people.
Well, as far as I can tell, even the electromechanical computers of the 1930s were not significantly faster than humans using mechanical calculators. That’s why I don’t think it would have worked in Babbage’s day. More details in the middle of this essay: https://rootsofprogress.org/epistemic-standards-for-why-it-took-so-long
Thanks!
My impression was that Wave failed because it was slow and miserable to use. Maybe it would have failed later on for your reason as well, but this was the reason it failed for me.
The great and mighty Google didn’t actually have the ability to make a fairly complex UI for the web that could scale to 100 items. As of today, the craft of UI programming has been lost to all but a few Qt programmers. Google is now gradually rebuilding the art with Flutter, and I think it may succeed; this will have a shocking quantity of downstream consequences.
The only thing I remember about Wave was that it was a collaboration tool, but it was in private beta, and when I got in, I couldn’t invite others to the private beta to collaborate with them. So I poked around a bit and then never came back to it. I suspect many others had the same issue.
Sometimes projects are killed not for deep reasons but just from simple, basic execution failures.
What does Flutter do that various JS/TS frameworks don’t? Does it give design advantages on the web, or is the benefit that it brings web design to non-web applications?
It makes non-web applications possible. It has a better layout system and rendering system, and it animates everything properly. It centers on Dart, which seems to be a pretty good language: it can be compiled ahead of time for faster boots (although I’m not completely sure that TypeScript won’t be basically just as compilable once wasm-gc is up), has better type reflection, will have better codegen (it already supports annotations), has a reasonable import system and better data structures, and has potentially higher performance because the type system is not an afterthought (although TypeScript is still very good relative to Dart).
Reminds me of Arbital.
Fun trivia: Arbital was internally called Project Zanadu before it got its name, as an attempt to keep awareness of these failure modes in mind.
Arbital was functional and fine. It only failed at the adoption stage, for reasons that are still mysterious to me. I’m reluctant to even say that it did fail; I still refer people to a couple of the articles there pretty often.
It doesn’t seem that way to me. As best I can tell, Arbital failed at its own goals: we’re not using it as a repository for knowledge about AI alignment, so by its own objectives it was not working fine.
I dread when I have to look something up in Arbital. It’s slow, which is fixable, but also somewhat confusing to navigate. I actually agree with Tofly’s sentiment: from the outside, it seems like Arbital tried to bite off more than it could chew rather than addressing problems incrementally.
I recall another over-ambitious project from way back, when USENET was the place to discuss everything and spam had not been invented. This was a time when email typically took anything from minutes to days to reach someone, depending on how well-connected both sender and recipient were, and USENET posts likewise. And there was nothing faster.
Unfortunately, I’ve never found any archive of USENET that goes back far enough to find this again, so the following is from memories that must be over 30 years old. I would be interested to know if anyone else remembers.
Someone had the idea of what we nowadays call an MMORPG: a massively multi-player online game of galactic exploration and conquest. Of course, this would not happen in real time. Nothing did in those days except your interaction with the machine directly in front of you, and not always then. No, this would be roughly turn-based. You would submit your moves, and a few days later you would see the outcome. Imagine Eve Online played by postal mail.
The global state of the whole game would be maintained using the same protocols that USENET itself used to distribute articles. (USENET had no centralised store, like a bulletin board would: every message was copied to every participating machine, each machine communicating with a few neighbours on the net to share new messages.) To save on memory, all of the assets would be pseudo-randomly generated, so the whole geography of a planet could be represented by its random seed, as would the distribution of stars across the galaxy. A huge discussion sprang up across several USENET newsgroups, and ran for months, with people posting idea after idea of what to put in the game and how things would work. It was awesome.
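For what it’s worth, the seed trick at the center of that design is real and is now standard in procedural generation: a seeded pseudo-random generator is deterministic, so every machine that knows a planet’s seed can reconstruct identical terrain without any of it crossing the wire. A minimal sketch, with a toy heightmap standing in for the game’s actual assets:

```python
# Sketch of seed-based asset generation: a planet's entire terrain is
# reproducible from one integer, so a USENET-style flood of game state
# only ever has to carry seeds, never the generated data itself.
import random


def planet_heightmap(seed: int, size: int = 8) -> list[list[int]]:
    # A private, seeded RNG makes generation deterministic: every
    # machine that knows the seed computes the identical terrain.
    rng = random.Random(seed)
    return [[rng.randint(0, 99) for _ in range(size)] for _ in range(size)]


# Two machines on the net agree on planet #42 and derive the same map.
assert planet_heightmap(42) == planet_heightmap(42)
```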
To the best of my knowledge, no code for this game was ever written.
I took from this a warning against continually raising one’s sights and never firing through them, dreaming ever greater dreams but never rising from the opium couch, all exploration and no exploitation.
It is possible that some of the people involved might have been inspired by it to make their own games that did see the light of day, although I’ve never heard anyone say so.
Sounds a little like StarWeb? I recently read a lovely article about a similar but different game, Monster Island, which ran from 1989 to 2017.
But yes, my default assumption would be that the particular conversation you’re referring to never resulted in a game that saw the light of day; I’ve seen many detailed game design discussions among people I’ve known meet the same fate.
In How Not to Network a Nation, Benjamin Peters points out that Soviet computer networking projects failed partially due to this same centralizing idealism. Soviet networking projects all saw the ideal network as a central nervous system, connecting productive outposts with the central “brain” in Moscow. By contrast, the developers of ARPANET envisioned their network as a brain with the individual computers as the neurons. The formulation of ARPANET was as a distribution of homogeneous elements, whereas the formulation of the OGAS (the largest Soviet project) was hierarchical and heterogeneous from the start. It’s interesting to see the similarities between all of these unsuccessful networking projects.
I’m in this picture and I don’t like it.
I’ve been trying for years to design (entirely inside my head and in tons of disconnected notes, with no cohesion or organization or any clear idea of how to build prototypes) a… thingy that is very kitchen-sink-ish (it’s a social media network! it makes people rational by prompting them to think about things in ways! it’s a prediction market! it’s an effing MMO!!) and never settles down into a specific defined form… etc.
This is a perennial failure mode of my cognition; I’ve done it with other things too. When I was a young teen and into New Age stuff, I tried to build a grand unified theory of everything as modes of vibration in cosmic consciousness! And then there’s my tendency to try to fit every story idea I have into the same fictional universe...
Oh, but that last one could be fun...
The obvious thing to do, of course, is to figure out why you are trying to design one thing that is all those things, and to see if a simpler approach becomes obvious at that point. The reason could simply be that you like to plan complicated things, in which case settle on a problem that is genuinely complicated and doesn’t need any complications added. Or it could be that your interests change rapidly, in which case focus on a simple but expansive framework, so that you can hit the ground running on whatever your new interest is and achieve something (even just a beautiful thought) in that area before your interests change again. More interesting is if there is some commonality among all the things you switch between; then you can just focus on that part, because that is where your true interest lies.
Obvious, boring, but surprisingly difficult-to-follow advice: practice taking notes, organizing things, and building working projects (which could be as simple as fixing a door that doesn’t close quite right or a floorboard that squeaks, or as complicated as learning a new programming language to write a short program that does one thing), and so on. Planning things successfully is a big deal for people who currently don’t. (I should take my own advice...)
A place where nobody dared to go…
I can’t agree with your proposals for what to learn from this failure, which honestly read like cached thoughts to me. The actual issues outlined are hubris, which is perhaps the oldest known human bias, and an immense focus on centralized coordination, which has caused immense failures throughout human history and which cannot be fixed by fiat.
The lessons aren’t ‘evolve’ and ‘ship’. They are ‘pick an achievable goal’ and ‘let people coordinate amongst themselves’. The former are simply methods aimed at achieving ‘pick an achievable goal’; sometimes they are the right methods, and other times they are not. The achievable goal of the web: ‘follow links’. The web is also obviously not centralized (no matter what the people complaining about social media tell you). Local centers of coordination will pop up organically, without anyone having to solve the hard problem of planning them in totality.
We tend to ‘learn’ what we want to learn, and that mix of hubris and lack of new thought makes it hard to learn when we fail due to our hubris.