Hello, I am asking for some insights for research I am doing. Can you cite examples of technologies that have been forgotten? What I mean by “forgotten” is not things we no longer know how to do but once did (I suspect there aren’t that many), nor things that were once in use but no longer are (mechanical television), but things that were decently developed (either in theory or in practice) yet never “saw the light of day” anyway.
It’s my first time posting, so I won’t do much policing of the answers. Thanks in advance.
I would argue that spaced repetition is one such technology. We’ve known about forgetting curves and spaced repetition as a way of efficiently memorizing data since at least the ’60s, if not before. Yet, even today, it’s hardly used, and if you talk to the average person about spaced repetition, they won’t have a clue what you’re referring to.
Here we have a really cool technology, which could significantly improve how we learn new information, and it’s being used maybe 5% as often as it should be.
It really needs a personal computer to schedule the repetitions, and we’re only now getting to the point where every schoolchild having their own handheld computer is a somewhat practical proposition.
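For anyone curious what “scheduling the repetitions” actually involves, here is a minimal sketch in Python of an SM-2-style scheduler (the algorithm family popularized by SuperMemo and used, in modified form, by Anki). The `Card` fields and constants follow the published SM-2 description; treat it as an illustration, not any particular tool’s implementation.

```python
from dataclasses import dataclass

@dataclass
class Card:
    interval: float = 1.0  # days until the next review
    ease: float = 2.5      # growth factor applied to the interval
    repetitions: int = 0   # consecutive successful recalls

def review(card: Card, quality: int) -> Card:
    """Update a card after one review; quality runs 0 (blackout) to 5 (perfect)."""
    if quality < 3:
        # Failed recall: the card starts over with short intervals.
        card.repetitions = 0
        card.interval = 1.0
    else:
        if card.repetitions == 0:
            card.interval = 1.0
        elif card.repetitions == 1:
            card.interval = 6.0
        else:
            card.interval *= card.ease
        card.repetitions += 1
    # Ease drifts with recall quality (the SM-2 update), floored at 1.3.
    card.ease = max(1.3, card.ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return card
```

The point is just that each card’s next due date depends on your history with that specific card, which is why a computer makes it so much easier (the paper-based Leitner box is the low-tech approximation).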
The Pimsleur series of language courses is audio-only, and it uses spaced repetition (among other research-backed techniques) without a computer. There’s an app now, but the original tapes would work on a Walkman. You’re supposed to do one lesson per day; the material is scheduled to bring vocabulary words back up just when you’re about to forget them.
Still worse than a computer, though, since the tapes can’t take feedback on which words you’ve learned better. The fixed schedule only works if your learning rates for different words match what the tape maker expected.
Also, this won’t work for the tail end of spaced repetition, where a well-practiced card might pop up a year after it was last reviewed; the long-lived cards are going to be a very eclectic mix. Then again, school courses usually don’t expect you to retain the material past the end of the course, so this isn’t that much of a shortcoming for education.
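For concreteness, the schedule Pimsleur published (in his 1967 paper “A Memory Schedule”) is just a fixed ladder of roughly 5x-growing delays. Here is what that “no feedback” scheduling reduces to, with the intervals as usually quoted from the paper:

```python
# Pimsleur's graduated-interval ladder, as usually quoted from his 1967 paper;
# each delay is roughly 5x the previous one, and nothing about it adapts.
PIMSLEUR_INTERVALS_SECONDS = [
    5,              # 5 seconds
    25,             # 25 seconds
    2 * 60,         # 2 minutes
    10 * 60,        # 10 minutes
    60 * 60,        # 1 hour
    5 * 60 * 60,    # 5 hours
    1 * 86_400,     # 1 day
    5 * 86_400,     # 5 days
    25 * 86_400,    # 25 days
    120 * 86_400,   # ~4 months
    730 * 86_400,   # ~2 years
]

def next_delay(successful_reviews: int) -> int:
    """Seconds until the next review. Every learner, every word: same ladder."""
    index = min(successful_reviews, len(PIMSLEUR_INTERVALS_SECONDS) - 1)
    return PIMSLEUR_INTERVALS_SECONDS[index]
```

Compare with the adaptive sketch above: here the tape (or app) can only assume an average forgetting rate, which is exactly the limitation being pointed out.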
There’s a large class of viable pharmaceuticals that never see the light of day because they can’t be patented, so companies won’t fund the clinical trials needed to clear regulatory approval.
Could you cite any? Or at least point me at some research/source on the subject?
The Smalltalk programming language and environment was revolutionary at the time and is still highly influential today. Lots of later languages have copied some of its features, but none of them really got it right.
The grammar is extremely simple and easy to pick up compared to most industry languages. A famous small program demonstrating all of the language (but not the library) fits on a postcard.
Using the debugger, you can catch an exception, walk up the stack, and correct, recompile and swap in individual methods while the program is still running. You can save the entire state of the program at any time and resume it at a later time, even on another machine. You need an entire OS in a VM to do this in almost any other language.
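Python can only loosely approximate that workflow, but here’s a sketch of the core trick, swapping a corrected method into a live program without restarting it (an analogy for illustration, not how Smalltalk itself is implemented):

```python
class Greeter:
    def greet(self) -> str:
        return "Helo"  # a bug, discovered while the program is running

g = Greeter()
print(g.greet())  # -> "Helo"

# Write the corrected method and swap it into the live class.
def fixed_greet(self) -> str:
    return "Hello"

Greeter.greet = fixed_greet
print(g.greet())  # the existing instance picks up the fix -> "Hello"
```

In Smalltalk this is a first-class workflow: you edit the method in the debugger, mid-exception, and resume the very stack frame that failed, and the image file plays the role of “save the entire state of the program.”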
The tight feedback loops you get from its interactive programming style are arguably superior to almost anything else we have today; languages like Python or ClojureScript can approach this level of interactivity, but it isn’t their default.
Smalltalk’s first stable release was in 1980 and we still haven’t caught up to its level in industry. It’s hard to understand exactly how this happened historically, but it seems to be path dependence based on some combination of (relatively) poor marketing, early mistakes in design, and the limitations of older hardware that could barely handle those features when the industry was first taking off.
But there are open-source Smalltalks now, most deriving from Squeak. Pharo, Dolphin, and Cuis are notable. There is even a VM written in JavaScript so you can try it in your web browser.
Aerospike rocket engines are supposed to be much more fuel-efficient in the atmosphere than conventional bell nozzles, because their exhaust plume adapts to ambient pressure instead of being fixed by the nozzle geometry.
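Roughly, the reasoning: ideal rocket thrust is F = mdot * v_e + (p_e - p_a) * A_e, and a fixed bell can only make the exit pressure p_e equal the ambient pressure p_a at one design altitude. A toy Python calculation of the mismatch penalty (all numbers invented for illustration, and exhaust velocity is unrealistically held fixed):

```python
def thrust(mdot, v_e, p_e, p_a, a_e):
    """Ideal rocket thrust in newtons: momentum flux plus pressure-mismatch term."""
    return mdot * v_e + (p_e - p_a) * a_e

# Invented numbers, not any real engine.
mdot, v_e, a_e = 300.0, 3000.0, 1.5   # mass flow kg/s, exhaust m/s, exit area m^2
p_e = 40_000.0                        # fixed bell expanded for ~7 km altitude, Pa

for p_a, where in [(101_325.0, "sea level"), (40_000.0, "design altitude")]:
    f = thrust(mdot, v_e, p_e, p_a, a_e)
    print(f"{where}: {f / 1e3:.0f} kN")
# sea level: 808 kN (overexpanded bell pays a ~92 kN pressure penalty)
# design altitude: 900 kN (pressure-matched: the mismatch term vanishes)
```

An idealized aerospike keeps that mismatch term near zero at every altitude, which is where the efficiency claim comes from; a bell also can’t simply be sized for vacuum, because the overexpanded flow separates at sea level.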
There are good reasons “rocket science” has become a synonym for “difficult”. Nobody wants to take a chance on unproven technology when designing rockets is already hard enough. Not even Elon Musk, at least so far.
Digital knowledge management tools envisioned in the 1950s and ’60s, such as Douglas Engelbart’s hyperdocument system, have not been fully implemented (to my knowledge) and certainly not widely embraced. The World Wide Web failed to implement key features from Engelbart’s proposal, such as the ability to directly address arbitrary sub-documents, or the ability to live-embed a sub-document inside another document.
Similarly, both Engelbart and Ted Nelson emphasized the importance of hyperlinks being bidirectional, so that a link is browsable from both the source and the target document. In other words, you could look at any webpage and immediately see all the pages that link to it. However, Tim Berners-Lee chose to make web hyperlinks one-directional, from source to target, and we are still stuck with that limitation today. Google’s PageRank algorithm gets around this by massively crawling the web and then tracing the back-links through the resulting graph, but back-links could have been built into the web as a basic feature available to everybody.
https://www.dougengelbart.org/content/view/156/88/
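The asymmetry is easy to see in code. A toy bidirectional link index, of the kind that could have been a basic web feature, might look like this (purely illustrative; not from Engelbart’s or Nelson’s actual designs):

```python
from collections import defaultdict

class LinkIndex:
    """Toy link store where every link is browsable from both ends."""

    def __init__(self):
        self.forward = defaultdict(set)   # source -> targets: what HTML gives you
        self.backward = defaultdict(set)  # target -> sources: what the web left out

    def add_link(self, source: str, target: str) -> None:
        self.forward[source].add(target)
        self.backward[target].add(source)

    def links_from(self, page: str) -> set:
        return self.forward[page]

    def links_to(self, page: str) -> set:
        return self.backward[page]

index = LinkIndex()
index.add_link("blog.example/post", "en.wikipedia.org/wiki/Hypertext")
print(index.links_to("en.wikipedia.org/wiki/Hypertext"))
# -> {'blog.example/post'}: the back-link PageRank has to reconstruct by crawling
```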
Depending on your sensitivity filter: over 300,000 US patents are granted each year, of which perhaps 10% are ever incorporated into a commercially successful product. https://www.uspto.gov/web/offices/ac/ido/oeip/taf/h_counts.htm
I think commercial applications of nuclear fission sources are another good example.
Through the 1940s, there were lots of industrial processes and commercial products that used nuclear fission or nuclear materials in some way. Beta sources are good supplies of high-energy electrons (used in a bunch of polymer processes, among other things), and alpha sources are good supplies of positively charged nuclei (used in electrostatic discharge control and some sensing applications).
I think one of the big turning points was the Atomic Energy Act in the US, though international agreements might also have been important factors here.
The world seems to have collectively agreed that nuclear risks are high, and chosen to restrict proliferation (by regulating the production and sale of nuclear materials) -- and as a side effect we have “forgotten” the consumer nuclear technology industry.
I am interested in this because it’s also an example where we seem to have collectively chosen to stifle or prevent innovation in an area of technology in order to reduce downside risk (dirty bombs and other nuclear attacks).
I think the EBR-II reactor was a notable example. The government cut funding three years before the program’s completion. Its design is what we now call an “integral fast reactor”; tests of its passive safety features demonstrated that it effectively cannot melt down. An IFR design would also produce much less waste than a conventional light-water reactor.
I think Google Wave/Apache Wave is a good candidate here, at least for the crowd familiar with it.
Designed to be a new modality of digital communication, it combined features of email, messengers/chat, collaborative document editing, etc.
It got a ton of excitement from a niche crowd while it was in a closed beta.
It never got off the ground, though: less than a year after the beta ended, it was wound down and eventually handed over to Apache.