With a co-worker, I designed a cognitive infrastructure for DARPA that was supposed to let AIs share code. I intended the cognitive modules to be web services (at present, they’re just software agents). Every representation used was to be evaluated using a subset of Prolog, so that expressions could be converted automatically between representations. (That part was never implemented; neither was ontology mapping, which is really hard and would also be needed to translate content.) Unfortunately, my former employer didn’t let me publish anything on it. Also, it would work only with symbolic AI.
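To give a flavor of the translation idea, here is a minimal sketch, not the actual design (which was never implemented or published): it assumes expressions are Prolog-style terms and pretends an ontology mapping is just a symbol-to-symbol table, which in practice it isn’t. Every name and predicate below is invented for illustration.

```python
# Minimal sketch (a reconstruction for illustration, not the DARPA system).
# Idea: if every module's expressions are terms in a restricted Prolog-like
# subset, then translating between two modules' representations reduces to
# rewriting terms under a predicate mapping (the "ontology mapping").
from dataclasses import dataclass


@dataclass(frozen=True)
class Term:
    functor: str      # predicate or function symbol
    args: tuple = ()  # nested Terms or atomic constants


def translate(term: Term, mapping: dict) -> Term:
    """Rewrite a term from one module's vocabulary into another's,
    assuming the mapping is a simple symbol-to-symbol dictionary."""
    new_functor = mapping.get(term.functor, term.functor)
    new_args = tuple(
        translate(a, mapping) if isinstance(a, Term) else a
        for a in term.args
    )
    return Term(new_functor, new_args)


# Hypothetical example: a planner module says located_in(robot1, room3);
# a vision module expects at(Agent, Place).
expr = Term("located_in", (Term("robot1"), Term("room3")))
ontology_map = {"located_in": "at"}   # building this table is the hard part
print(translate(expr, ontology_map))  # Term(functor='at', args=(...))
```

Applying such a mapping is trivial; the hard, unsolved part is constructing the mapping itself, which is why ontology mapping is the real obstacle.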
Even if it worked perfectly, though, it wouldn’t change much about the picture Eliezer is drawing.
“on any sufficiently short timescale, progress should locally approximate an exponential because of competition between interest rates (even in the interior of a single mind).”
Huh?
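One way to unpack the quoted claim, assuming it is about resources being continuously reinvested at competing rates of return (my reading, not something the quote spells out): if resource $R_i$ compounds at rate $r_i$, then

\[
\frac{dR_i}{dt} = r_i R_i \quad\Longrightarrow\quad R_i(t) = R_i(0)\,e^{r_i t},
\]

so the total $\sum_i R_i(t)$ is quickly dominated by whichever investment compounds fastest, and over a short enough window progress looks locally exponential at roughly that largest $r_i$, whether the “investments” are firms in an economy or processes inside a single mind.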