I would like the system to provide humans with information. So if a human asks a reasonable question (How do I get a strawberry?), the system gives information on cultivating strawberries. If a human asks for the DNA sequence of a strawberry and how to create a strawberry from that sequence, the system gives safety information along with the method. If a human asks how to create a thermonuclear bomb, the system asks why, and refuses to answer unless the human can provide a verifiable reason why building one is necessary to resolve an existential threat to humanity. I would like the system to be able to provide this information in a variety of ways, such as interactive chat or a written textbook.
I would like the system to gain scientific and engineering knowledge. So I would like the system to do things like set up telescopes and send probes to other planets. I would like the system to provide monitoring of Earth from orbit. If it needs to run safe experimental facilities, I would like the system to be able to do that.
I would like the system to leave most of the universe alone. So I would like it to leave most of the surfaces of planets and other natural bodies untouched. (If the system dug up more than 10 cubic kilometers of a planet, or disturbed more than 1% of its surface area or volume, I would consider that a violation of this goal.) (Tearing apart, say, an O-type main-sequence star that will never have life would be okay if necessary for a really interesting experiment that could not be done in any other way; ripping apart the majority of stars in a galaxy is not something I would want except to prevent an existential threat.)
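The noninterference goal above includes two quantitative thresholds, which can be made precise with a small sketch. This is purely illustrative and not from the source; the function name, units, and the example surface-area figure for Earth are my own assumptions.

```python
# Illustrative sketch (my own assumptions, not the author's spec):
# encoding the two quantitative limits in the noninterference goal.

MAX_EXCAVATED_KM3 = 10.0       # digging up more than 10 km^3 is a violation
MAX_DISTURBED_FRACTION = 0.01  # disturbing more than 1% of the surface is a violation

def violates_noninterference(excavated_km3: float,
                             disturbed_area_km2: float,
                             total_surface_area_km2: float) -> bool:
    """Return True if either threshold in the goal is exceeded."""
    disturbed_fraction = disturbed_area_km2 / total_surface_area_km2
    return (excavated_km3 > MAX_EXCAVATED_KM3
            or disturbed_fraction > MAX_DISTURBED_FRACTION)

# Earth's surface area is roughly 5.1e8 km^2, so disturbing 1e6 km^2
# (~0.2%) while excavating 5 km^3 stays within both limits:
print(violates_noninterference(5.0, 1e6, 5.1e8))   # False
# Excavating 50 km^3 exceeds the volume limit:
print(violates_noninterference(50.0, 1e6, 5.1e8))  # True
```

The point of spelling it out is that the two limits are independent: exceeding either one alone counts as a violation.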
I would like the system to be incredibly careful not to disturb life. So on Earth, it should only disturb life with humans' permission, and elsewhere it should entirely avoid resource extraction on any planet or place with existing life.
I would like the system to make a reasonable effort to prevent humans or other intelligent lifeforms from completely destroying themselves. (So if banning nanotechnology and nuclear bombs is needed, okay, but banning bicycles or knives is going too far. Diverting asteroids away from Earth would be good.)
I would like the system to have conversations with humans about ethics and other topics, and to try to help humans figure out what would be truly good.
(And of course, what we want AGIs to do and how to get AGIs to do it are two separate questions. Also, this list is partially based on the City of Mind (Yaivkach) AGIs in Ursula K. Le Guin's Always Coming Home.)