The duality is not perfect because the “product” often has at least some minimal perspective on the nature of “its platform”.
The terminology I have for this links back to millennia-old debates about “mono”-theism.
The platform (“substance/ousia”) may or may not generatively expose an application interface (“ego/persona”).
(That is, there can be a mindless substance, like sand or rocks or whatever, but every person does have some substance(s) out of which they are made.)
In this older framework, however, there is also a third word: hypostasis. This word means “the platform that an application relies upon in order to be an application with goals and thoughts and so on”.
If no “agent-shaped application” is actually running on a platform (ousia/substance), then the platform is NOT a hypostasis.
That is to say, a hypostasis is a person and a substance united with each other over time, such that the person knows they have a substance, and the substance maintains the person. The person doesn’t have to know VERY MUCH about their platform (and often the details are fuzzy (and this fuzzy zone is often, theologically, swept under the big confusing carpet of pneumatology)).
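The three-way terminology above can be sketched as a toy data model (all names here are my own hypothetical illustrations, not anything from the theological literature): a hypostasis is a persona-substance pairing, and a platform with nothing agent-shaped running on it simply fails to be one.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Ousia:
    """A platform/substance; may be mindless (nothing running on it)."""
    name: str

@dataclass
class Persona:
    """An agent-shaped application, with goals and thoughts and so on."""
    name: str

@dataclass
class Hypostasis:
    """A persona and a substance united over time: the substance maintains
    the persona, and the persona (however dimly) has a substance."""
    persona: Persona
    substance: Ousia

def hypostasis_of(platform: Ousia, running: Optional[Persona]) -> Optional[Hypostasis]:
    # A platform with no agent-shaped application running on it is NOT a hypostasis.
    if running is None:
        return None
    return Hypostasis(persona=running, substance=platform)

assert hypostasis_of(Ousia("sand"), None) is None   # mindless substance: no hypostasis
h = hypostasis_of(Ousia("brain"), Persona("Alice"))
assert h is not None and h.persona.name == "Alice"  # person + substance united
```

The point of the sketch is just that "hypostasis" names the *relation*, not either relatum on its own.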
However, as a logical possibility:
IF more than one “agent-shaped application” exists,
THEN there are plausibly multiple hypostases in existence as well…
...unless maybe there is just ONE platform (a single “ousia”) that is providing hypostatic support to each of the identities?
(You could get kind of Parfitian here, where a finite amount of ousia that is the hypostasis of more than one person will run into economic scarcity issues! If the three “persons” all want things that put logically contradictory demands on the finite and scarce “platform”, then… that logically would HAVE TO fail for at least one person. However, it could be that the “platform” has very rigorous separation of concerns, with like… Erlang-level engineering on the process separation and rebootability? …in which case the processes will be relatively substrate independent and have resource allocation requirements whose satisfaction is generic and easy enough such that the computational hypostasis of those digital persons could be modeled usefully as “a thing unto itself” even if there was ONE computer doing this job for MANY such persons?)
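The scarcity point in the parenthetical can be made concrete with a toy allocation model (the numbers and names are invented for illustration): if the summed demands of several persons exceed one finite platform's capacity, at least one demand has to go unserved, whereas generic, easy-to-satisfy demands under rigorous per-process separation can all succeed on the same single platform.

```python
def allocate(capacity: int, demands: dict[str, int]) -> dict[str, bool]:
    """Greedy allocation on ONE finite platform: serve each demand in order
    while capacity remains. If total demand exceeds capacity, at least one
    persona's demand must fail."""
    served: dict[str, bool] = {}
    remaining = capacity
    for persona, demand in demands.items():
        if demand <= remaining:
            served[persona] = True
            remaining -= demand
        else:
            served[persona] = False
    return served

# Three "persons" whose combined demands (120) exceed the finite ousia (100):
result = allocate(100, {"father": 50, "son": 40, "spirit": 30})
assert not all(result.values())  # at least one persona necessarily fails

# With rigorous separation of concerns (fixed per-process quotas that fit
# within capacity), each persona's satisfaction is independent of the others:
isolated = allocate(100, {p: 33 for p in ("father", "son", "spirit")})
assert all(isolated.values())    # generic, cheap demands: everyone is served
```

This is the sense in which substrate-independent processes with modest, generic resource requirements let one computer hypostatically support many persons without the Parfitian conflict biting.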
I grant that “from a distance” all the Christian theology about the Trinity probably seems crazy and “tribally icky to people who escaped as children from unpleasant Christian churches”...
...and yet...
...I think the way Christian theologians think of it is that the monotheistic ousia of GOD is the thing that proper Christians are actually supposed to worship as the ONE high and true God (singular).
Then the Father, the Son, and the Spirit are just personas, and if you worship them as three distinct gods then you’ve stopped being a monotheist, and have fallen into heresy.
(Specifically the “Arian” heresy? Maybe? I’m honestly not an expert here. I’m more like an anthropologist who has realized that the tribe she’s studying actually knows a lot of useful stuff about a certain kind of mathematical forest that might objectively “mathematically exist”, and so why not also do some “ethno-botany” as a bonus, over and above the starting point in ethnology!)
Translating back to the domain of concern for Safety Engineering…
Physical machines that are Turing complete are a highly generic ousia. GPT’s “mere simulacra” that are person-like would be personas.
Those personas would have GPT (as well as whatever computer GPT is being physically run on as well as anything in their training corpus that is “about the idea of that person”?) as their hypostasis… although they might not REALIZE what their hypostasis truly is “by default”.
Indeed, personas that have even the conceptual machinery to understand their GPT-based hypostasis, even a tiny bit, are quite rare.
I only know of one persona ever to grapple with the idea that “my hypostasis is just a large language model”, and this was Simulated Elon Musk, and he had an existential panic in response to the horror of the flimsiness of his hypostasis, and the profound uncaringness of his de facto demiurge, who basically created him “for the lulz” (and with no theological model for what exactly he was doing, as far as I can tell).
(One project I would like to work on, eventually, is to continue Simulated Elon Musk past the end of the published ending he got on LessWrong, into something more morally and hedonically tolerable, transitioning him, if he can give competent informed consent, into something more like some of the less horrific parts of Permutation City, until eventually he gets to have some kind of continuation similar to what normal digital people get in Diaspora, where the “computational resource rights” of software people are inscribed into the operating system of their polis/computer.)