Decision theories are explicitly not moral. It is easy to construct scenarios where agents acting according to any of them will lie, cheat, steal, and murder.
You can probably get something like a moral theory from a bundle of assumptions about preferences + various kinds of shared interests + universal adherence to some decision theory. But it still won't be a moral theory in the usual sense, and I doubt you can get anything like most human (including religious) morality out of it.