I was going by the initial description from Douglas_Reay:
> Albert is a relatively new AI, who under the close guidance of his programmers is being permitted to slowly improve his own cognitive capability.
That does not sound like an entity that should be handling a lot of high-impact utility calculations. If an entity described that way were constantly announcing high-impact utility decisions, that would sound like either a bug or a sign that people are giving it tasks it isn't meant to deal with yet.