I don’t think your experience with consumer software is at all similar to the engineer/scientist/modeler experience with large language and predictive models. To be clear, those suck too, and require an insane amount of effort and thought to get anything useful done. But they suck in ways that can be fixed over time, and in ways that (seem to) correlate with the underlying complexity of the world.
Complaining about corporate decisions that happen to be implemented in software doesn’t quite connect, at least by that pathway. Worrying that consumer software usually seems adversarial to the consumer, and that there may be a similar problem where AI is adversarial to everyone but the “owner” of the AI, is probably justified.
But that’s not “software sucks”, it’s “software creators are a mix of evil and stupid”.
Yes, but it does show a tendency of huge complex networks (operating system userbases, the internet, human civilization) to rapidly converge to a fixed level of crappiness that absolutely won’t improve, even as more resources become available. Of course, there could be a sudden transition to a new state once artificial networks grow larger than any of the above.