“Relying on humans to create and manage allocations is not a scalable solution to the normal computer control problem.”
We are creating more and more heterogeneous computers: home automation, mobile phones, and so on. Ubiquitous computing is coming of age, but that means there are more and more computers to manage. We can’t manage them all, so we get IoT botnets.
“Relying on humans to create and manage allocations is not a scalable solution”
But humans don’t do that. When I run a program I don’t specify its CPU slice and memory pool: I just run it. With automatic updates I don’t even control (never mind “create and manage”) which software gets installed on my hardware. Your average computer user certainly makes few, if any, decisions about managing allocations.
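For concreteness, here is a minimal sketch of what explicitly specifying a program’s CPU slice and memory pool could look like. It assumes a Linux host and standard-library Python, and the CPU set and memory cap are made-up illustrative values, not anything from the discussion above.

```python
# Sketch: manually allocating a "CPU slice" and "memory pool" to a program
# before running it. Linux-only; the CPU set and 512 MiB cap are arbitrary.
import os
import resource
import subprocess

def run_with_allocation(cmd, cpus=(0, 1), mem_bytes=512 * 1024 * 1024):
    def limit():
        os.sched_setaffinity(0, cpus)            # pin to the chosen CPUs
        resource.setrlimit(resource.RLIMIT_AS,   # cap the address space
                           (mem_bytes, mem_bytes))
    # preexec_fn runs in the child after fork, so the limits apply only to it.
    return subprocess.run(cmd, preexec_fn=limit)

run_with_allocation(["sleep", "1"])
```

This is exactly the kind of decision almost nobody makes by hand when they “just run” a program.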
And IoT botnets are not a consequence of human management scaling badly; they are a consequence of an unfortunate set of incentives for IoT manufacturers.
When you close down a program that is using up too much memory or processing power (Google Chrome, I’m looking at you), you are making an allocation decision (and there are lots of other allocation decisions I could point to, if you look at it from that point of view).
So I would argue that humans do make allocation decisions; they just make them after the fact and in a very disruptive way.
Running a program is an allocation decision of sorts too: again, not fine-grained or tightly controlled, but an allocation decision all the same.
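As a rough illustration of the after-the-fact, disruptive kind of allocation decision described above (closing down whatever is hogging memory), here is a sketch in Python; it assumes the third-party psutil package is available, and the logic is deliberately blunt.

```python
# Sketch: an "after the fact" allocation decision, find the biggest
# resident-memory consumer and terminate it. Requires psutil.
import psutil

def reclaim_memory():
    # Only consider processes whose memory info we are allowed to read.
    procs = [p for p in psutil.process_iter(["name", "memory_info"])
             if p.info["memory_info"] is not None]
    hog = max(procs, key=lambda p: p.info["memory_info"].rss)
    print(f"terminating {hog.info['name']}: "
          f"{hog.info['memory_info'].rss} bytes resident")
    hog.terminate()  # blunt and disruptive, but it is an allocation decision

# Calling reclaim_memory() really will kill your biggest process
# (quite possibly your browser), which is rather the point.
```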
Contrast this with brains: there is no central authority that picks and chooses what the system is doing. So things don’t have to be the way they are.
“And IoT botnets are not a consequence of human management scaling badly; they are a consequence of an unfortunate set of incentives for IoT manufacturers.”
If management of compute were easy and cheap (as cheap as compute itself has become), then you wouldn’t have to rely on the incentives of manufacturers.