Yes, there will always be some off-by-one errors, so the best we can hope for is to pick the convention that creates fewer of them. That said, the fact that most programming languages choose the zero-based convention suggests that it's the better one.
There's also the revealed word of our prophet Dijkstra: EWD831, "Why numbering should start at zero".
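For what it's worth, Dijkstra's argument there rests on half-open intervals: with zero-based indexing, a range is written as 0 ≤ i < N, so its length equals the upper bound and adjacent subranges meet without ±1 adjustments. A minimal sketch of that convention in Python (the list and split point are just illustrative):

```python
# Dijkstra's half-open convention: a range is lower <= i < upper,
# so its length is simply upper - lower.
items = ["a", "b", "c", "d", "e"]

# The whole sequence is the range [0, len(items)):
# its length is the upper bound itself, with no +1/-1 correction.
assert len(range(0, len(items))) == len(items)

# Splitting at any index k yields two adjacent half-open ranges,
# [0, k) and [k, len(items)), that cover every element exactly once.
k = 2
left, right = items[:k], items[k:]
assert left + right == items  # nothing lost, nothing duplicated
```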
Then again, the fact that humans count "one, two, three..." and not "zero, one, two..." does suggest that there is a best convention, and it's not the zero-based one.