My take is that quicksort, merge sort and all other common sorts involve physically rearranging memory in a way that would be a bad fit for a list on paper. This one doesn’t change the original list, it just emits its elements one at a time in order.
This is worst-case O(n²) - if they’re currently in order (i.e. sorted from highest to lowest priority), you’d do n(n−1)/2 comparisons. Best case O(n) comparisons, if they’re currently in reverse order. (You could interpret it as high priority elements being high in the sort order, but then the algorithm emits elements one at a time in reverse order.) Memory use is O(n) pointers/indexes (the list of marked elements), n bools (a list of which items have already been crossed off) and O(1) bookkeeping.
You could see it as a variant on selection sort—in particular it has the same property of “find the least element, then the next least, and so on” that lets you stop part way through if you only want to do three tasks. But because of the “emit items instead of changing the original list” behaviour, instead of just keeping track of the current minimum, we can keep track of a descending subsequence of the original list.
In pseudocode, I think we get something like...
let input: [Task] be the input list (n elements)
initialize:
    emitted: [Bool] = [False, False, ..., False] (n elements)
    descending-stack: [Int] = [-1]
    current-min: (Infinity | Task) = Infinity
    last-pop: Int = -1
repeat:
    for i from last-pop + 1 to n-1:
        if not emitted[i] and input[i] < current-min:
            push(descending-stack, i)
            current-min = input[i]
    last-pop = pop(descending-stack)
    if last-pop == -1:
        finished()
    emit(input[last-pop])
    emitted[last-pop] = True
    if peek(descending-stack) == -1:
        current-min = Infinity
    else:
        current-min = input[peek(descending-stack)]
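For the curious, here’s roughly how that pseudocode might look as a Python generator (a sketch; the names `emit_sorted`, `emitted`, `stack` are mine, not from the pseudocode). It yields the items in ascending order without ever rearranging the input list:

```python
from itertools import islice

def emit_sorted(tasks):
    """Yield the elements of `tasks` in ascending order, leaving `tasks` unchanged."""
    n = len(tasks)
    emitted = [False] * n       # which items have already been "crossed off"
    stack = [-1]                # indices of a descending subsequence; -1 is a sentinel
    current_min = float('inf')  # value at the top of the stack (infinity for the sentinel)
    last_pop = -1               # index after which the next scan resumes

    while True:
        # Extend the descending subsequence with any unemitted element
        # smaller than the value at the top of the stack.
        for i in range(last_pop + 1, n):
            if not emitted[i] and tasks[i] < current_min:
                stack.append(i)
                current_min = tasks[i]
        # The top of the stack now indexes the smallest unemitted element.
        last_pop = stack.pop()
        if last_pop == -1:
            return              # sentinel reached: everything has been emitted
        yield tasks[last_pop]
        emitted[last_pop] = True
        # Reset the running minimum to the new stack top before rescanning.
        current_min = float('inf') if stack[-1] == -1 else tasks[stack[-1]]

# Because it emits lazily, you can stop part way through, e.g. take
# just the three smallest tasks without finishing the sort:
# list(islice(emit_sorted([4, 1, 3, 5, 2]), 3))  # → [1, 2, 3]
```

(The `islice` trick is what the “only want to do three tasks” early exit looks like in this form: the generator simply never does the remaining scans.)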
Thanks a lot for the thoughtful reply! Not having to move the items around could indeed help to lower the cognitive burden.