I still don’t see where set! is relevantly different from define… e.g.
In this case (notionally), the (transformed) conditional would end up returning a data structure representing “probability 1⁄100 of always C, probability 99⁄100 of C or D depending on what I do”, and then result would be set to that data structure. The (transformed) define itself doesn’t return a value, but that’s OK: it isn’t supposed to; its only job is to influence the value of something else.
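To make the define version concrete, here is a minimal sketch, assuming distributions are represented as lists of (probability . strategy) pairs; the names are hypothetical, not taken from the actual program:

```scheme
;; Minimal sketch (hypothetical names): a distribution is a list of
;; (probability . strategy) pairs. The "transformed" conditional
;; evaluates to the whole mixture...
(define transformed-conditional
  (list (cons 1/100 'always-C)
        (cons 99/100 'C-or-D-depending-on-what-I-do)))

;; ...and define just binds result to that value. define returns nothing
;; useful, but its only job is to make result name the distribution.
(define result transformed-conditional)
```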
In the above case, with set!, result would first be set to a data structure representing “probability 1 of C” and then to a data structure representing “probability 1 of C or D depending on what I do”. Then the (transformed) conditional itself would return an unspecified value, since that is all set! returns in Scheme. (And OK, it wasn’t supposed to return a value either, but its influence over other parts of the program has been screwed up: result ends up holding only the last assignment rather than the mixture.)
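A sketch of how that failure looks, again with hypothetical names, assuming the transformed conditional notionally contributes both branches in sequence:

```scheme
;; Sketch of the set! version (hypothetical names). result must already
;; exist before it can be mutated.
(define result #f)

(define conditional-value
  (begin
    ;; first branch's contribution: "probability 1 of C"
    (set! result (list (cons 1 'C)))
    ;; second branch's contribution clobbers the first
    (set! result (list (cons 1 'C-or-D-depending-on-what-I-do)))))

;; conditional-value is unspecified (that is all set! yields), and result
;; holds only the second assignment, not the 1/100 vs 99/100 mixture.
```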
Basically, my (start of a) program was making the assumption that the return value of a thing was what was important, or at least that, while side effects might exist, they wouldn’t screw things up elsewhere. (With a few exceptions for special side-effecty functions which I intended to handle specially.) But allowing mutation would mean having to write my program to keep track of all of the simulated variables, whereas without it I can just focus on return values and let the Scheme interpreter keep track of the variables for me.
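As a rough illustration of that assumption (the helper below is hypothetical, not part of the program), combining branches by their return values needs no extra bookkeeping:

```scheme
;; Hypothetical helper: mix two return values into one distribution.
(define (mix p value-if-true value-if-false)
  (list (cons p value-if-true)
        (cons (- 1 p) value-if-false)))

;; Return-value style: a single expression, nothing for my program to track.
(define result (mix 1/100 'always-C 'C-or-D-depending-on-what-I-do))

;; A mutation-friendly version would instead have to notice every variable
;; each branch assigns to and merge those assignments itself.
```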