Eliezer seems to be taking a page from Alicorn’s book. In Luminosity Alice is plagued by differing visions as Bella constantly changes her mind about her future, and then the actual future snaps into place when a final choice is made.
That’s how it is in the canon Twilight (Eclipse).
Try not to take this as me being a big snobby snob, but did you actually read them?
Secondary source: I have seen the first 3 films, and Alice explicitly (and repeatedly, I think) states that “a decision has been made” when she has a vision. That decision needn’t be made by Bella specifically though.
Weirdly enough, I have read both the canon and Alicorn’s fanfic.
And I already remarked in the Luminosity thread that that makes no sense. It makes even less sense in a universe with time turners.
Essentially? It has to happen at some point along the timeline, and whatever engine runs magic finds it simplest to give visions simultaneous with the decisions that cause them. (Or at least, that contribute to them in some major way.)
Or, in other words, enforced narrative causality.
Take the present state of the universe and use an imperfect tool to extrapolate likely future outcomes. Changing your mind shifts the present state, so the extrapolation starts favoring a different future outcome.
The only weird thing is that you can actually fool people by pretending. The prediction mechanism has to have some very specific flaws for that to work.
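Read literally, that mechanism can be sketched as a toy model. This is purely illustrative (all the names and the "surface vs. hidden intention" framing are my own assumptions, not anything from canon): the predictor extrapolates from observable present state only, which is exactly the specific flaw that lets pretending work.

```python
import random

def predict_outcome(surface_intention, noise=0.0):
    """A toy 'vision': extrapolate a likely future from the observable
    present state. The specific flaw: it reads only the agent's surface
    intention, never the hidden one, so pretending can fool it."""
    if random.random() < noise:
        # imperfect tool: occasionally extrapolates the wrong branch
        return random.choice(["fight", "flee"])
    return surface_intention

class Agent:
    def __init__(self, hidden_intention, surface_intention=None):
        self.hidden = hidden_intention
        # a pretender displays a surface intention that differs from the hidden one
        self.surface = surface_intention or hidden_intention

    def act(self):
        # what actually happens is driven by the hidden intention
        return self.hidden

sincere = Agent("flee")
pretender = Agent(hidden_intention="flee", surface_intention="fight")

print(predict_outcome(sincere.surface))    # matches what sincere.act() returns
print(predict_outcome(pretender.surface))  # fooled: predicts the fake intention
```

The point of the sketch is that the flaw has to sit precisely at the boundary between observable and hidden state; a predictor that read hidden intentions too would be unfoolable, and one that read neither would never update when a decision is made.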