Something already does happen for large distances.
That’s an observable fact. It’s redshift.
What causes it?
The standard answer is expansion, which needs inflation and dark energy and an arbitrary multiverse to do that. All things that make the theory more complicated with distance.
Alternatively, what if light doesn’t travel forever?
How would such a reality look if things existed farther away than light could travel?
Is it not exactly what is observed?
What is more complex,
zero-frequency photons ceasing to be photons,
or
infinite-wavelength photons everywhere, never interacting with anything, in space expanding faster than c?
If it’s a matter of complexity, zero-frequency photons ceasing to exist is less complex than infinite-wavelength photons being everywhere.
The thing you need to evaluate the complexity of is an actual theory, with equations and everything, not a vague suggestion that maybe photons lose energy as they travel.
I don’t think the conventional theory says you have infinite-wavelength photons, and I think your thought experiment with your two hands is wrong. Light from an object at the Hubble limit not only never reaches us, but also never reaches (say) a point 1m “inward” from us. It never gets any closer to us. Note that this is not the same as saying that light from 1m less than the Hubble limit never reaches us, which of course is false.
We get arbitrarily long-wavelength photons, if we wait long enough, but we have to wait longer for the longer-wavelength ones and we would have to wait infinitely long to get ones of infinite wavelength.
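The point above can be made quantitative with a toy model. Assuming a pure de Sitter universe (constant H, an idealization the thread doesn't specify), a photon emitted at proper distance D < c/H arrives after t = −(1/H)·ln(1 − HD/c) with redshift 1 + z = 1/(1 − HD/c). Both diverge as D approaches the Hubble distance c/H, so longer wavelengths take longer waits and infinite wavelength would take an infinite wait:

```python
import math

# Toy de Sitter model (constant Hubble parameter H, units with H = c = 1).
# A photon emitted toward us from proper distance D < c/H arrives at
#   t = -(1/H) * ln(1 - H*D/c)
# and is observed with redshift
#   1 + z = 1/(1 - H*D/c).
# Both blow up as D -> c/H: sources near the Hubble limit are seen only
# after arbitrarily long waits, at arbitrarily long (but finite) wavelengths.
H, c = 1.0, 1.0

def arrival_time(D):
    """Time for light from proper distance D to reach us (de Sitter)."""
    return -math.log(1 - H * D / c) / H

def redshift(D):
    """Observed redshift z of light emitted at proper distance D."""
    return 1 / (1 - H * D / c) - 1

for frac in (0.5, 0.9, 0.99, 0.999):
    D = frac * c / H
    print(f"D = {frac} c/H: t = {arrival_time(D):.3f}/H, z = {redshift(D):.0f}")
```

A source halfway to the Hubble limit gives z = 1; at 99.9% of the limit, z = 999. No finite D gives an infinite wavelength.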
The actual theory, equation and everything is that distant galaxies do not recede at v = H·D (Hubble’s Old Law); instead a photon travels at v = c − H·d (let’s call it Hubble’s New Law).
That’s the cause of redshift.
At c/H, in both the old and new law, the frequency of the photon reaches 0.
In the old law, it remains to encounter ever increasing space, getting farther from the things in front of it. What happens at this point is poorly addressed by the standard model, for obvious reasons (it’s ridiculous).
In Hubble’s New Law, the photon has lost all energy, and thus there is no photon.
Old = photons piling up in ever increasing space. Do they redshift into negative frequency?
New = photons redshift to zero, then are gone.
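Reading the proposal literally, and assuming the photon’s frequency scales with its speed (the post doesn’t say; that is an assumption made here purely for illustration), the claimed law can be sketched numerically:

```python
# Numeric sketch of the proposed "Hubble's New Law": a photon's speed
# falls as v = c - H*d, where d is the distance it has travelled.
# ASSUMPTION (not stated in the original post): its frequency scales
# with v, i.e. f = f0 * (1 - H*d/c), so it hits zero at d = c/H.
H = 2.27e-18   # Hubble constant in 1/s (~70 km/s/Mpc)
c = 2.998e8    # speed of light in m/s

d_limit = c / H   # distance at which v (and, by assumption, f) reach zero
# d_limit / 9.461e15 m-per-ly is roughly 1.4e10 ly, i.e. ~14 Gly.

def speed(d):
    """Photon speed after travelling distance d, per the proposed law."""
    return c - H * d

def redshift(d):
    """Implied redshift if f = f0 * (1 - H*d/c): z = H*d / (c - H*d)."""
    return H * d / (c - H * d)

print(speed(d_limit))          # zero at the limit: the photon is "gone"
print(redshift(d_limit / 2))   # z = 1 halfway to the limit
```

Note this reproduces a linear redshift–distance relation only for small d; how it would handle interference, QED, and energy conservation is exactly what the replies below question.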
The actual theory, equation and everything is [...]
That’s not the actual theory. It’s a tiny fraction of a theory. It’s not even clear that it makes sense. (What exactly is d here? Total distance travelled by the photon since it first came into being, I guess. But what exactly does that mean? For instance, can there be interference between two photons with different values of d, and if so what happens?)
In your theory, photons travel slower than c, the exact speed depending on their “d” value. That’s going to mess up pretty much everything in quantum electrodynamics, so what do you put in its place?
In your theory, photons pop out of existence when their velocity and frequency reach zero. Again, that violates local conservation of energy and CPT invariance and so forth; again, how are you modifying the fundamentals of conventional physics to deal with this?
At c/H, in both the old and new law, the frequency of the photon reaches 0.
But “at c/H” the photons never reach us because their source is receding from us at the speed of light. We never see zero-frequency photons. We do see substantially-lowered-frequency photons (once the light has had long enough to reach us despite the recession).
In the old law, it remains to encounter ever increasing space, getting farther from the things in front of it.
This doesn’t appear to me to be anything like correct. It is getting closer to the things in front of it—apart from ones that are receding from it faster than the speed of light, which are (and remain) far away from it. What’s the problem here supposed to be?
Old = photons piling up in ever increasing space
I repeat: I do not think the conventional theory does say anything like “photons piling up in ever increasing space”. Of course it’s possible that my analysis is wrong; feel free to show me why.