The way I see it is this:
Theory A. Light travels to infinity without losing energy
Observation 1. Light redshifts and does not seem to travel to infinity
Theory B. The galaxies are receding, redshifting the light
Theory C. Since theory B puts us at the center of reality, let’s make it so space is expanding
Theory D. If space is expanding, in the past it was smaller, and there was a beginning of time
Observation 2. The horizon problem.
Theory E. Everything interacted with everything, then hyper-expanded, then stopped hyper-expanding
(skip some steps)
Theory M. Dark energy did it
(skip some steps)
Theory T. In the multiverse, anything can happen, even different laws of physics (which makes the multiverse of inflation very different from Everettian QM, which operates on the wave equation); an infinite number of universes are made, and ours happens to do this cool dark-energy inflation stuff
Observation 3. BICEP2 found dust.
What I’m questioning is Theory A.
Theory A says light travels forever, which was established in the 1800s, before we knew there were galaxies other than the Milky Way, and before we detected redshift.
We observe a finite amount of light from finite distances.
That’s an empirical fact.
That is to say, the empirical and theoretical ranges of electromagnetic radiation are not in agreement.
The articles I posted cast great doubt on the CMB being what it is claimed to be, on the existence of a “young universe”, and on nucleosynthesis, and they document the growing skepticism over the scientific value of inflation.
What else does the Big Bang have to rest on?
“Is there a better theory?”
Take Hubble’s Law, v = H * D, where v is the apparent recessional velocity of a galaxy at distance D, and H is Hubble’s Parameter. If the apparent recessional velocity is only apparent, and not actual, we could take that term (H * D) away from the distant galaxy (so that its v = 0) and instead put it into the frequency (f) of the photon: f = (c - H * D) / w, where w is the wavelength at which it was emitted.
In this theory, redshift is a feature of light itself.
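To put rough numbers on the proposal (a sketch only; H = 70 km/s/Mpc is an assumed value, not something stated above): since the emitted frequency is f_0 = c/w, the rule f = (c - H * D) / w implies z = H * D / (c - H * D), which tracks the standard low-redshift relation z ≈ H * D / c nearby and blows up as D approaches c/H.

```python
# A rough numeric sketch (mine, not from the post) of what the proposed
# rule implies. H = 70 km/s/Mpc is an assumed, typical value.
C = 299_792.458   # speed of light, km/s
H = 70.0          # Hubble parameter, km/s per Mpc

def z_proposed(d_mpc):
    # Emitted frequency is f0 = c/w; observed is f = (c - H*D)/w,
    # so z = f0/f - 1 = H*D / (c - H*D).
    return H * d_mpc / (C - H * d_mpc)

def z_linear_hubble(d_mpc):
    # Standard low-redshift approximation: v = H*D, z ~ v/c.
    return H * d_mpc / C

for d in (100, 1000, 4000):  # distances in Mpc; c/H is ~4283 Mpc
    print(d, round(z_proposed(d), 3), round(z_linear_hubble(d), 3))
```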
I know this lends itself to many criticisms and open questions; however, no one has ever produced dark energy, and supposedly it makes up 68% of the observable universe, which doesn’t seem consistent with observation.
Here’s something else I found, when looking into it:
“[If the redshifts are a Doppler shift] … the observations as they stand lead to the anomaly of a closed universe, curiously small and dense, and, it may be added, suspiciously young. On the other hand, if redshifts are not Doppler effects, these anomalies disappear and the region observed appears as a small, homogeneous, but insignificant portion of a universe extended indefinitely both in space and time.”
— E. Hubble, Monthly Notices of the Royal Astronomical Society, 97, 506, 1936
I also found that the temperature of space was predicted to be 3 K by Arthur Eddington in 1926. The Big Bang’s predictions were nowhere near that, and when the CMB was discovered in the 1960s at 3 K, the discovery was claimed to be evidence of an expanding universe.
If cosmology were why computers worked or airplanes flew, or had ever been corroborated by a lab experiment, then I probably wouldn’t be questioning it. But it takes place millions of light-years away, millions of years ago. And the experts in the articles I cited were calling it into question.
Is there a cognitive bias to have a creation myth?
Hey, another crazy person like me. Now we are two.
I’ve had a similar take on it for a long time. It seems like the expansion is an attempt to explain observed red shifts, necessitating an increasingly convoluted theory to explain other observations.
The observation is, the farther away, the greater the shift, in a linear fashion.
f = (c - H * D) / w, where w is the wavelength at which it was emitted.
f = f_0 - D * constant
What if light just loses energy as it travels, so that the frequency shifts lower?
That seems like a perfectly natural solution. How do we know it isn’t true?
What would be the implications to the current theories if it were true?
This class of theory goes by the name of “Tired light”. It seems as if every theory of this kind precise enough to make definite predictions has been pretty clearly falsified, but I’m not an expert on this stuff.
A relation of the form f = f0 - constant*distance will send the frequency to zero (and then out the other side) once the distances get large enough. You probably don’t want that.
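To make the zero-crossing concrete (toy numbers, mine, chosen purely for illustration):

```python
# Toy numbers for the zero-crossing point: a linear law f = f0 - k*D
# hits zero at D = f0/k and goes negative beyond it.
f0 = 5.4e14          # Hz, roughly green light
d_zero = 4283.0      # Mpc; c/H for H = 70 km/s/Mpc, chosen here as the zero point
k = f0 / d_zero      # fixes the constant so f(d_zero) = 0

for d in (2000.0, 4283.0, 6000.0):   # Mpc
    print(d, f0 - k * d)             # positive, zero, then negative
```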
In the limit, yes. I have no prior that says this is a problem.
Also, as one approaches such a limit, I wouldn’t be surprised to see other terms come into play.
Note that no actual infinite limits are required. Just large but finite distances.
I think you should have. One of three problems, depending on what you expect to happen.
(1) If you expect something more complicated to happen for large distances, your theory is more complicated than it initially looks. Doesn’t your prior favour simpler theories?
(2) If you expect the frequency to pass through zero and continue, your theory will have to (2a) explain what negative frequencies actually mean, why frequency -f isn’t just the same as frequency +f with different phase, why we never see anything with negative frequency, etc., or else (2b) if instead it says that negative frequencies are the same as positive, then explain what happens to the frequency after it crosses zero (gets more negative? then -f isn’t the same as +f after all. gets less negative? then what we actually end up with is a really weird discontinuity at the zero crossing). Again, all this stuff is extra complexity.
(3) If you expect the issue not to arise because nature somehow ensures that light never travels far enough for the frequency to reach zero, then your theory needs to explain how that happens. Extra complexity again.
This sounds like case 1 above.
I expect infinite complexity, but pick the simplest model to account for the currently known data. Keep on expanding the range of applicability, and I expect to see new effects that aren’t accounted for in models validated over a more restricted range of data.
Reality is more complicated than it looks, and I don’t expect that to end.
No, I don’t expect negative frequencies: as frequency goes down, energy goes down, and I expect quantum effects to take hold as energy approaches zero. You can call that “extra complexity”, but we already know there are quantum effects in general.
OK. Does that stop you regarding a theory as more credible when it’s simpler (for equal fidelity to observed evidence)?
Everything is quantum effects. Do you have more specific expectations?
It’s more credible in the range of data for which its fidelity has shown it to be more credible. I expect extrapolations outside that range to have less fidelity.
No.
I don’t have some grand unified theory.
I just observe that a lot of cosmology seems to be riding on the theory that the red shift is caused by an expanding universe.
Note that I ended my first post with questions, not with claims.
I just observe that a lot of cosmology seems to be riding on the theory that the red shift is caused by an expanding universe.
This seems wrong to me. There are at least two independent lines of evidence for the Big Bang theory besides redshifts: isotope abundances (particularly for light elements) and the cosmic background radiation.
What if light just loses energy as it travels, so that the frequency shifts lower?
We would have to abandon our belief in energy conservation. And we would then wonder why energy seems to be conserved exactly in every interaction we can see. Also, we would wonder why we see spontaneous redshifts and not spontaneous blueshifts. Every known micro-scale physical process in the universe is reversible [1], and by the CPT theorem, we expect this to be true always. A lot would have to be wrong with our notions of physics to have light “just lose energy.”
That seems like a perfectly natural solution. How do we know it isn’t true?
This solution requires light from distant galaxies to behave in ways totally different from every other physical process we know about, including physical processes in distant galaxies. It seems unnatural to say “the redshift is explained by a totally new physical process, and this process violates a lot of natural laws that hold everywhere else.”
[1] I should say, reversible assuming you also flip the charges and parities. That’s irrelevant here, though, since photons are uncharged and don’t have any special polarization.
Something already does happen for large distances.
That’s an observable fact. It’s redshift.
What causes it?
The standard answer is expansion, which needs inflation and dark energy and an arbitrary multiverse to do that. All things that make the theory more complicated with distance.
Alternatively, what if light doesn’t travel forever?
How would such a reality look if things existed farther away than light could travel?
Is it not exactly what is observed?
What is more complex:
zero-frequency photons ceasing to be photons,
or
infinite-wavelength photons everywhere, never interacting with anything, in space expanding faster than c?
If it’s a matter of complexity, zero-frequency photons ceasing to exist is less complex than there being infinite-wavelength photons everywhere.
The thing you need to evaluate the complexity of is an actual theory, with equations and everything, not a vague suggestion that maybe photons lose energy as they travel.
I don’t think the conventional theory says you have infinite-wavelength photons, and I think your thought experiment with your two hands is wrong. Light from an object at the Hubble limit not only never reaches us, but also never reaches (say) a point 1m “inward” from us. It never gets any closer to us. Note that this is not the same as saying that light from 1m less than the Hubble limit never reaches us, which of course is false.
We get arbitrarily long-wavelength photons, if we wait long enough, but we have to wait longer for the longer-wavelength ones and we would have to wait infinitely long to get ones of infinite wavelength.
The actual theory, equation and everything, is that distant galaxies do not recede at v = H * D (Hubble’s Old Law); instead a photon travels at v = c - H * d (let’s call it Hubble’s New Law).
That’s the cause of redshift.
At c/H, in both the old and new law, the frequency of the photon reaches 0.
In the old law, it remains to encounter ever increasing space, getting farther from the things in front of it. What happens at this point is poorly addressed by the standard model, for obvious reasons (it’s ridiculous).
In Hubble’s New Law, the photon has lost all energy, and thus there is no photon.
Old = photons piling up in ever increasing space. Do they redshift into negative frequency?
New = photons redshift to zero, then are gone.
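Taking the stated equation at face value, here is a quick numerical integration (my own sketch; units chosen so c = H = 1) of a photon obeying v = c - H * d. The closed form is d(t) = (c/H)(1 - e^(-H·t)), so under a literal reading the photon approaches c/H asymptotically, and its frequency reaches zero only in the infinite-time limit.

```python
# "Hubble's New Law" v = c - H*d, integrated numerically.
# Units with c = H = 1, so the limit c/H = 1.
import math

c, H, dt = 1.0, 1.0, 1e-4
d = 0.0
for step in range(int(10.0 / dt)):   # integrate out to t = 10 (in 1/H units)
    d += (c - H * d) * dt            # photon slows as it accumulates distance

print(round(d, 5))                                    # ~0.99995: approaching c/H
print(round((c / H) * (1 - math.exp(-H * 10.0)), 5))  # closed form agrees
```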
The actual theory, equation and everything is [...]
That’s not the actual theory. It’s a tiny fraction of a theory. It’s not even clear that it makes sense. (What exactly is d here? Total distance travelled by the photon since it first came into being, I guess. But what exactly does that mean? For instance, can there be interference between two photons with different values of d, and if so what happens?)
In your theory, photons travel slower than c, the exact speed depending on their “d” value. That’s going to mess up pretty much everything in quantum electrodynamics, so what do you put in its place?
In your theory, photons pop out of existence when their velocity and frequency reach zero. Again, that violates local conservation of energy and CPT invariance and so forth; again, how are you modifying the fundamentals of conventional physics to deal with this?
At c/H, in both the old and new law, the frequency of the photon reaches 0.
But “at c/H” the photons never reach us because their source is receding from us at the speed of light. We never see zero-frequency photons. We do see substantially-lowered-frequency photons (once the light has had long enough to reach us despite the recession).
In the old law, it remains to encounter ever increasing space, getting farther from the things in front of it.
This doesn’t appear to me to be anything like correct. It is getting closer to the things in front of it, apart from ones that are receding from it faster than the speed of light, which are (and remain) far away from it. What’s the problem here supposed to be?
Old = photons piling up in ever increasing space
I repeat: I do not think the conventional theory does say anything like “photons piling up in ever increasing space”. Of course it’s possible that my analysis is wrong; feel free to show me why.
If Hubble’s Law is true, then v = H * D, and when D = c / H, v = c; that is called the Hubble Limit. If you go far enough, galaxies will recede faster than the speed of light. That boundary defines the Hubble Volume.
In the Big Bang model, once a photon goes that far, space is expanding faster than light can travel, so it’s basically stuck at the Hubble Limit. There would be a bunch just adding up in that case.
In this model (which is not the 1930s Tired Light model), the photon hits a frequency of zero, and ceases to exist.
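For scale, here is where the Hubble Limit c/H sits numerically (my sketch; H = 70 km/s/Mpc is an assumed value):

```python
# The Hubble Limit c/H in megaparsecs and light-years.
C = 299_792.458        # km/s
H = 70.0               # km/s/Mpc (assumed)
MLY_PER_MPC = 3.2616   # million light-years per megaparsec

d_mpc = C / H
d_gly = d_mpc * MLY_PER_MPC / 1000.0
print(round(d_mpc), "Mpc =", round(d_gly, 1), "Gly")   # ~4283 Mpc, ~14.0 Gly
```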
Your “bunch just adding up” link is to a Q&A site’s question where you ask “would they just pile up?” and get the answer “no, not really”. (I paraphrase a little.)
The answer was that they’d have an infinite wavelength (as if that makes sense), so they wouldn’t literally be on a line in front of my face. But still, they’re there, all right: zero-frequency photons flying at c through space expanding faster than they can travel, forever (according to the expanding theory).
The point is, the frequency redshifts. It will hit zero. That’s consistent with what we observe out there, at what’s known as the Hubble Limit.
What if light just loses energy as it travels, so that the frequency shifts lower?
That seems like a perfectly natural solution. How do we know it isn’t true?
As gjm mentions, the general name for this sort of theory is “tired light.” And these theories have been studied extensively and they are broken.
We have a very accurate, very well-tested theory that describes the way photons behave, quantum electrodynamics. It predicts that photons in the vacuum have a constant frequency and don’t suddenly vanish. Nor do photons have any sort of internal “clock” for how long they have been propagating. As near as I can tell, any sort of tired light model means giving up QED in fairly fundamental ways, and the evidentiary bar to overturn that theory is very high.
Worse, tired light seems to break local energy conservation. If photons just vanish or spontaneously redshift, where does the energy go?
I can conceive of there being a tired light model that isn’t ruled out by experiment, but I would like to see that theory before I junk all of 20th century cosmology and fundamental physics.
Most scientific theories, most of the time, have a whole bunch of quirky observations that they don’t explain well. Mostly these anomalies gradually go away as people find bugs in the experiments, or take into account various effects they hadn’t considered. The astronomical anomalies you point to don’t seem remotely problematic enough to give up on modern physics.
The test between an expanding model and a non-expanding model is the Tolman surface brightness test.
It seems the expanding models predict a dimming exponent of 4, that is, (1+z)^4.
A non-expanding model would give an exponent of 1.
You could also make a model with frequency and energy decreasing because the speed of a photon is v = c - H * D.
In that case the duration of the photon’s trip through static space is equal to the duration of the same trip through expanding space. In this case, the exponent predicted is 3 (it loses one factor, since the rate at which photons arrive stays constant instead of decreasing).
The actual Tolman Surface Brightness test seems to yield an exponent of 3.
or rather an exponent of 2.6 to 3.3 depending on frequency range.
http://en.wikipedia.org/wiki/Tolman_surface_brightness_test
But another explanation besides the one given on Wikipedia is that the universe is not expanding but has another geometry (again, source: crackpot).
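For comparison, here is what each exponent claimed in this exchange does to surface brightness (a sketch of the arithmetic only; which exponent is right is exactly what is in dispute):

```python
# Surface-brightness dimming factor (1+z)^-n for each claimed exponent:
# n = 4 (expanding), n = 3 (the v = c - H*D model above), n = 1 (static).
for z in (0.5, 1.0, 2.0):
    dim4 = (1 + z) ** -4   # expanding space
    dim3 = (1 + z) ** -3   # the v = c - H*D model argued above
    dim1 = (1 + z) ** -1   # static, non-expanding space
    print(f"z={z}: n=4 -> {dim4:.3f}, n=3 -> {dim3:.3f}, n=1 -> {dim1:.3f}")
# The measured exponent range of 2.6-3.3 quoted above brackets n = 3.
```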
Under the assumption that photons don’t lose energy as they travel, right?
The speed? You’re modeling a change in speed by distance traveled?
Because the mathematics looks the same, would this be the same exponent for a model with f = f_0 - D * constant?
In the expanding model, there are four factors of (1+z). 1 for the decrease in energy, 1 for the reduced rate at which photons arrive, and 2 for the increased trip of the photons through space.
In a static model, say tired light from the 1930s, light still always traveled at c and never lost energy on its own; it was thought that maybe it hit dust. The dust would make the pictures blurry, which is not observed, so tired light (1930s) is ruled out.
What you and I seem to think is possible is almost always instantly called tired light and dismissed out of hand, despite being a different model.
The (1930s) tired light model has static space, and a photon always moving at c. It loses its energy via the hypothetical dust interactions, and that’s one factor of dimming. There are three missing.
Now consider a model where the photon’s speed is c - H * d. In this model, the energy decreases, and the time it takes for a photon to make the journey increases with distance.
In other words, in the “tired car” model (the analog of 1930s tired light), a car traveling at 60 mph takes an hour to travel to a location 60 miles away.
Then there is the standard model; call it “expanding road”. The car still travels at 60 mph, but the destination is receding away. Therefore, the trip takes longer than an hour.
Now, a novel model, the finite car model. Unlike the other models, the car itself can’t travel to infinity. The road isn’t expanding and the destination stays put, nothing gets in its way, but the car doesn’t travel forever, it (figuratively) runs out of gas and coasts.
If its speed is 60 mph - H * D, then it will take longer than an hour; the same amount as if the road were expanding.
Now imagine, you had 1000 of these cars, and you sent a new one toward the destination every 10 minutes.
If the road is not expanding and the cars are coasting, they still arrive 10 minutes apart. This model has three factors of (1+z).
If the road were expanding, the cars would reach the destination at increasingly larger intervals. The rate of their arrival is the 4th dimming factor.
An AskScienceDiscussion question I asked to verify this
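Here is the car arithmetic worked through (a toy sketch of my own; the value H = 0.5 per hour is assumed purely for illustration):

```python
# The "finite car": speed = 60 - H*d mph toward a destination 60 miles away
# on a static road. With H = 0.5/hr the coasting limit is 60/H = 120 miles.
import math

H = 0.5                      # per hour (assumed)
def arrival_time(miles):
    # speed = 60 - H*d  =>  d(t) = (60/H)(1 - e^{-Ht}); invert for t
    return -math.log(1.0 - H * miles / 60.0) / H

t = arrival_time(60.0)
print(round(t, 3), "hours")  # ~1.386 h, longer than the 1 h constant-speed trip

# On a static road every car takes the same ~1.386 h, so cars launched
# 10 minutes apart still arrive 10 minutes apart: the arrival rate is
# unchanged, which is why this model drops one factor of (1+z).
```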
We have slightly different models. You’ve obviously put more thought into yours, but I still like mine better, though I entirely admit I haven’t studied the implications of either.
Your model challenges two fundamental assumptions, and mine only does one.
For my model, the speed of light remains constant, but the energy of the photon decreases as it travels. A photon is a car fueled with itself, slowly burning itself up, though I’m not committed to it entirely burning itself up in the limit.
I wouldn’t think this would have anything to do with “dust”. Just travel through free space. I’m not explaining the effect, which I’d guess would require general relativity, just noting it as a possible mechanism for the observed red shift.
La Wik:
Following after Zwicky in 1935, Edwin Hubble and Richard Tolman compared recessional redshift with a non-recessional one, writing that they: … both incline to the opinion, however, that if the red-shift is not due to recessional motion, its explanation will probably involve some quite new physical principles [… and] use of a static Einstein model of the universe, combined with the assumption that the photons emitted by a nebula lose energy on their journey to the observer by some unknown effect, which is linear with distance, and which leads to a decrease in frequency, without appreciable transverse deflection.[16]
Sounds about right to me.
Does it go somewhere or you’re discarding the Conservation of Energy?
Exchange of momentum with the gravitational field?
I don’t understand this sentence. Do you want to say that light going through the gravitational field makes the gravity stronger..?
Sounds like Dan Davis means “turns into gravitons”.
It might be interesting to consider the physics world at about 1935, and then again at 1945.
I heard one narrative put it this way: the discovery of galaxies, and lots of them far away, generated quite a bit of interest, until everyone’s focus turned to war machines and nuclear bombs. When they returned to cosmology after the war, it was as if they said, “Where were we? Space was expanding? OK,” and then proceeded to work from there. An oversimplification, I’m sure.
What if light just loses energy as it travels, so that the frequency shifts lower?
By the way, that is one effect of the crackpot theory I posted below. Only it doesn’t “lose energy”: it kind of spreads it over a complex time component (thus geometrically the energy isn’t lost); only in our real projection of the time component does it appear to be lost.
Why does observing a finite amount of light from a finite distance contradict anything about the range of electromagnetic radiation?
(Also… has anyone read http://en.wikipedia.org/wiki/Redshift? It’s… well… good.)
I guess this is a reference to Olbers’ paradox. If every ray projected from a given point must eventually hit the surface of a star, then the night sky should look uniformly as bright as the Sun.
This ends up being somewhat circular then, doesn’t it?
Olbers’ paradox is only a paradox in an infinite, static universe. A finite, expanding universe explains the night sky very well. One can’t use Olbers’ paradox to discredit the idea of an expanding universe when Olbers’ paradox depends on the universe being static.
Furthermore, upon re-reading MazeHatter’s “The way I see it is...” comment, Theory B does not put us at some objective center of reality. An intuitive way to think about it is: Imagine “space” being the surface of a balloon. Place dots on the surface of the balloon, and blow the balloon up. The distance between dots in all directions expands. One can arbitrarily consider one dot as the “center,” but that doesn’t change anything.
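A quick numerical version of the balloon picture (my sketch; the growth rate is an arbitrary assumed value):

```python
# Scale comoving dots by a growing factor; the recession speed seen from
# ANY dot is proportional to distance from that dot.
dots = [0.0, 1.0, 2.0, 3.0, 4.0]   # comoving positions
a, a_dot = 1.0, 0.07               # scale factor and its growth rate (assumed)

for center in (0.0, 2.0):          # pick different dots as the "center"
    for x in dots:
        distance = a * abs(x - center)
        speed = a_dot * abs(x - center)   # d/dt of a*(x - center)
        # speed/distance = a_dot/a for every dot: a "Hubble law" around any center
        print(center, x, distance, round(speed, 3))
```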
I’m beginning to think that MazeHatter’s comments do not warrant as much discussion as has taken place in this thread. =\
Why does observing a finite amount of light from a finite distance contradict anything about the range of electromagnetic radiation?
Because the range of electromagnetic radiation is infinite. (And light is electromagnetic radiation, FYI.)
So that’s what we expected to see. Infinite light.
But that’s not what we saw.
Light does not come from 1 trillion light years away. It does not come from 20 billion light years away.
It makes it to Hubble’s Limit, c/H.
This wasn’t expected.
To explain its redshifting into nothing, one answer is that space is expanding; and if space is expanding uniformly (which we now know isn’t true by a long shot), then it would have begun expanding 13.8 billion years ago.
Therefore, in theory, only 13.8 billion years existed for light to travel. And that’s why you don’t seem to think there’s a problem: because you can solve it with some new logic.
Here’s the recap:
In theory, light travels to infinity
In observation, light comes from finite distances
So in theory space must expand (v_galaxy = H * D)
So in theory only a finite amount of time exists in physics
So in theory, no problem, we see finite light because of finite time
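For reference, a “~14 billion years” figure is, to first order, just the Hubble time 1/H (my sketch; H = 70 km/s/Mpc is assumed, and the actual 13.8 figure folds in deceleration and dark energy):

```python
# The Hubble time 1/H, ignoring acceleration and deceleration.
KM_PER_MPC = 3.086e19
SECONDS_PER_GYR = 3.156e16

H = 70.0                        # km/s/Mpc (assumed)
hubble_time_s = KM_PER_MPC / H  # (km/Mpc) / (km/s/Mpc) = seconds
print(round(hubble_time_s / SECONDS_PER_GYR, 1), "Gyr")   # ~14.0 Gyr
```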
Of course, the evidence against the 13.8 billion number is so overwhelming that they invented an inflation period to magically fast-forward through a trillion or more years of it.
Even then, all the examples in my OP describe how the theory still doesn’t work.
If the sun goes around the Milky Way once every 225 million years, then our galaxy has formed in less than 60 spins. Starting to wonder why cosmologists have no legitimate theory of galaxy formation? Now consider trying to explain a galaxy that looks like ours that formed in 20 spins. That’s what the new observations ask of us. Completely out of the question. Except, now we have dark matter, which can basically do anything arbitrarily, just like dark energy.
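The spin arithmetic itself is easy to check (a one-line sketch):

```python
# Solar orbits of the Milky Way that fit in the age of the universe.
age_yr = 13.8e9        # years, the figure under dispute here
orbit_yr = 225e6       # years per orbit, the figure used above
print(round(age_yr / orbit_yr, 1))   # ~61.3 orbits since t = 0; fewer since
                                     # the galaxy formed some time after that
```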
Here’s the alternative:
Observation 1. Light doesn’t travel to infinity
New Theory A. Light doesn’t travel to infinity (v_photon = c - H * D)
Crazy, I know.
Some people say, “Hey, that challenges relativity!” Well, it challenges the applicable limits of Maxwell’s equations, upon which relativity is based.
Some people thought Newtonian mechanics was how reality actually worked. We are now smarter, and we know Newtonian mechanics is an approximation of reality with a limited domain of applicability.
For some reason, though, the idea that relativity is an approximation with its own limited domain of applicability is scary to people.
It’s all part of this idea that you can believe science, because science questions itself. Yet once it’s called science, people are reluctant to question it.
So that’s what we expected to see. Infinite light.
Only if the universe is (not only not expanding as per current standard cosmological theories, but) infinite in both extent and age.
It makes it to Hubble’s Limit, c/H.
That’s a misleading way of putting it (as if the light gets some distance and then stops); that simply isn’t what standard physics and cosmology describe.
the evidence against the 13.8 billion number is so overwhelming
… that something like 99% of people who actually know a lot about physics and cosmology accept “the 13.8 billion number”.
our galaxy has formed in less than 60 spins. [...] that formed in 20 spins.
Why should that be a problem? What aspect of our galaxy do you think requires more than 60 spins, and why?
now we have dark matter, which can basically do anything arbitrarily, just like dark energy.
It’s like you aren’t even trying to say things that are true (or that anyone thinks are true).
the idea that relativity is an approximation [...] is scary to people.
You should consider the possibility that people might disagree with you for reasons other than fear.
That’s a misleading way of putting it (as if the light gets some distance and then stops); that simply isn’t what standard physics and cosmology describe.
What standard physics and cosmology (a galaxy recedes at v = H * D) describe is that at the distance D = c/H, a photon encounters space expanding faster than c.
It doesn’t “stop” in standard physics. It gets trapped in a region of space expanding faster than it can travel.
Which is somewhat absurd, if you consider that between your left eye and your right eye is space expanding faster than c, from the perspective of someone c/H to the left and the right.
… that something like 99% of people who actually know a lot about physics and cosmology accept “the 13.8 billion number”.
Maybe in 1995.
The original post deflates every piece of evidence for a Big Bang.
between your left eye and your right eye is space expanding faster than c, from the perspective of someone c/H to the left and the right.
No, space in the vicinity of your eyes is (so to speak) held together by gravity and will not be expanding at the Hubble rate.
Maybe in 1995.
You may perhaps be failing to distinguish between when you decided that standard cosmology is all wrong (which may for all I know be 1995) and when everyone else did (which they haven’t).
The original post deflates every piece of evidence for a Big Bang.
In your dreams.