Some recent evidence against the Big Bang
I am submitting this on behalf of MazeHatter, who originally posted it here in the most recent open thread. Go there to upvote if you like this submission.
Begin MazeHatter:
I grew up thinking that the Big Bang was the beginning of it all. In 2013 and 2014 a good number of observations have thrown some of our basic assumptions about the theory into question. There were anomalies observed in the CMB, previously ignored, now confirmed by Planck:
Another is an asymmetry in the average temperatures on opposite hemispheres of the sky. This runs counter to the prediction made by the standard model that the Universe should be broadly similar in any direction we look.
Furthermore, a cold spot extends over a patch of sky that is much larger than expected.
The asymmetry and the cold spot had already been hinted at with Planck’s predecessor, NASA’s WMAP mission, but were largely ignored because of lingering doubts about their cosmic origin.
“The fact that Planck has made such a significant detection of these anomalies erases any doubts about their reality; it can no longer be said that they are artefacts of the measurements. They are real and we have to look for a credible explanation,” says Paolo Natoli of the University of Ferrara, Italy.
… One way to explain the anomalies is to propose that the Universe is in fact not the same in all directions on a larger scale than we can observe. …
“Our ultimate goal would be to construct a new model that predicts the anomalies and links them together. But these are early days; so far, we don’t know whether this is possible and what type of new physics might be needed. And that’s exciting,” says Professor Efstathiou.
We are also getting a better look at galaxies at greater distances, thinking they would all be young galaxies, and finding they are not:
The finding raises new questions about how these galaxies formed so rapidly and why they stopped forming stars so early. It is an enigma that these galaxies seem to come out of nowhere.
http://carnegiescience.edu/news/some_galaxies_early_universe_grew_quickly
http://mq.edu.au/newsroom/2014/03/11/granny-galaxies-discovered-in-the-early-universe/
The newly classified galaxies are striking in that they look a lot like those in today’s universe, with disks, bars and spiral arms. But theorists predict that these should have taken another 2 billion years to begin to form, so things seem to have been settling down a lot earlier than expected.
B. D. Simmons et al. Galaxy Zoo: CANDELS Barred Disks and Bar Fractions. Monthly Notices of the Royal Astronomical Society, 2014 DOI: 10.1093/mnras/stu1817
http://www.sciencedaily.com/releases/2014/10/141030101241.htm
The findings cast doubt on current models of galaxy formation, which struggle to explain how these remote and young galaxies grew so big so fast.
http://www.nasa.gov/jpl/spitzer/splash-project-dives-deep-for-galaxies/#.VBxS4o938jg
It seems we don’t have to look so far away, though, to find evidence that galaxy formation is inconsistent with the Big Bang timeline.
If the modern galaxy formation theory were right, these dwarf galaxies simply wouldn’t exist.
Merritt and study lead Marcel Pawlowski consider themselves part of a small-but-growing group of experts questioning the wisdom of current astronomical models.
“When you have a clear contradiction like this, you ought to focus on it,” Merritt said. “This is how progress in science is made.”
http://arxiv.org/abs/1406.1799
Another observation is that lithium abundances are way too low for the theory in other places, not just here:
A star cluster some 80,000 light-years from Earth looks mysteriously deficient in the element lithium, just like nearby stars, astronomers reported on Wednesday.
That curious deficiency suggests that astrophysicists either don’t fully understand the big bang, or else don’t fully understand the way that stars work.
It also seems that structure is continually being discovered on scales larger than the Big Bang is thought to account for:
“The first odd thing we noticed was that some of the quasars’ rotation axes were aligned with each other—despite the fact that these quasars are separated by billions of light-years,” said Hutsemékers. The team then went further and looked to see if the rotation axes were linked, not just to each other, but also to the structure of the Universe on large scales at that time.
“The alignments in the new data, on scales even bigger than current predictions from simulations, may be a hint that there is a missing ingredient in our current models of the cosmos,” concludes Dominique Sluse.
http://www.sciencedaily.com/releases/2014/11/141119084506.htm
D. Hutsemékers, L. Braibant, V. Pelgrims, D. Sluse. Alignment of quasar polarizations with large-scale structures. Astronomy & Astrophysics, 2014
Dr Clowes said: “While it is difficult to fathom the scale of this LQG, we can say quite definitely it is the largest structure ever seen in the entire universe. This is hugely exciting—not least because it runs counter to our current understanding of the scale of the universe.”
http://www.sciencedaily.com/releases/2013/01/130111092539.htm
These observations have been made just recently. It seems that in the 1980′s, when I was first introduced to the Big Bang as a child, the experts in the field knew then there were problems with it, and devised inflation as a solution. And today, the validity of that solution is being called into question by those same experts:
In light of these arguments, the oft-cited claim that cosmological data have verified the central predictions of inflationary theory is misleading, at best. What one can say is that data have confirmed predictions of the naive inflationary theory as we understood it before 1983, but this theory is not inflationary cosmology as understood today. The naive theory supposes that inflation leads to a predictable outcome governed by the laws of classical physics. The truth is that quantum physics rules inflation, and anything that can happen will happen. And if inflationary theory makes no firm predictions, what is its point?
http://www.physics.princeton.edu/~steinh/0411036.pdf
What are the odds that 2015 will be more like 2014, when we (again) found larger and older galaxies at greater distances, rather than like 1983?
The author appears to confuse “theory T is part of a larger theory U which has bits we don’t understand and bits we haven’t got quite right yet” with “theory T is probably wrong”.
Here theory T is “several billion years ago the universe underwent a very rapid increase in size and decrease in temperature, from something extremely hot and dense to something much more like what we see now”, a.k.a. “the big bang”, and theory U is a detailed account of current best theories of inflation, star formation, galaxy formation, dark matter, dark energy, etc., etc., etc.
One can see exactly the same pattern with T being “over many generations, living things evolve both at random and to adapt to their environments, and this is how the biosphere got where it now is”, U being a detailed account of current best theories of evolutionary biology, palaeontology, etc., etc., etc. The people making the argument in that case are creationists; they do not understand the relevant theories very well, and by pointing out areas of doubt and difficulty they are not in fact doing anyone any good.
I don’t think MazeHatter is a creationist (though I think it’s possible, and some of his/her past comments do seem like they lean in that direction) but I do think s/he’s adopting their tactics, and the result isn’t any more helpful here than it is there. Sure, it’s possible that the whole “big bang” idea might turn out to be wrong, but the fact that you can find a scattering of things we don’t understand about the early universe isn’t much evidence for that and has to be weighed against all the things that current cosmological theories don’t get wrong, which of course you don’t hear about much because “current best scientific theory predicts roughly what is observed” usually isn’t news.
[EDITED to remove gratuitous assumption that MazeHatter is male.]
The way I see it is this:
Theory A. Light travels to infinity without losing energy
Observation 1. Light redshifts and does not seem to travel to infinity
Theory B. The galaxies are receding, redshifting the light
Theory C. Since theory B puts us at the center of reality, let’s make it so space is expanding
Theory D. If space is expanding, in the past it was smaller, and there was a beginning of time
Observation 2. The horizon problem.
Theory E. Everything interacted with everything, then hyper-expanded, then stopped hyper expanding
(skip some steps)
Theory M: Dark energy did it
(skip some steps)
Theory T: In the multiverse, anything can happen, even different laws of physics (that makes the multiverse of inflation way different than Everettian QM, which operates on the wave equation), and an infinite number of universes are made, and ours happens to do this cool dark energy inflation stuff
Observation 3: BICEP2 found dust.
What I’m questioning is Theory A.
Theory A says light travels forever, which was established in the 1800′s before we knew there were galaxies other than the Milky Way, before we detected redshift.
We observe a finite amount of light from finite distances.
That’s an empirical fact.
That is to say, the empirical and theoretical range of electromagnetic radiation are not in agreement.
The articles I posted cast great doubt on the CMB being what it is claimed to be, on the existence of a “young universe”, and on the nucleosynthesis, and they point to growing skepticism over the scientific value of inflation.
What else does the Big Bang have to rest on?
“Is there a better theory?”
Take Hubble’s Law, v = H D, where v is the apparent recessional velocity of a galaxy at distance D, and H is Hubble’s Parameter. If the apparent recessional velocity is only apparent, and not actual, we could actually take that term (H D) from the distant galaxy (such that its v = 0) and, say, put it into the frequency (f) of the photon: f = (c - H * D) / w, where w is the wavelength at which it was emitted.
In this theory, redshift is a feature of light itself.
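Here’s a rough numerical sketch of what that relation implies (purely illustrative; the function name and the value H ≈ 70 km/s/Mpc are mine, just to set the scale):

```python
# Toy sketch of the relation above, f = (c - H*D) / w, in SI units.
# The constants are illustrative only (H taken as ~70 km/s/Mpc).
C = 2.998e8              # speed of light, m/s
H = 70e3 / 3.086e22      # ~2.3e-18 per second (70 km/s/Mpc converted to SI)

def proposed_frequency(emitted_wavelength_m, distance_m):
    # Frequency falls linearly with distance travelled and hits zero at D = c/H.
    return (C - H * distance_m) / emitted_wavelength_m

hubble_length_m = C / H                   # where the proposed frequency reaches zero
print(hubble_length_m / 9.461e15 / 1e9)   # ~14 billion light-years
print(proposed_frequency(500e-9, 0.0))    # ~6.0e14 Hz for green light emitted nearby
```

In this picture the photon simply runs out of frequency at roughly the same distance scale we call the Hubble limit.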
I know this lends itself to many criticisms and open questions; however, no one has provided Dark Energy, and supposedly it makes up 68% of the observable universe, which doesn’t seem consistent with observation.
Here’s something else I found, when looking into it:
— E. Hubble, Monthly Notices of the Royal Astronomical Society, 97, 506, 1936
I also found that the temperature of space was predicted to be 3K by Arthur Eddington in 1926. The Big Bang’s predictions were nowhere near that, and when the CMB was discovered in the 60′s at 3K, the discovery was claimed to be evidence of an expanding universe.
If cosmology were the reason computers work or airplanes fly, or had ever been corroborated by a lab experiment, then I probably wouldn’t be questioning it. But it takes place millions of light years away, millions of years ago. And the experts in the articles I cited were calling it into question.
Is there a cognitive bias to have a creation myth?
Hey, another crazy person like me. Now we are two.
I’ve had a similar take on it for a long time. It seems like the expansion is an attempt to explain observed red shifts, necessitating an increasingly convoluted theory to explain other observations.
The observation is, the farther away, the greater the shift, in a linear fashion.
f = f_0 - D * Constant
What if light just loses energy as it travels, so that the frequency shifts lower?
That seems like a perfectly natural solution. How do we know it isn’t true?
What would be the implications to the current theories if it were true?
This class of theory goes by the name of “Tired light”. It seems as if every theory of this kind precise enough to make definite predictions has been pretty clearly falsified, but I’m not an expert on this stuff.
A relation of the form f = f0 - constant*distance will send the frequency to zero (and then out the other side) once the distances get large enough. You probably don’t want that.
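To make that concrete, a quick sketch (all the numbers are arbitrary, chosen only so the zero crossing lands at a cosmologically interesting distance):

```python
# A linear law f = f0 - k*d reaches zero at d = f0/k and keeps going negative.
f0 = 5.45e14               # emitted frequency, Hz (green light)
k = f0 / 1.3e26            # hypothetical loss rate, Hz per metre

for d in (0.5e26, 1.3e26, 2.0e26):   # distances in metres
    print(d, f0 - k * d)             # positive, zero, then negative
```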
In the limit, yes. I have no prior that says this is a problem.
Also, as one approaches such a limit, I wouldn’t be surprised to see other terms come into play.
Note that no actual infinite limits are required. Just large but finite distances.
I think you should have. One of three problems, depending on what you expect to happen. (1) If you expect something more complicated to happen for large distances, your theory is more complicated than it initially looks. Doesn’t your prior favour simpler theories? (2) If you expect the frequency to pass through zero and continue, your theory will have to (2a) explain what negative frequencies actually mean, why frequency -f isn’t just the same as frequency +f with different phase, why we never see anything with negative frequency, etc., or else (2b) if instead it says that negative frequencies are the same as positive, then explain what happens to the frequency after it crosses zero (gets more negative? then -f isn’t the same as +f after all. gets less negative? then what we actually end up is a really weird discontinuity at the zero crossing). Again, all this stuff is extra complexity. (3) If you expect the issue not to arise because nature somehow ensures that light never travels far enough for the frequency to reach zero, then your theory needs to explain how that happens. Extra complexity again.
This sounds like case 1 above.
I expect infinite complexity, but pick the simplest model to account for the currently known data. Keep on expanding the range of applicability, and I expect to see new effects that aren’t accounted for in models validated over a more restricted range of data.
Reality is more complicated than it looks, and I don’t expect that to end.
No, I don’t expect negative frequencies, as frequency goes down, energy goes down, and I expect quantum effects to take hold as energy approaches zero. You can call that “extra complexity”, but we already know there are quantum effects in general.
OK. Does that stop you regarding a theory as more credible when it’s simpler (for equal fidelity to observed evidence)?
Everything is quantum effects. Do you have more specific expectations?
It’s more credible in the range of data for which its fidelity has shown it to be more credible. I expect extrapolations outside that range to have less fidelity.
No.
I don’t have some grand unified theory.
I just observe that a lot of cosmology seems to be riding on the theory that the red shift is caused by an expanding universe.
Note that I ended my first post with questions, not with claims.
This seems wrong to me. There are at least two independent lines of evidence for the Big Bang theory besides redshifts—isotope abundances (particularly for light elements) and the cosmic background radiation.
We would have to abandon our belief in energy conservation. And we would then wonder why energy seems to be conserved exactly in every interaction we can see. Also we would wonder why we see spontaneous redshifts not spontaneous blue shifts. Every known micro-scale physical process in the universe is reversible [1], and by the CPT theorem, we expect this to be true always. A lot would have to be wrong with our notions of physics to have light “just lose energy.”
This solution requires light from distant galaxies to behave in ways totally different from every other physical process we know about—including physical processes in distant galaxies. It seems unnatural to say “the redshift is explained by a totally new physical process, and this process violates a lot of natural laws that hold everywhere else.”
[1] I should say, reversible assuming you also flip the charges and parities. That’s irrelevant here, though, since photons are uncharged and don’t have any special polarization.
Something already does happen for large distances.
That’s an observable fact. It’s redshift.
What causes it?
The standard answer is expansion, which needs inflation and dark energy and an arbitrary multiverse to do that. All things that make the theory more complicated with distance.
Alternatively, what if light doesn’t travel forever?
How would such a reality look if things existed farther away than light could travel?
Is it not exactly what is observed?
What is more complex,
0 frequency photons ceasing to be photons,
or
infinite wavelength photons everywhere never interacting with anything in space expanding faster than c?
If it’s a matter of complexity, the 0 frequency photons ceasing to exist is less complex than there being infinite wavelength photons everywhere.
The thing you need to evaluate the complexity of is an actual theory, with equations and everything, not a vague suggestion that maybe photons lose energy as they travel.
I don’t think the conventional theory says you have infinite-wavelength photons, and I think your thought experiment with your two hands is wrong. Light from an object at the Hubble limit not only never reaches us, but also never reaches (say) a point 1m “inward” from us. It never gets any closer to us. Note that this is not the same as saying that light from 1m less than the Hubble limit never reaches us, which of course is false.
We get arbitrarily long-wavelength photons, if we wait long enough, but we have to wait longer for the longer-wavelength ones and we would have to wait infinitely long to get ones of infinite wavelength.
The actual theory, equation and everything, is that distant galaxies do not recede at v = H D (Hubble’s Old Law); instead a photon travels at v = c - H d (let’s call it Hubble’s New Law).
That’s the cause of redshift.
At c/H, in both the old and new law, the frequency of the photon reaches 0.
In the old law, it remains to encounter ever increasing space, getting farther from the things in front of it. What happens at this point is poorly addressed by the standard model, for obvious reasons (it’s ridiculous).
In Hubble’s New Law, the photon has lost all energy, and thus there is no photon.
Old = photons piling up in ever increasing space. do they redshift into the negative frequency?
New = photons redshift to zero then are gone
That’s not the actual theory. It’s a tiny fraction of a theory. It’s not even clear that it makes sense. (What exactly is d here? Total distance travelled by the photon since it first came into being, I guess. But what exactly does that mean? For instance, can there be interference between two photons with different values of d, and if so what happens?)
In your theory, photons travel slower than c, the exact speed depending on their “d” value. That’s going to mess up pretty much everything in quantum electrodynamics, so what do you put in its place?
In your theory, photons pop out of existence when their velocity and frequency reach zero. Again, that violates local conservation of energy and CPT invariance and so forth; again, how are you modifying the fundamentals of conventional physics to deal with this?
But “at c/H” the photons never reach us because their source is receding from us at the speed of light. We never see zero-frequency photons. We do see substantially-lowered-frequency photons (once the light has had long enough to reach us despite the recession).
This doesn’t appear to me to be anything like correct. It is getting closer to the things in front of it—apart from ones that are receding from it faster than the speed of light, which are (and remain) far away from it. What’s the problem here supposed to be?
I repeat: I do not think the conventional theory does say anything like “photons piling up in ever increasing space”. Of course it’s possible that my analysis is wrong; feel free to show me why.
If Hubble’s Law is true, then v = H * D, and when D = c / H, v = c, and that is called Hubble Limit. If you go far enough, galaxies will recede faster than the speed of light. That boundary defines the Hubble Volume.
In the Big Bang model, once a photon goes that far, space is expanding faster than light can travel, it’s basically stuck at Hubble’s Limit. There would be a bunch just adding up in that case.
In this model (which is not the 1930′s Tired Light model), the photon hits a frequency of zero, and ceases to exist.
Your “bunch just adding up” link is to a Q&A site’s question where you ask “would they just pile up?” and get the answer “no, not really”. (I paraphrase a little.)
The answer was that they’d have an infinite wavelength (as if that makes sense) so they wouldn’t literally be on a line in front of my face, but yet, they’re there, alright, zero frequency photons flying at c in space expanding faster than they can travel, forever (according to the expanding theory).
Point is, the frequency redshifts. It will hit zero. That’s consistent with what we observe out there. Known as Hubble’s Limit.
As gjm mentions, the general name for this sort of theory is “tired light.” And these theories have been studied extensively and they are broken.
We have a very accurate, very well-tested theory that describes the way photons behave, quantum electrodynamics. It predicts that photons in the vacuum have a constant frequency and don’t suddenly vanish. Nor do photons have any sort of internal “clock” for how long they have been propagating. As near as I can tell, any sort of tired light model means giving up QED in fairly fundamental ways, and the evidentiary bar to overturn that theory is very high.
Worse, tired light seems to break local energy conservation. If photons just vanish or spontaneously redshift, where does the energy go?
I can conceive of there being a tired light model that isn’t ruled out by experiment, but I would like to see that theory before I junk all of 20th century cosmology and fundamental physics.
Most scientific theories, most of the time, have a whole bunch of quirky observations that they don’t explain well. Mostly these anomalies gradually go away as people find bugs in the experiments, or take into account various effects they hadn’t considered. The astronomical anomalies you point to don’t seem remotely problematic enough to give up on modern physics.
The test between an expanding model and a non-expanding model is the Tolman Surface Brightness test.
It seems the expanding models predict a dimming exponent of 4, that is (1+z)^4.
A non-expanding model would give an exponent of 1.
You could also make a model with frequency and energy decreasing because the speed of a photon is v = c - H * D.
In that case the trip the photon takes through static space is equal to the duration of the same trip through expanding space. In this case, the exponent predicted is 3 (it loses 1 exponent since the rate of photons stays constant instead of decreasing).
The actual Tolman Surface Brightness test seems to yield an exponent of 3.
or rather an exponent of 2.6 to 3.3 depending on frequency range.
http://en.wikipedia.org/wiki/Tolman_surface_brightness_test
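To get a feel for how different those predictions are, here is a toy table of the dimming factor (1+z)^n at a few example redshifts (the z values are arbitrary; the measured exponent quoted above, 2.6 to 3.3, sits near the n = 3 column):

```python
# Dimming factor (1+z)**n for the exponents discussed above:
# n = 1 (non-expanding), n = 3 (the intermediate model), n = 4 (expanding).
for z in (0.5, 1.0, 2.0):
    print(z, {n: round((1 + z) ** n, 1) for n in (1, 3, 4)})
```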
but another explanation than the one given on Wikipedia is that the universe is not expanding but has another geometry (again source: crackpot).
Under the assumption that photons don’t lose energy as they travel, right?
The speed? You’re modeling a change in speed by distance traveled?
Because the mathematics looks the same, would this be the same exponent for a model with:
In the expanding model, there are four factors of (1+z). 1 for the decrease in energy, 1 for the reduced rate at which photons arrive, and 2 for the increased trip of the photons through space.
In a static model, let’s say tired light from the 1930′s, light still always traveled at c and never lost energy on its own; it was thought maybe it hit dust. The dust makes the pictures blurry, which would be observed, so tired light (1930′s) is ruled out.
What you and I seem to think is possible is almost always instantly called tired light and dismissed from mind, despite being different models.
The (1930′s) tired light model has static space, and a photon always moving at c. It loses its energy via the hypothetical dust interactions and that’s one factor of dimming. There are three missing.
Now consider a model where the photon’s speed is c - H * d. In this model, the energy decreases, and the time it takes for a photon to make the journey increases with distance.
In other words, in the “tired car” model (analog to 1930′s tired light), it takes a car traveling at 60 mph an hour to travel to a location 60 miles away.
Then there is the standard model, call it “expanding road”. The car still travels at 60 mph, but the destination is receding away. Therefore, the trip is longer than an hour.
Now, a novel model, the finite car model. Unlike the other models, the car itself can’t travel to infinity. The road isn’t expanding and the destination stays put, nothing gets in its way, but the car doesn’t travel forever, it (figuratively) runs out of gas and coasts.
If its speed is 60 mph - H * D, then it will take longer than an hour. The same amount as if the road were expanding.
Now imagine, you had 1000 of these cars, and you sent a new one toward the destination every 10 minutes.
If the road is not expanding, and the cars are coasting along it, they still coast in 10 minutes apart. This model has 3 factors of (1+z).
If the road were expanding, the cars would reach the destination at increasingly larger intervals. The rate of their arrival is the 4th dimming factor.
An AskScienceDiscussion question I asked to verify this
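Here’s a small numerical sketch of the finite-car picture (toy numbers; the slowdown constant is just something I picked, and the point is only how the trip time and the arrival spacing behave):

```python
import math

V0 = 60.0   # mph: the car's speed when it sets out
K = 0.5     # per hour: hypothetical slowdown constant, so speed = V0 - K * x
L = 60.0    # miles to the destination

# From dx/dt = V0 - K*x we get x(t) = (V0/K) * (1 - exp(-K*t)),
# so the time to cover L miles (valid while K*L < V0) is:
trip_time_h = -math.log(1.0 - K * L / V0) / K
print(trip_time_h)   # ~1.39 hours, versus exactly 1.0 at a constant 60 mph

# Cars launched 10 minutes apart all obey the same speed law,
# so on a non-expanding road they still arrive 10 minutes apart.
launches_h = [0.0, 10 / 60, 20 / 60]
arrivals_h = [t + trip_time_h for t in launches_h]
print([round(b - a, 3) for a, b in zip(arrivals_h, arrivals_h[1:])])
```

On the expanding road the gaps between arrivals would stretch as well, which is where the fourth dimming factor described above comes from.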
We have slightly different models. You’ve obviously put more thought into yours, but I still like mine better, though I entirely admit I haven’t studied the implications of either.
Your model challenges two fundamental assumptions, and mine only does one.
For my model, the speed of light remains constant, but the energy of the photon decreases as it travels. A photon is a car fueled with itself, slowly burning itself up, though I’m not committed to it entirely burning itself up in the limit.
I wouldn’t think this would have anything to do with “dust”. Just travel through free space. I’m not explaining the effect, which I’d guess would require general relativity, just noting it as a possible mechanism for the observed red shift.
La Wik:
Sounds about right to me.
Does it go somewhere or you’re discarding the Conservation of Energy?
Exchange of momentum with the gravitational field?
I don’t understand this sentence. Do you want to say that light going through the gravitational field makes the gravity stronger..?
Sounds like Dan Davis means “turns into gravitons”.
It might be interesting to consider the physics world at about 1935, and then again at 1945.
I heard one narrative put it in such a way that these discoveries of galaxies, and lots of them far away, had quite a bit of interest, until everyone’s focus became war machines and nuclear bombs. When they returned to cosmology after the war, it was as if they said, “where were we, space was expanding? ok” and then proceeded to work from there. An oversimplification I’m sure.
By the way. That is one effect of the crackpot theory I posted below. Only it doesn’t ‘lose energy’. It kind of spreads it over a complex time component (thus geometrically the energy isn’t lost); only in our real projection of the time component does it appear to be so.
Why does observing a finite amount of light from a finite distance contradict anything about the range of electromagnetic radiation?
(Also… has anyone read http://en.wikipedia.org/wiki/Redshift? It’s… well… good.)
I guess this is a reference to Olbers’ paradox. If every ray projected from a given point must eventually hit the surface of a star, then the night sky should look uniformly as bright as the Sun.
This ends up being somewhat circular then, doesn’t it?
Olbers’ paradox is only a paradox in an infinite, static universe. A finite, expanding universe explains the night sky very well. One can’t use Olbers’ paradox to discredit the idea of an expanding universe when Olbers’ paradox depends on the universe being static.
Furthermore, upon re-reading MazeHatter’s “The way I see it is...” comment, Theory B does not put us at some objective center of reality. An intuitive way to think about it is: Imagine “space” being the surface of a balloon. Place dots on the surface of the balloon, and blow the balloon up. The distance between dots in all directions expands. One can arbitrarily consider one dot as the “center,” but that doesn’t change anything.
I’m beginning to think that MazeHatter’s comments do not warrant as much discussion as has taken place in this thread. =\
Because the range of electromagnetic radiation is infinite. (And light is electromagnetic radiation, FYI)
So that’s what we expected to see. Infinite light.
But that’s not what we saw.
Light does not come from 1 trillion light years away. It does not come from 20 billion light years away.
It makes it to Hubble’s Limit, c/H.
This wasn’t expected.
To explain its redshifting into nothing, one answer is that space is expanding, and if space is expanding uniformly (which we now know isn’t true by a long shot), then it would have begun expanding 13.8 billion years ago.
Therefore, in theory, only 13.8 billion years existed for light to travel. And that’s why you don’t seem to think there’s a problem. Because you can solve it with some new logic:
Here’s the recap:
Of course, the evidence against the 13.8 billion number is so overwhelming, they invented an inflation period to magically fast forward through a trillion or more years of it.
Even then, all the examples in my OP describe how the theory still doesn’t work.
If the sun goes around the Milky Way once every 225 million years, then our galaxy has formed in less than 60 spins. Starting to wonder why cosmologists have no legitimate theory of galaxy formation? Now consider trying to explain galaxies that look like ours that formed in 20 spins. That’s what the new observations ask of us. Completely out of the question. Except, now we have dark matter, which can basically do anything arbitrarily, just like dark energy.
Here’s the alternative:
Crazy, I know.
Some people say “hey, that challenges relativity!”, well, it challenges the applicable limits of Maxwell’s equations, upon which relativity is based.
Some people thought Newtonian Mechanics was how reality actually worked. We are now smarter, and we know Newtonian Mechanics is an approximation of reality with a limited domain of applicability.
For some reason, though, the idea that relativity is an approximation that has its own limited domain of applicability, is scary to people.
It’s all part of this idea, that you can believe science, because science questions itself. Yet once it’s called science, people are reluctant to question it.
(edited for formatting)
Only if the universe is (not only not expanding as per current standard cosmological theories, but) infinite in both extent and age.
That’s a misleading way of putting it (as if the light gets some distance and then stops); that simply isn’t what standard physics and cosmology describe.
… that something like 99% of people who actually know a lot about physics and cosmology accept “the 13.8 billion number”.
Why should that be a problem? What aspect of our galaxy do you think requires more than 60 spins, and why?
It’s like you aren’t even trying to say things that are true (or that anyone thinks are true).
You should consider the possibility that people might disagree with you for reasons other than fear.
What standard physics and cosmology (a galaxy recedes at v = HD) describe is that at that distance D = c/H, a photon encounters space expanding faster than c.
It doesn’t “stop” in standard physics. It gets trapped in a region of space expanding faster than it can travel.
Which is somewhat absurd, if you consider that between your left eye and your right eye is space expanding faster than c, from the perspective of someone c/H to the left and the right.
Maybe in 1995.
The original post deflates every piece of evidence for a Big Bang.
No, space in the vicinity of your eyes is (so to speak) held together by gravity and will not be expanding at the Hubble rate.
You may perhaps be failing to distinguish between when you decided that standard cosmology is all wrong (which may for all I know be 1995) and when everyone else did (which they haven’t).
In your dreams.
Unrelated to my other comment: this is a decent sample of open problems related to structure formation and element composition in the early universe. Cosmic inflation is an umbrella term for a number of models designed to solve the horizon and flatness problems by a rapid expansion of many orders of magnitude shortly after the big bang. Several predictions of such models have been confirmed, while some others falsified, though with low confidence (e.g. the quadrupole moment in the CMB spectrum). Fine tuning of inflationary parameters is aesthetically the most unsatisfactory feature of inflation, but any known alternative is even worse. I am not qualified to judge the validity of Steinhardt’s claims, but other experts in the field don’t seem as pessimistic.
The rapid expansion happens because of some mysterious dark energy that is apparently far more abundant than the energy we observe all around us, for no other reason than that maybe, if anything can happen (multiverse), the entire cosmic web gets produced in the blink of an eye.
You probably mean “was more abundant”. Dark energy density (well, stress-energy density, because it’s the stress, not energy which causes expansion) due to some hypothetical field is the mechanism for most models of inflation. What we see now is just pitiful leftovers.
That’s not a good argument to reject a model. If you have something with predictive power, you use it, until something better comes along.
I don’t know what you mean by this.
Again, may I recommend that you learn a topic before making pronouncements about it?
What I mean is that there seems to be larger scale structure, a giant web of filaments and walls and voids, and it’s becoming apparent that this basically existed intact as far as the eye can see.
How’d it get there? Inflation. Which makes the blink of an eye look like eternity.
Basically inflation, dark energy, and the multiverse can arbitrarily accomplish anything at this point. That was the point of the Inflation Debate article. It doesn’t literally predict anything in particular. We just observe things and try to make models that fit, and say “the multiverse, crazy, huh?”
As other posters have noted, this is not a case of overturning the ‘big bang’ in the sense that there was a time about 13 gigayears in the past when the universe was dense and hot and it has expanded since. Inflation (its veracity and the details thereof if it did indeed happen) and early galaxy formation are active areas of research.
I remember when that was discovered! It is very interesting. That and the ‘dark flow’, an apparent bias in the movement of galaxies across the visible universe. I have wondered if this apparent inhomogeneity in the early universe reduces the need for inflationary cosmology in the first place to explain what is observed? It was posited in the first place to explain extreme sameness in all directions of the CMB. That it runs counter to certain cosmological nuances does not go against the evidence that the universe is expanding and used to be dense and hot.
They aren’t noting their age; they are noting their star-forming status. And finding that many finished star formation much earlier than previous models suggested, which suggests models of galaxy and star formation are wrong. It should be noted that it does not take long after star formation stops for galaxies to lose their ‘young’ gleam—star brightness increases with the 3.5th power of mass and so star lifetime decreases with the 2.5th power of mass. A star 4x as massive as the sun will be 128x as bright and only live for a bit over 300 megayears. The thin scum of the largest stars (up to 100 solar masses) dominates the light produced by any given galaxy and goes away quite fast after star formation ceases, cosmologically speaking.
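A quick back-of-the-envelope check of those numbers (the only extra input is the Sun’s roughly 10-gigayear main-sequence lifetime; the exponents are the rough scalings quoted above):

```python
SUN_LIFETIME_GYR = 10.0   # rough main-sequence lifetime of the Sun, gigayears

def relative_brightness(mass_in_suns):
    # Luminosity scales roughly as mass**3.5 for stars around a solar mass.
    return mass_in_suns ** 3.5

def lifetime_gyr(mass_in_suns):
    # Lifetime ~ fuel / burn rate ~ mass / luminosity ~ mass**-2.5.
    return SUN_LIFETIME_GYR * mass_in_suns ** -2.5

print(relative_brightness(4.0))   # ~128x as bright as the Sun
print(lifetime_gyr(4.0) * 1000)   # ~310 megayears
```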
It should also be noted that there IS a very distinct arrow-of-time in star formation rates that points at the early universe being very different than the late universe: ARXIV, PDF Popular press
To quote myself from a previous post I made:
This is a very strong tendency through time for star formation rates to be high early and tail off. (NOTE: if space weren’t expanding, that star formation rate difference would be even higher than the above figure given the observations made.) Combined with the fact that there don’t seem to be any stars older than the age of the cosmic microwave background radiation… yeah, something big happened 13 gigayears ago.
To my mind the cosmic microwave background radiation, the observation of the slow reionization of the universe from neutral hydrogen to hydrogen plasma over the course of hundreds of megayears after the first quasars, and the fact that if you fit models of space expansion and run them back to t=0 and calculate the densities and temperatures thereof and simulate nuclear reactions of the first few minutes after t=0 you can get out the primordial hydrogen and helium and isotope ratios that are observed (lithium 7 showing difficulties with the models with several possible implications which are being explored, including possibly relaxing the assumption of homogeneity) are the best pieces of evidence that about 13 gigayears ago there was a major event where a dense hot universe began expanding into a cool empty one. That and the redshift. Basically, something banged, and they’re still trying to figure out what did and why, and lots of details of the early universe are still being worked out. As is the apparent fact that the expansion is accelerating, and the fact that the best model so far for predicting the motions and spins of galaxies requires nonbaryonic mass.
Additionally, to my mind, any speculation as to if the big bang was truly the ‘beginning of time’, or one among many, or symmetrical about t=0, or all manner of other things is premature as there is insufficient data as to the nature of the event in the first place. The fact that an event happened seems clear though.
Why doesn’t this just mean that we are moving w.r.t. the rest frame of the CMB? The signal is redshifted in the hemisphere we’re moving away from, and blueshifted in the hemisphere we’re moving towards, so it would look hotter in the hemisphere we’re moving towards.
Yes, that is precisely correct. This is one anisotropy that you expect to see in any model, because it’s a fact about us, not about the universe.
The higher modes, however, suggest that the Big Bang wasn’t homogeneous and isotropic. That doesn’t make it not a BIG BANG in a general sense, nor not the Big Bang in the technical sense. It just means that there was more going on than we knew about. We already knew that.
The anisotropy you speak of from motion does exist. When you extract such a dipole from the data and renormalize though (you can very precisely calculate what the effects of motion would be and fit the direction to the radiation we observe) an asymmetry remains along a different axis.
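For scale, here’s a rough sketch of the size of that motion-induced dipole, using the commonly quoted figure of roughly 370 km/s for our velocity relative to the CMB frame (the exact value doesn’t matter for the point):

```python
C_KM_S = 299_792.458   # speed of light, km/s
T_CMB_K = 2.725        # mean CMB temperature, kelvin
V_KM_S = 370.0         # approximate solar-system speed relative to the CMB frame

# To first order the motion-induced dipole is delta_T ~ (v/c) * T.
dipole_mK = (V_KM_S / C_KM_S) * T_CMB_K * 1000
print(dipole_mK)   # ~3.4 millikelvin: the dipole that gets fitted and removed
```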
Astronomy is extremely difficult. We don’t know the relevant fundamental physics, and we can’t perform direct experiments on our subjects. We should expect numerous problems with any cosmological model that we propose at this point. The only people who are certain of their cosmologies are the religious.
You need to do a lot more work for this sort of post to be useful. Cherry-picking weak arguments spread across the entire field of astronomy isn’t enough.
There are some areas where rigorous analysis can be made. No need to cherry pick some sample points. Actually lots of dark matter physics and cosmic inflation modelling apparently uses cherry picking too to arrive at their results. The Sloan Digital Sky Survey provides such an immense trove of data on galaxies very far back into the past (up to z>10) that statistical analysis with millions of galaxies is possible. Basically ‘all there are’. Enough for a crackpot to use it to ‘disprove’ inflation:
SDSS Renaisance—end of the ‘dark age’ in cosmology
I’m not sure whether I should recommend reading that. It is based on lots of real data and shows clear failures of LCDM on that but in the end it all drives toward the author’s (apparently a crackpot) pet theory of a static universe based on a ‘small’ correction to general relativity (complex time).
At this point I have to stop and ask for your credentials in astronomy. The link you posted reeks strongly of crackpot, and it’s most likely not worth my time to study. Maybe you’ve studied cosmology in detail and think differently? If you think the author is wrong about their pet theory of general relativity, why do you think they’re right in their disproof of LCDM?
I don’t know whether his theory is wrong. In the end I’m not qualified to make that claim.
Despite all the crackpottery of the author (“‘dark’ age”, “Einsteins blunder”...) there are some things that he does differently than other crackpots. He doesn’t resort to interpreting the words instead of the math of physics, nor does he avoid making testable predictions, nor does he cherry-pick or creatively reinterpret data. For the SDSS data, less so than serious astrophysicists, apparently.
His curves, which fit the raw SDSS data quite well, seem to be derived from an unusual but ‘simple’ spacetime geometry and are not notably fitted to parameters—quite the opposite of the LCDM curves, which he took from standard sources (those only fit comparatively cherry-picked galaxies). Thus, judging from raw SDSS data, LCDM could be considered severely challenged.
He also gives all the sources he uses, the SDSS queries used in the graph generation, how the magnitudes are calculated, how the results apply in different spectral lines. All relevant considerations you’d rather expect in serious work. It is only tainted by his extraordinary claims, his ego and other crackpottery traits (like making grand generalizations about everything).
Great! It will certainly be accepted for publication in a peer-reviewed journal. The author will most likely win a Nobel Prize for his work and be hired to work at the top institution of his choice.
Yeah. One probably can read that PDF only if one is devoid of status-regulating emotions.
Totally. That’s why I added the disclaimer. I edited it a bit to make that more clear. The author matches all criteria for crackpot no doubt. But even a crackpot can stumble upon something.
I do not have credentials in astronomy. I’m somewhat well-read in the subject and can handle sufficient math. And when checking the presented data (I did actual SDSS queries; the SDSS explorer and query facilities are genuinely cool) it appears that there is something to his claims—if not to his theory itself.
What makes this a rationality post?
We are consistently looking at the evidence and saying “we know there are only 13.8 billion years to work with… why do we see mature galaxies as they were 11 billion years ago, where are all these huge webs of galaxies and massive voids existing at the dawn of time coming from, etc.”
First question, what is the probabilty that space is actually expanding?
Assuming it is expanding, what is the probability it had to go back to a Big Bang? (Some models, have it contracting and expanding without a Big Bang)
And if there was a Big Bang, what is the probability that 2015 will be different than the last 90 years, where how long ago the Big Bang happened gets pushed back a billion years once or twice a decade?
(Side note: This 1/H = 13.8 billion years business is fishy. If 1/H yields the age of physical reality as we know it right now, what’ll happen 14 billion years from now? 1/H will give the same value for the “age of the universe” when it would be 28 billion years old. It’s true that H is not thought to be constant, but if the expansion is now accelerating, H is moving in a different direction; therefore, with every day, 1/H predicts a smaller age. It’s growing younger. Yet, over the decades our calculations went from 2 billion to 4 billion to 8 to 10 to 12 to 13.8; our observations are forcing older ages. 1/H is suspiciously close to c/H, 14.2 billion light years, also known as Hubble’s Length, or Hubble’s Radius, or Hubble’s Limit.)
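For reference, the arithmetic behind those numbers (a toy calculation with H = 70 km/s/Mpc; the published figures use slightly different values of H):

```python
H0 = 70.0               # Hubble parameter, km/s per Mpc (illustrative value)
KM_PER_MPC = 3.086e19   # kilometres in a megaparsec
SECONDS_PER_YEAR = 3.156e7

hubble_time_gyr = KM_PER_MPC / (H0 * SECONDS_PER_YEAR) / 1e9
print(hubble_time_gyr)  # ~14.0: 1/H in gigayears; c/H in billions of light-years
                        # is numerically the same, since light covers 1 Gly per Gyr
```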
It might be time for a rational cosmology. There isn’t a certainty about the expansion of space, dark energy, inflation. There is a small sampling of reality known as our Hubble Volume, of which there are many, not just one centered on Earth. Galaxies would presumably exist trillions of light years beyond our ability to detect them with electromagnetic radiation.
In our Hubble Volume, there is a cold spot in the CMB to the south. Imagine a Hubble Volume centered 2 * Hubble Radius to the south of us. Does an observer in that Hubble Volume see the same cold spot to their south?
In this cosmology, the CMB is warmer to the north because there are more galaxies in that direction. That’s considered an anomaly in the Big Bang theory because they should be equal, and equal to every Hubble Volume.
Well, there’s a weak gimme answer: scholarship (eleventh tenet of rationality). It’s science, even if not directly related to human cognition.
It’s also an interesting topic for rational discussion, in the sense of “here’s some evidence against a thing that almost everybody believes. What is your response to it?” Things I would expect to see (and of which there have been some) is comments about things like updating beliefs, offering evidence that the sources of this contrary evidence should not be taken as authoritative, attempting to assign priors to the accuracy of various theories or hypotheses and seeing what that says about our actual beliefs in this area (and how they change when new evidence is introduced), etc.
Besides, this is discussion. I wouldn’t say the post is suitable to be promoted on the main page, but neither are posts on many of the memes associated with LW (like peoples’ thoughts on friendly AI, or effective altruism). This post may turn out to be less correct than those topics (or may not), but this doesn’t make it any less suitable for posting here.
That’s a reasonable question. In general, it seems like there’s a broad notion here of what is relevant to “rationality”- and many users see issues in science of general interest as connected. There are also good reasons to see fundamental physics as potentially connected because it connects closely both to issues involving the Fermi problem as well as to issues involving anthropic reasoning. Examples of potential failure modes of the scientific community are also relevant (although our prior should almost universally be that no serious failure is likely to be going on).
The dominant Big Bang theory (the Lambda CDM model of inflationary cosmology) today has evolved significantly from the Big Bang theory of the 80s. For example, the realization in the late 90s that the expansion of the universe was accelerating rather than decelerating required significant changes.
And yes, we know that we do not yet have a perfect understanding of it yet, and that more tweaks and modifications are needed. But I think it is inaccurate to talk as if the big bang model needs to be thrown away. The situation is more like a jigsaw puzzle where we have a good idea in general what is going on, but there are still missing pieces.
Downvoted for circumventing minimum karma requirements. Don’t do it.
Is this not kosher? The minimum karma requirement seems like an anti-spam and anti-troll measure, with the unfortunate collateral damage of temporarily gating out some potential good content. The post seems clear to me as good content, and my suggestion to MazeHatter in the open thread that this deserved its own thread was upvoted.
If that doesn’t justify skirting the rule, I can remove the post.
The point of the rule is to limit the amount of bad content. This isn’t bad content, so working around the rule seems justified.
If a rule and the stated reason for the rule conflict… the rule sometimes wins, but only for practical reasons that don’t seem to apply here.
Rather than getting into an object-level discussion of whether this particular content is bad, let’s look at the general principles.
I suggest that “bad content” means something like “content whose only reason for being here is that a low-quality poster wants it here” (with “low-quality” being defined by karma, and yes I know that this is a ridiculously poor measure of actual poster quality). If someone with good karma thinks something is worth posting, this should be allowed regardless of whether they originally got it from someone else with bad karma.
(Bad karma should be interpreted as “the fact that this person wants to post something is very weak evidence that it belongs here”, not as “the fact that this person wants to post something is evidence that it doesn’t belong here”.)
So I don’t think there’s anything very improper about JStewart’s decision to post MazeHatter’s stuff here. But of course it means JStewart is taking responsibility for doing so, and if it gets downvoted into oblivion because everyone hates it then JStewart is the one who takes the karma hit.
“The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the light into the peace and safety of a new dark age.”
This is an anti-rationality quote from Lovecraft, and has little to do with anything under discussion. What makes you think that trying to understand the early universe is likely to cause insanity or anything similar?
I just wonder why the prevailing cosmological model in my lifetime has mysteriously come into question. Does the act of observing the universe somehow change its inferable past?
What does that wondering have to do with the quote in question?
Almost certainly not, but what does this have to do with anything under discussion?
That model isn’t that old. So if you were living, say, ~50 years ago, the prevailing model of the time would also have come into question (and been overturned).