The Zel’dovich Universe – Day 2 Summary

Day Two of this enjoyable meeting involved more talks about the cosmic web of large-scale structure of the Universe. I’m not going to attempt to summarize the whole day, but will just mention a couple of things that made me reflect a bit. Unfortunately that means I won’t be able to do more than merely mention some of the other fascinating things that came up, such as phase-space flip-flops and one-dimensional Origami.

One was a very nice review by John Peacock in which he showed that a version of Moore’s law applies to galaxy redshift surveys: since the first measurement of the redshift of an extragalactic object by Slipher in 1912, the number of redshifts has doubled every 2-3 years. This exponential growth has been driven by improvements in technology, from photographic plates to electronic detectors and from single-object spectroscopy to multiplex technology and so on. At this rate, by 2050 or so we should have redshifts for most galaxies in the observable Universe. Progress in cosmography has been remarkable indeed.
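
Just to illustrate the arithmetic behind that extrapolation, here is a minimal back-of-the-envelope sketch in Python; the starting redshift count, doubling time and total galaxy number below are illustrative round numbers rather than actual survey statistics:

    import math

    # Purely illustrative numbers, not actual survey statistics:
    # suppose ~2 million redshifts measured by 2014, doubling every ~2.5 years.
    n_2014 = 2e6
    doubling_time = 2.5   # years
    n_target = 1e11       # rough number of galaxies in the observable Universe

    doublings_needed = math.log2(n_target / n_2014)
    year_done = 2014 + doubling_time * doublings_needed
    print(f"{doublings_needed:.1f} doublings -> all galaxies by ~{year_done:.0f}")
    # ~15.6 doublings -> roughly the year 2053, consistent with "2050 or so".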

The term “Cosmic Web” may in fact be a bit of a misnomer, as a consensus seems to be emerging that in some sense it is more like a honeycomb. Thanks to a miracle of 3D printing, here is an example of what the large-scale structure of the Universe seems to look like:

[Photo: IMG-20140624-00350, a 3D-printed model of the large-scale structure]

One of the issues that emerged from the mix of theoretical and observational talks concerned the scale of cosmic homogeneity. Our standard cosmological model is based on the Cosmological Principle, which asserts that the Universe is, in a broad-brush sense, homogeneous (is the same in every place) and isotropic (looks the same in all directions). But the question that has troubled cosmologists for many years is what is meant by “large” scales: how broad does the broad brush have to be? A couple of presentations discussed the possibly worrying evidence for the presence of a local void, a large underdensity on a scale of about 200 Mpc which may influence our interpretation of cosmological results.

I blogged some time ago about the idea that the Universe might have structure on all scales, as would be the case if it were described in terms of a fractal set characterized by a fractal dimension D. In a fractal set, the mean number of neighbours of a given galaxy within a spherical volume of radius R is proportional to R^D. If galaxies are distributed uniformly (homogeneously) then D = 3, as the number of neighbours simply depends on the volume of the sphere, i.e. as R^3, and the average number-density of galaxies. A value of D < 3 indicates that the galaxies do not fill space in a homogeneous fashion: D = 1, for example, would indicate that galaxies were distributed in roughly linear structures (filaments), because the mass of material distributed along a filament enclosed within a sphere grows linearly with the radius of the sphere, i.e. as R^1, not as its volume; galaxies distributed in sheets would have D = 2, and so on.
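
As a concrete illustration of how D can be estimated in practice, here is a minimal sketch in Python that applies the neighbour-counting idea to a mock uniform catalogue, which should therefore recover D ≈ 3; the catalogue size and counting radii are arbitrary choices made purely for illustration:

    import numpy as np

    rng = np.random.default_rng(42)

    # Mock catalogue: 10,000 points distributed uniformly in a unit box,
    # so the recovered dimension should come out close to D = 3.
    points = rng.uniform(0.0, 1.0, size=(10000, 3))
    radii = np.array([0.05, 0.10, 0.15, 0.20])

    # Keep a subset of central points so every counting sphere fits inside the box.
    margin = radii.max()
    central = points[np.all((points > margin) & (points < 1.0 - margin), axis=1)][:200]

    # Squared distances from each central point to every point in the catalogue.
    d2 = ((central[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)

    # Mean number of neighbours within each radius R (excluding the point itself).
    mean_counts = [((d2 < R**2).sum(axis=1) - 1).mean() for R in radii]

    # The slope of log N(<R) against log R estimates the fractal dimension D.
    D = np.polyfit(np.log(radii), np.log(mean_counts), 1)[0]
    print(f"Estimated fractal dimension D = {D:.2f}")   # expect ~3 for a uniform set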

We know that D \simeq 1.2 on small scales (in cosmological terms, still several Megaparsecs), but the evidence for a turnover to D = 3 has not been so strong, at least not until recently. It’s not just that measuring D from a survey is actually rather tricky, but also that when we cosmologists adopt the Cosmological Principle we apply it not to the distribution of galaxies in space, but to space itself. We assume that space is homogeneous so that its geometry can be described by the Friedmann-Lemaître-Robertson-Walker metric.

According to Einstein’s theory of general relativity, clumps in the matter distribution would cause distortions in the metric which are roughly related to fluctuations in the Newtonian gravitational potential \delta\Phi by \delta\Phi/c^2 \sim \left(\lambda/ct \right)^{2} \left(\delta \rho/\rho\right), give or take a factor of a few, so that a large fluctuation in the density of matter wouldn’t necessarily cause a large fluctuation of the metric unless it were on a scale \lambda reasonably large relative to the cosmological horizon \sim ct. Galaxies correspond to a large \delta \rho/\rho \sim 10^6 but don’t violate the Cosmological Principle because they are too small in scale \lambda to perturb the background metric significantly.
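
To put rough numbers on that, here is a tiny order-of-magnitude sketch; the horizon size, galaxy scale and void underdensity used below are illustrative guesses rather than measured values:

    # Rough illustration of  delta_Phi/c^2 ~ (lambda/ct)^2 * (delta_rho/rho).
    # Every number here is an order-of-magnitude guess, for illustration only.
    horizon = 4000.0   # ~ c*t in Mpc, roughly the present-day Hubble radius

    def metric_fluctuation(scale_mpc, density_contrast):
        """Crude estimate of the dimensionless metric perturbation delta_Phi/c^2."""
        return (scale_mpc / horizon) ** 2 * density_contrast

    # A galaxy: enormous density contrast, but a tiny scale (~10 kpc = 0.01 Mpc).
    print(f"galaxy:       {metric_fluctuation(0.01, 1e6):.0e}")    # ~ 6e-06
    # A putative ~200 Mpc local void with a modest underdensity.
    print(f"200 Mpc void: {metric_fluctuation(200.0, 0.3):.0e}")   # ~ 8e-04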

The discussion of a fractal universe is one I’m overdue to return to. In my previous post I left the story as it stood about 15 years ago, and there have been numerous developments since then, not all of them consistent with each other. I will do a full “Part 2” to that post eventually, but in the meantime I’ll just comment that current large surveys, such as those derived from the Sloan Digital Sky Survey, do seem to be consistent with a Universe that possesses the property of large-scale homogeneity. If that conclusion survives the next generation of even larger galaxy redshift surveys then it will come as an immense relief to cosmologists.

The reason for that is that the equations of general relativity are very hard to solve in cases where there isn’t a lot of symmetry; there are just too many equations for a general solution to be obtained. If the Cosmological Principle applies, however, the equations simplify enormously (both in number and form) and we can get results we can work with on the back of an envelope. Small fluctuations about the smooth background solution can be handled (approximately but robustly) using a technique called perturbation theory. If the fluctuations are large, however, these methods don’t work. What we need to do instead is construct exact inhomogeneous models, and that is very, very hard. Why the Universe is so smooth on large scales is of course a different question, but for a working cosmologist the real importance of its being that way is that it makes our job so much easier than it would otherwise be.
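
As a small illustration of that simplification, here is a sketch in Python (using SciPy, with round illustrative parameters for a flat ΛCDM model rather than fitted values) showing how, once the FLRW metric is assumed, a quantity like the age of the Universe reduces to a single integral of the Friedmann equation:

    import numpy as np
    from scipy.integrate import quad

    # Illustrative round numbers for a flat Lambda-CDM background, not a data fit.
    H0 = 70.0                      # Hubble constant in km/s/Mpc
    Omega_m, Omega_L = 0.3, 0.7    # matter and cosmological-constant densities

    def H(a):
        """Hubble rate as a function of scale factor a (flat Lambda-CDM)."""
        return H0 * np.sqrt(Omega_m / a**3 + Omega_L)

    # Age of the Universe: t0 = integral from a=0 to a=1 of da / (a * H(a)).
    age, _ = quad(lambda a: 1.0 / (a * H(a)), 0.0, 1.0)

    # Convert from (km/s/Mpc)^-1 to Gyr: 1/(km/s/Mpc) is about 977.8 Gyr.
    print(f"Age of a flat LCDM Universe: {age * 977.8:.1f} Gyr")   # ~13.5 Gyr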

PS. If anyone reading this either at the conference or elsewhere has any questions or issues they would like me to raise during the summary talk on Saturday please don’t hesitate to leave a comment below or via Twitter using the hashtag #IAU308.

30 Responses to “The Zel’dovich Universe – Day 2 Summary”

  1. “A couple of presentations discussed the possibly worrying evidence for the presence of a local void, a large underdensity on a scale of about 200 Mpc which may influence our interpretation of cosmological results.”

    – I’d love details on that. Are they large enough to start to undermine the need for dark energy?

    • telescoper Says:

      One of the talks concerned has a PDF online:

      Click to access S8%20Collins.pdf

      This describes a large-scale underdensity in X-ray selected clusters…

    • “even if it were large enough to counteract the effects of dark energy* in some respect, one would still have to explain the other bits and pieces of the “concordance model”.” – indeed. But it is worth testing, because the possibility of *no* dark energy/cosmological constant makes a big difference to the usual picture. The key point here is that you need to do the structure-formation studies in an LTB (inhomogeneous) model; using FLRW-based perturbation theory may be very misleading. Nevertheless, Efstathiou’s old work (1995?) probably rules it out … unless direct observations show that there *is* such a void! – hence the interest in these data.

    • telescoper Says:

      On the other hand there doesn’t seem to be any evidence for such a big void from velocity fields. The talks of Brent Tully and Mike Hudson are worth looking at when they go up on the web page.

      • … but at least we have evidence for a supervoid of similar size in the nearby Universe, according to Szapudi et al. 2014. Moreover, we effectively explained the actual CMB pattern of the Cold Spot in Finelli et al. 2014 using an LTB void model. These new findings might indicate that there is some room for a local void of similar size.

      • telescoper Says:

        On the other hand we had a talk earlier today from Shaun Hotchkiss arguing that LCDM voids have a negligible effect on the CMB…

    • telescoper Says:

      Nor is it a confused BLT

    • The voids that could possibly undermine the need for dark energy would need to be very much larger (~1000 Mpc to fit the newer supernovae data), and would also need us to be close to the centre of them. The Local Void Brent Tully and others were talking about appeared to be something we were very much on the edge of.

      That’s not to say it couldn’t have an important effect on our interpretation of cosmological results, but nobody seemed to address that question very much.

    • But Mr. Helbig, what is fashionable today may be in the rubbish bin in a couple of decades. In fact, if it were not, science would have stagnated into pseudo-science unencumbered by definitive predictions and empirical evidence (which appears to be where the multiverse and anthropic reasoning drivel are heading).

      Look telescoper, sure we have some evidence for very approximate statistical uniformity over a relatively puny range of “large” scales. However, we have no evidence that nature’s hierarchy ends approximately where our observational limitations kick in.

      One could argue that a metagalactic scale of structure, which dwarfs everything we call “large scale” and gives the lie to cosmological “homogeneity”, is probably unobservable. However, once again the history of science teaches us the problem with this argument. Many things that were once thought to be unobservable (atoms, for instance) are now observable.

      Sigh. So many model-building mechanics tinkering with their dreary Ptolemaic models (what you see is all there is), and so few natural philosophers studying the fundamental principles and physical properties of nature.

      • telescoper Says:

        The Universe could well be inhomogeneous well beyond the scale of our horizon – most cosmologists are quite open to that idea – but as scientists we’re focussing on trying to explain what we can see, i.e. the observable Universe. We now have a picture that describes the structure of this patch, explains its origin as a dynamical process in the early Universe, and also relates its properties to other observations, primarily the cosmic microwave background. All this is explicitly based on the “fundamental principles and physical properties” of nature, at least as far as we understand them (i.e. general relativity and quantum field theory). Your ideas, on the other hand, dismiss relevant observations and have no basis in any fundamental principle at all. I find your comments highly irrational.

      • The fundamental principle is discrete (or less appropriately called “broken”) conformal symmetry; its manifestation is a discrete self-similar cosmos, i.e., a fractal cosmos with a geometry and group structure that is new and different from the conventional over-simplifications. Weyl made a first stab at this but could not make it work. I think that is because he focused on local continuous conformal geometry and did not fully consider global discrete conformal geometry.

        If you want to call that irrational, be my guest.

      • telescoper Says:

        Can you show me the papers that outline the mathematical details of this theory, how it relates to general relativity and quantum field theory, and what predictions it makes for the observed near-isotropy of the CMB?

      • The website: http://www3.amherst.edu/~rloldershaw is devoted to this topic. There are introductory papers, a list of scores of publications in refereed physics/astrophysics journals, many definitive predictions, and a wealth of empirical evidence that argues that this alternative discrete self-similar paradigm is worth considering.

        On the other hand, my analytical skills are limited and I have often had to publish in somewhat obscure journals because reviewers often demand that the new paradigm run a gauntlet that their paradigm does not have to run and could not survive.
        One must use one’s imagination to see what thousands of physicists working over several decades could achieve with the new discrete conformal (fractal) paradigm.

        The infinite self-similar paradigm has been around since the 5th century BC: Democritus, Kant, Fournier d’Albe, Mandelbrot, and many others. It has never gained a lot of attention, but that may change as empirical evidence mounts and definitive predictions are successfully passed.

        Regarding definitive predictions, which are the sine qua non of science, here are 15 bold predictions made by the new fractal paradigm, and the empirical evidence that supports them so far.

        http://www.academia.edu/2917630/Predictions_of_Discrete_Scale_Relativity

        I tire of the endless battle for open-mindedness in cosmology and hope others will take up the torch.

  2. Thanks for discussing these issues on fractal large scale structure.

    Mandelbrot and de Vaucouleurs had many important insights into this topic, but their cosmological ideas were resisted, sometimes passively and sometimes emphatically.

    Observers have published evidence for inhomogeneous structures on the 1,000 Mpc scale for decades, but their efforts have also been largely ignored or pecked to death by ducks.

    • telescoper Says:

      I’m confused by the phrase “pecked to death by ducks”, but I’ll take it to mean “refuted using better observations”. Only now do we have systematic galaxy surveys out to >1 Gpc distances, and we’ve refuted the guesswork of people who extrapolated wildly from data sets that were too small.

  3. If you go to: http://www3.amherst.edu/~rloldershaw and click on “Selected Papers” #8 entitled “The Legend of Cosmological Homogeneity”, you will find a brief essay on the homogeneity/inhomogeneity issue.

    In the last 10 years galactic clustering structures and voids with scales in the 500-1,000 Mpc range have been observed, and papers on their reality have been published. There are some very large inhomogeneities discovered by the Planck mission, such as an anisotropic “directionality” to the whole observable universe.

    The hypothetical “turn-over to homogeneity” has been in retreat for decades, but is clung to religiously because it is a fundamental tenet of the standard cosmological liturgy.

    Note also that inhomogeneous cosmos models can explain the dark energy acceleration without ad hoc and unnatural epicycles.

    • telescoper Says:

      Here’s a paper from 2000 using the IRAS PSCz survey showing conclusively that there’s a turnover towards homogeneity (correlation dimension ~ 3) after about 50 Mpc:

      http://adsabs.harvard.edu/abs/2000MNRAS.318L..51P

      A similar analysis was carried out for SDSS a few years later:

      http://adsabs.harvard.edu/abs/2005ApJ...624...54H

      and more recently the WiggleZ survey:

      http://adsabs.harvard.edu/abs/2012MNRAS.425..116S

      The most recent paper you referred to was from 1992, when this was indeed an interesting question. Twenty-two years have seen progress.

      • I am pleased to see that you candidly stated the real reason that cosmologists cling so tenaciously to the concept of cosmological homogeneity: it makes the analytical effort much, much simpler. True, but so much more unnatural to the natural philosopher who cares about concepts, principles and less Platonic symmetries.

        Of course, de Vaucouleurs pointed this out in 1970 in a refreshing paper entitled “The Case For A Hierarchical Cosmology” (published in Science).

        I could cite examples of physical evidence that conflicts with the supposed “turnover to homogeneity”, but what’s the point? People just seem to believe what they want to believe and choose the evidence that supports their preferences. If metagalaxies are ever discovered, maybe minds will be more open to questioning cosmological homogeneity.

        Mandelbrot carefully explained to all who would listen that if you choose a limited test volume, a restricted resolution and the appropriate statistical analyses that assume homogeneity from the start, then you can find statistical homogeneity at various scales of an intrinsically fractal cosmos.

        If the observable universe is a tiny part of one out of zillions of metagalaxies, do we say: “well now we surely will have a turnover to homogeneity”?

        Sigh.

      • telescoper Says:

        The Pan & Coles analysis does not assume homogeneity from the start; that’s precisely the point of the paper (and those that I cited following it).

        It may be that there was a case for a hierarchical universe in 1970. Indeed, the Universe is roughly hierarchical on small to intermediate scales. Now that we have much larger surveys, however, we know that this hierarchy is broken – there’s a real physical scale with a well-understood dynamical origin. This picture accounts not only for the distribution of galaxies, but also for the fluctuations in the cosmic microwave background, gravitational lensing and a host of other phenomena. That doesn’t make it the absolute truth, but it does mean that it’s a better model than alternatives that contradict these observations.

    • Anton Garrett Says:

      RLO, are you the RL Oldershaw to whom you refer? If so then I greatly admired the critique in your 1988 Am J Phys paper “The new physics – physical or mathematical science?”. But as I work in a different area of physics from astrophysics/cosmology I was not aware of your fractal alternative, and I am not presuming to comment on it. I would say that since 1988 lattice gauge theory, which I had thought was going nowhere, has shown that our theories of the strong force (including in relation to electroweak) are correct. Oversimplifying a bit, the equations have essentially been solved numerically on the more powerful computers that have since become available. I’d also add that *every* theory is going to have some parameters whose (fixed) values have to be determined from experimental data; this feature does not of itself invalidate a theory. Ockham’s Razor can be made quantitative via probability theory so as to compare theories having differing numbers of data-decidable parameters.

      • Anton Garrett Says:

        Steady on Phillip, it’s up to RLO to answer the question about RLO’s identity.

        As for Robert L Oldershaw’s website, I’ve looked at it following recent discussions on this blog (telescoper’s) and familiarised myself, to the extent that a non-cosmologist physicist can, with his fractal model.

        RLO has just posted this on Peter (ie telescoper)’s more recent thread titled “The Fractal Universe, part 2”: “A second opinion is that nature’s hierarchy appears to top out at about the point that our observational capabilities approach their limits.” I thought the point of Peter’s post was to show that that WASN’T the case, but I await Peter’s response to RLO on that thread.

      • Anton Garrett Says:

        “RLO is of course free to post using his full name. There is of course no doubt about his identity.”

        Well Phillip, further up this thread you wrote the response “Assuming that you are the gfrellis…” Consistency please!

      • telescoper Says:

        I require commenters to identify themselves to me, but that doesn’t mean they’re obliged to make themselves known publicly, although I prefer it if they do. They often do so inadvertently anyway…

      • Here are some relevant comments on the substandard model of particle physics and QCD specifically.

        1. The Standard Model is primarily a heuristic model with 26-30 fundamental parameters that have to be “put in by hand”.

        2. The Standard Model did not and cannot predict the masses of the fundamental particles that make up all of the luminous matter that we can observe. QCD still cannot retrodict the mass of the proton without considerable fudging, and even then it is only good to within 5%. As for retrodicting the mass of the electron, the SM cannot even make an attempt.

        3. The Standard Model did not and cannot predict the existence of the dark matter that constitutes the overwhelming majority of matter in the cosmos. The Standard Model describes heuristically the “foam on top of the ocean”.

        4. The vacuum energy density crisis clearly suggests a fundamental flaw at the very heart of particle physics. The VED crisis involves the fact that the vacuum energy densities predicted by particle physicists (microcosm) and measured by cosmologists (macrocosm) differ by up to 120 orders of magnitude (roughly 10^70 to 10^120, depending on how one ‘guess-timates’ the particle physics VED).

        5. The conventional Planck mass is highly unnatural, i.e., it bears no relation to any particle observed in nature, and calls into question the foundations of the quantum chromodynamics sector of the Standard Model.

        6. Many of the key particles of the Standard Model have never been directly observed. Rather, their existence is inferred from secondary, or more likely, tertiary decay products. Quantum chromodynamics is entirely built on inference, conjecture and speculation. It is too complex for simple definitive predictions and testing.

        7. The standard model of particle physics cannot include the most fundamental and well-tested interaction of the cosmos: gravitation, i.e., general relativity.

        Robert L. Oldershaw
        http://www3.amherst.edu/~rloldershaw
        Discrete Scale Relativity/Fractal Cosmology

  4. Shantanu Says:

    Peter, see my comments about dark matter (in the other thread, where I mistakenly put them instead of this one), which I hope can be raised in the summary talk.
    Thanks

  5. Alan Heavens Says:

    In other news, we looked at homogeneity in the star formation history density within the past light cone at constant time, and limited variations to <5.8% on scales of ~(350 Mpc)^3. http://uk.arxiv.org/abs/1209.6181

  6. The tests Peter and Alan have mentioned are really important – they turn a philosophical principle into tested science.

  7. […] issue of the scale of cosmic homogeneity; I’m going to repeat some of the things I said in a post earlier this week just to make sure that this discussion is reasonable […]

  8. […] The Zel’dovich Universe – Day 2 Summary […]
