Archive for the The Universe and Stuff Category

The Fractal Universe, Part 2

Posted in History, The Universe and Stuff with tags , , , , , , on June 27, 2014 by telescoper

Given the recent discussion in comments on this blog I thought I’d give a brief update on the issue of the scale of cosmic homogeneity; I’m going to repeat some of the things I said in a post earlier this week just to make sure that this discussion is reasonably self-contained.

Our standard cosmological model is based on the Cosmological Principle, which asserts that the Universe is, in a broad-brush sense, homogeneous (is the same in every place) and isotropic (looks the same in all directions). But the question that has troubled cosmologists for many years is what is meant by large scales? How broad does the broad brush have to be? A couple of presentations discussed the possibly worrying evidence for the presence of a local void, a large underdensity on a scale of about 200 Mpc which may influence our interpretation of cosmological results.

I blogged some time ago about the idea that the Universe might have structure on all scales, as would be the case if it were described in terms of a fractal set characterized by a fractal dimension D. In a fractal set, the mean number of neighbours of a given galaxy within a spherical volume of radius R is proportional to R^D. If galaxies are distributed uniformly (homogeneously) then D = 3, as the number of neighbours simply depends on the volume of the sphere, i.e. as R^3, and the average number-density of galaxies. A value of D < 3 indicates that the galaxies do not fill space in a homogeneous fashion: D = 1, for example, would indicate that galaxies were distributed in roughly linear structures (filaments); the mass of material distributed along a filament enclosed within a sphere grows linearly with the radius of the sphere, i.e. as R^1, not as its volume; galaxies distributed in sheets would have D=2, and so on.
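
To make the scaling concrete, here is a toy Python sketch (nothing to do with any real survey analysis; the point sets and radii are invented) that estimates D from the mean neighbour count at two radii, for points filling a cube (D close to 3) and points strung along a line (D close to 1):

```python
import numpy as np

rng = np.random.default_rng(42)

def correlation_dimension(points, r1, r2):
    """Estimate D from the scaling N(<R) ~ R^D of the mean neighbour count."""
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, np.inf)          # a galaxy is not its own neighbour
    n1 = (dist < r1).sum(axis=1).mean()     # mean neighbours within r1
    n2 = (dist < r2).sum(axis=1).mean()     # mean neighbours within r2
    return np.log(n2 / n1) / np.log(r2 / r1)

# Points filling a unit cube homogeneously: expect D close to 3
cube = rng.uniform(0, 1, size=(1200, 3))
# Points strung along a line (a cartoon "filament"): expect D close to 1
t = rng.uniform(0, 1, size=(1200, 1))
line = np.hstack([t, t, t])

print(correlation_dimension(cube, 0.05, 0.1))   # roughly 3 (a little less, from edge effects)
print(correlation_dimension(line, 0.05, 0.1))   # roughly 1
```

Real analyses are far more careful about edge effects and noise, but the basic idea, counting neighbours and reading off the slope, is just this.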

We know that D \simeq 1.2 on small scales (in cosmological terms, still several Megaparsecs), but the evidence for a turnover to D=3 has not been so strong, at least not until recently. It’s not just that measuring D from a survey is actually rather tricky, but also that when we cosmologists adopt the Cosmological Principle we apply it not to the distribution of galaxies in space, but to space itself. We assume that space is homogeneous so that its geometry can be described by the Friedmann-Lemaître-Robertson-Walker metric.

According to Einstein’s theory of general relativity, clumps in the matter distribution would cause distortions in the metric which are roughly related to fluctuations in the Newtonian gravitational potential \delta\Phi by \delta\Phi/c^2 \sim \left(\lambda/ct \right)^{2} \left(\delta \rho/\rho\right), give or take a factor of a few, so that a large fluctuation in the density of matter wouldn’t necessarily cause a large fluctuation of the metric unless it were on a scale \lambda reasonably large relative to the cosmological horizon \sim ct. Galaxies correspond to a large \delta \rho/\rho \sim 10^6 but don’t violate the Cosmological Principle because they are too small in scale \lambda to perturb the background metric significantly.
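
Plugging in some illustrative round numbers makes the point; the scales and contrasts below are rough stand-ins, not careful values:

```python
# Order-of-magnitude check of delta_Phi/c^2 ~ (lambda/ct)^2 * (delta rho/rho).
# All numbers are illustrative round figures.

horizon_mpc = 4000.0   # ~ c/H0 in Mpc, a stand-in for the cosmological horizon ct

def metric_fluctuation(scale_mpc, density_contrast):
    """Rough size of the metric perturbation delta_Phi / c^2."""
    return (scale_mpc / horizon_mpc) ** 2 * density_contrast

# A galaxy: enormous density contrast, but on a tiny scale
print(metric_fluctuation(0.03, 1e6))   # ~ 6e-5
# A 200 Mpc void with a modest density contrast
print(metric_fluctuation(200.0, 0.3))  # ~ 8e-4
```

Both numbers come out far below unity, which is why even very nonlinear structures leave the background metric essentially undisturbed.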

In my previous post I left the story as it stood about 15 years ago, and there have been numerous developments since then, some convincing (to me) and some not. Here I’ll just give a couple of key results, which I think to be important because they address a specific quantifiable question rather than relying on qualitative and subjective interpretations.

The first, which is from a paper I wrote with my (then) PhD student Jun Pan, provided what I think is the first convincing demonstration that the correlation dimension of galaxies in the IRAS PSCz survey does turn over to the homogeneous value D=3 on large scales:

[Image: correlations]

You can see quite clearly that there is a gradual transition to homogeneity beyond about 10 Mpc, and this transition is certainly complete before 100 Mpc. The PSCz survey comprises “only” about 11,000 galaxies, and it is relatively shallow too (with a depth of about 150 Mpc), but it has an enormous advantage in that it covers virtually the whole sky, which means that the survey geometry does not have a significant effect on the results. Another important feature of our analysis is that it does not assume homogeneity at the start. In a traditional correlation function analysis the number of pairs of galaxies with a given separation is compared with a random distribution with the same mean number of galaxies per unit volume. The mean density, however, has to be estimated from the same survey as the correlation function is being calculated from, and if there is large-scale clustering beyond the size of the survey this estimate will not be a fair estimate of the global value. Such analyses therefore assume what they set out to prove. Ours does not beg the question in this way.
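
A toy illustration of this circularity (all numbers invented): if the survey sits inside a fluctuation larger than the survey itself, a mean density estimated from the survey absorbs that fluctuation entirely:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented numbers: the global mean density is 100 objects per unit length,
# but our unit-length "survey" happens to sit inside a long-wavelength
# overdensity that boosts the local density by 30 per cent.
n_global = 100.0
local_boost = 1.3

n_survey = rng.poisson(n_global * local_boost)   # galaxies counted in the survey

n_hat = n_survey / 1.0   # mean density estimated from the survey itself

# Overdensity of the surveyed region relative to each estimate of the mean:
print(n_survey / n_hat - 1)      # exactly 0: the fluctuation is invisible
print(n_survey / n_global - 1)   # close to 0.3: the true large-scale fluctuation
```

Against its own mean the survey looks perfectly homogeneous, whatever the larger-scale truth is; that is the sense in which the traditional analysis begs the question.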

The PSCz survey is relatively sparse but more recently much bigger surveys involving optically selected galaxies have confirmed this idea with great precision. A particularly important recent result came from the WiggleZ survey (in a paper by Scrimgeour et al. 2012). This survey is big enough to look at the correlation dimension not just locally (as we did with PSCz) but as a function of redshift, so we can see how it evolves. In fact the survey contains about 200,000 galaxies in a volume of about a cubic Gigaparsec. Here are the crucial graphs:

[Image: homogeneity]

I think this proves beyond any reasonable doubt that there is a transition to homogeneity at about 80 Mpc, well within the survey volume. My conclusion from this and other studies is that the structure is roughly self-similar on small scales, but this scaling gradually dissolves into homogeneity. In a Fractal Universe the correlation dimension would not depend on scale, so what I’m saying is that we do not live in a fractal Universe. End of story.

The Zel’dovich Universe – Day 4 Summary

Posted in History, The Universe and Stuff with tags , , , , , , , on June 27, 2014 by telescoper

And on the fourth day of this meeting about “The Zel’dovich Universe” we were back to a full schedule (9am until 7.30pm) concentrating on further studies of the Cosmic Web. We started off with a discussion of the properties of large-scale structure at high redshift. As someone who’s old enough to remember the days when “high redshift” meant about z~0.1, the idea that we can now map the galaxy distribution at redshifts z~2 is quite remarkable. There are other measures of structure on these huge scales, such as the Lyman alpha forest, and we heard a bit about some of them too.

The second session was about “reconstructing” the Cosmic Web, although a more correct word would have been “deconstructing”. The point about this session is that cosmology is basically a backwards subject. In other branches of experimental science we set the initial conditions for a system and then examine how it evolves. In cosmology we have to infer the initial conditions of the Universe from what we observe around us now. In other words, cosmology is an inverse problem on a grand scale. In the context of the cosmic web, we want to infer the pattern of initial density and velocity fluctuations that gave rise to the present network of clusters, filaments and voids. Several talks in this session emphasized how proper Bayesian methods have led to enormous progress in this field over the last few years.
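
As a toy illustration of the simplest such Bayesian step (a linear Gaussian model, for which the posterior mean is the classic Wiener filter; the actual reconstruction codes discussed at the meeting are vastly more elaborate):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy linear inverse problem: data d = s + n per Fourier mode, with a Gaussian
# prior of variance S on the "initial conditions" s and Gaussian noise of
# variance N. The posterior mean is the Wiener filter S/(S+N), mode by mode.
n_modes = 512
S = 1.0 / (1.0 + np.arange(n_modes))    # a falling "power spectrum" prior
N = 0.1                                  # noise variance per mode

s = rng.normal(0.0, np.sqrt(S))          # a realization of the signal
d = s + rng.normal(0.0, np.sqrt(N), n_modes)

s_wiener = (S / (S + N)) * d             # posterior-mean reconstruction

# The filtered estimate lands closer to the true signal than the raw data
print(np.mean((s_wiener - s) ** 2) < np.mean((d - s) ** 2))   # True
```

The filter simply downweights modes where the noise dominates the prior; the Bayesian machinery used for cosmic-web reconstruction builds nonlinear dynamics and proper sampling on top of this basic idea.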

All this progress has been accompanied by huge improvements in graphical visualisation techniques. Thirty years ago the state of the art in this field was represented by simple contour plots, such as this (usually called the Cosmic Chicken):

[Image: chicken]

You can see how crude this representation is by comparing it with a similar plot from the modern era of precision cosmology:

[Image: chicken]

Even better examples are provided by the following snapshot:

[Image: IMG-20140626-00352]

It’s nice to see a better, though still imperfect,  version of the chicken at the top right, though I found the graphic at the bottom right rather implausible; it must be difficult to skate at all with those things around your legs.

Here’s another picture I liked, despite the lack of chickens:

[Image: IMG-20140626-00353]

Incidentally, it’s the back of Alar Toomre‘s head you can see on the far right in this picture.

The afternoon was largely devoted to discussions of how the properties of individual galaxies are influenced by their local environment within the Cosmic Web. I usually think of galaxies as test particles (i.e. point masses) but they are interesting in their own right (to some people anyway). However, the World Cup intervened during the evening session and I skipped a couple of talks to watch Germany beat the USA in their final group match.

That’s all for now. Tonight we’ll have the conference dinner, which is apparently being held in the “House of Blackheads” on “Pikk Street”. Sounds like an interesting spot!

The Zel’dovich Universe – Day 3 Summary

Posted in History, The Universe and Stuff with tags , , , , , , on June 26, 2014 by telescoper

Day Three of this meeting about “The Zel’dovich Universe” was slightly shorter than the previous two, in that it finished just after 17.00 rather than the usual 19.00 or later. That meant that we got out in time to settle down for a beer in time for the World Cup football. I watched an excellent game between Nigeria and Argentina, which ended 3-2 to Argentina but could have been 7-7. I’ll use that as an excuse for writing a slightly shorter summary.

Anyway, we began with a session on the Primordial Universe and Primordial Signatures led off by Alexei Starobinsky (although there is some controversy over whether his name should end -y or -i). Starobinsky outlined the theory of cosmological perturbations from inflation with an emphasis on how it relates to some of Zel’dovich’s ideas on the subject. There was then a talk from Bruce Partridge about some of the results from Planck. I’ve mentioned already that this isn’t a typical cosmology conference, and this talk provided another unusual aspect in that there’s hardly been any discussion of the BICEP2 results here. When asked about it at the end of his talk, Bruce replied (very sensibly) that we should all just be patient.

Next session after coffee was about cosmic voids, kicked off by Rien van de Weygaert with a talk entitled “Much Ado About Nothing”, which reminded me of the following quote from the play of the same name:

“He hath indeed better bettered expectation than you must expect of me to tell you how”

The existence of voids in the galaxy distribution is not unexpected given the presence of clusters and superclusters, but they are interesting in their own right as they display particular dynamical evolution and have important consequences for observations. In 1984, Vincent Icke proved the so-called “Bubble Theorem”, which showed that an isolated underdensity tends to evolve to a spherical shape. Most cosmologists, including myself, therefore expected big voids to be round, which turns out to be wrong; the interaction of the perimeter of the void with its surroundings always plays an important role in determining the geometry. Another thing that sprang to mind was a classic paper by Simon White (1979) with the abstract:

We derive and display relations which can be used to express many quantitative measures of clustering in terms of the hierarchy of correlation functions. The convergence rate and asymptotic behaviour of the integral series which usually result is explored as far as possible using the observed low-order galaxy correlation functions. On scales less than the expected nearest neighbour distance most clustering measures are influenced only by the lowest order correlation functions. On all larger scales their behaviour, in general, depends significantly on correlations of high order and cannot be approximated using the low-order functions. Bhavsar’s observed relation between density enhancement and the fraction of galaxies included in clusters is modelled and is shown to be only weakly dependent on high-order correlations over most of its range. The probability that a randomly placed region of given volume be empty is discussed as a particularly simple and appealing example of a statistic which is strongly influenced by correlations of all orders, and it is shown that this probability may obey a scaling law which will allow a test of the small-scale form of high-order correlations.

The emphasis is mine. It’s fascinating and somewhat paradoxical that we can learn a lot about the statistics of where the galaxies are from the regions where galaxies are not.

Another thing worth mentioning was Paul Sutter’s discussion of a project on cosmic voids which is a fine example of open science. Check out the CosmicVoids website where you will find void catalogues, identification algorithms and a host of other stuff all freely available to anyone who wants to use them. This is the way forward.

After lunch we had a session on Cosmic Flows, with a variety of talks about using galaxy peculiar velocities to understand the dynamics of large-scale structure. This field was booming about twenty years ago but has to some extent been overtaken by other cosmological probes that offer greater precision; the biggest difficulty has been getting a sufficient number of sufficiently accurate direct (redshift-independent) distance measurements to do good statistics. It remains a difficult but important field, because we need to test our models with as many independent methods as possible.

I’ll end with a word about the first speaker of this session, the Gruber prize winner Marc Davis. He suffered a stroke a few years ago which has left him partly paralysed (down his right side). He has battled back from this with great courage, and even turned it to his advantage during his talk when he complained about how faint the laser pointer was and used his walking stick instead.

[Image: IMG-20140625-00351]

The Zel’dovich Universe – Day 2 Summary

Posted in History, The Universe and Stuff with tags , , , on June 25, 2014 by telescoper

[Image: IMG-20140624-00349]

Day Two of this enjoyable meeting involved more talks about the cosmic web of large-scale structure of the Universe. I’m not going to attempt to summarize the whole day, but will just mention a couple of things that made me reflect a bit. Unfortunately that means I won’t be able to do more than merely mention some of the other fascinating things that came up, such as phase-space flip-flops and one-dimensional Origami.

One was a very nice review by John Peacock in which he showed that a version of Moore’s law applies to galaxy redshift surveys; since the first measurement of the redshift of an extragalactic object by Slipher in 1912, the number of redshifts has doubled every 2-3 years. This exponential growth has been driven by improvements in technology, from photographic plates to electronic detectors and from single-object spectroscopy to multiplex technology and so on. At this rate by 2050 or so we should have redshifts for most galaxies in the observable Universe. Progress in cosmography has been remarkable indeed.
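
As a rough sanity check of that extrapolation; the 2014 count, doubling time and galaxy total below are my own ballpark figures, not Peacock’s exact numbers:

```python
import math

# Ballpark figures: a few million redshifts known by 2014,
# with the count doubling every ~2.5 years
n_2014 = 2.5e6
doubling_time = 2.5          # years
n_galaxies = 2e11            # rough count of galaxies in the observable Universe

doublings_needed = math.log2(n_galaxies / n_2014)
year_complete = 2014 + doubling_time * doublings_needed
print(round(year_complete))  # ≈ 2055, consistent with "2050 or so"
```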

The term “Cosmic Web” may be a bit of a misnomer in fact, as a consensus may be emerging that in some sense it is more like a honeycomb. Thanks to a miracle of 3D printing, here is an example of what the large-scale structure of the Universe seems to look like:

[Image: IMG-20140624-00350]

One of the issues that emerged from the mix of theoretical and observational talks concerned the scale of cosmic homogeneity. Our standard cosmological model is based on the Cosmological Principle, which asserts that the Universe is, in a broad-brush sense, homogeneous (is the same in every place) and isotropic (looks the same in all directions). But the question that has troubled cosmologists for many years is what is meant by large scales? How broad does the broad brush have to be? A couple of presentations discussed the possibly worrying evidence for the presence of a local void, a large underdensity on a scale of about 200 Mpc which may influence our interpretation of cosmological results.

I blogged some time ago about the idea that the Universe might have structure on all scales, as would be the case if it were described in terms of a fractal set characterized by a fractal dimension D. In a fractal set, the mean number of neighbours of a given galaxy within a spherical volume of radius R is proportional to R^D. If galaxies are distributed uniformly (homogeneously) then D = 3, as the number of neighbours simply depends on the volume of the sphere, i.e. as R^3, and the average number-density of galaxies. A value of D < 3 indicates that the galaxies do not fill space in a homogeneous fashion: D = 1, for example, would indicate that galaxies were distributed in roughly linear structures (filaments); the mass of material distributed along a filament enclosed within a sphere grows linearly with the radius of the sphere, i.e. as R^1, not as its volume; galaxies distributed in sheets would have D=2, and so on.

We know that D \simeq 1.2 on small scales (in cosmological terms, still several Megaparsecs), but the evidence for a turnover to D=3 has not been so strong, at least not until recently. It’s not just that measuring D from a survey is actually rather tricky, but also that when we cosmologists adopt the Cosmological Principle we apply it not to the distribution of galaxies in space, but to space itself. We assume that space is homogeneous so that its geometry can be described by the Friedmann-Lemaître-Robertson-Walker metric.

According to Einstein’s theory of general relativity, clumps in the matter distribution would cause distortions in the metric which are roughly related to fluctuations in the Newtonian gravitational potential \delta\Phi by \delta\Phi/c^2 \sim \left(\lambda/ct \right)^{2} \left(\delta \rho/\rho\right), give or take a factor of a few, so that a large fluctuation in the density of matter wouldn’t necessarily cause a large fluctuation of the metric unless it were on a scale \lambda reasonably large relative to the cosmological horizon \sim ct. Galaxies correspond to a large \delta \rho/\rho \sim 10^6 but don’t violate the Cosmological Principle because they are too small in scale \lambda to perturb the background metric significantly.

The discussion of a fractal universe is one I’m overdue to return to. In my previous post I left the story as it stood about 15 years ago, and there have been numerous developments since then, not all of them consistent with each other. I will do a full “Part 2” to that post eventually, but in the mean time I’ll just comment that current large surveys, such as those derived from the Sloan Digital Sky Survey, do seem to be consistent with a Universe that possesses the property of large-scale homogeneity. If that conclusion survives the next generation of even larger galaxy redshift surveys then it will come as an immense relief to cosmologists.

The reason for that is that the equations of general relativity are very hard to solve in cases where there isn’t a lot of symmetry; there are just too many equations to solve for a general solution to be obtained. If the cosmological principle applies, however, the equations simplify enormously (both in number and form) and we can get results we can work with on the back of an envelope. Small fluctuations about the smooth background solution can be handled (approximately but robustly) using a technique called perturbation theory. If the fluctuations are large, however, these methods don’t work. What we need to do instead is construct exact inhomogeneous models, and that is very, very hard. It’s of course a different question as to why the Universe is so smooth on large scales, but as a working cosmologist the real importance of it being that way is that it makes our job so much easier than it would otherwise be.

PS. If anyone reading this either at the conference or elsewhere has any questions or issues they would like me to raise during the summary talk on Saturday please don’t hesitate to leave a comment below or via Twitter using the hashtag #IAU308.

The Power Spectrum and the Cosmic Web

Posted in Bad Statistics, The Universe and Stuff with tags , , , , , , on June 24, 2014 by telescoper

One of the things that makes this conference different from most cosmology meetings is that it is focussing on the large-scale structure of the Universe as a topic in itself, rather than as a source of statistical information about, e.g., cosmological parameters. This means that we’ve been hearing about a set of statistical methods that is somewhat different from those usually used in the field (which are primarily based on second-order quantities).

One of the challenges cosmologists face is how to quantify the patterns we see in galaxy redshift surveys. In the relatively recent past the small size of the available data sets meant that only relatively crude descriptors could be used; anything sophisticated would be rendered useless by noise. For that reason, statistical analysis of galaxy clustering tended to be limited to the measurement of autocorrelation functions, usually constructed in Fourier space in the form of power spectra; you can find a nice review here.

Because it is so robust and contains a great deal of important information, the power spectrum has become ubiquitous in cosmology. But I think it’s important to realise its limitations.

Take a look at these two N-body computer simulations of large-scale structure:

The one on the left is a proper simulation of the “cosmic web” which is at least qualitatively realistic, in that it contains filaments, clusters and voids pretty much like what is observed in galaxy surveys.

To make the picture on the right I first  took the Fourier transform of the original  simulation. This approach follows the best advice I ever got from my thesis supervisor: “if you can’t think of anything else to do, try Fourier-transforming everything.”

Anyway each Fourier mode is complex and can therefore be characterized by an amplitude and a phase (the modulus and argument of the complex quantity). What I did next was to randomly reshuffle all the phases while leaving the amplitudes alone. I then performed the inverse Fourier transform to construct the image shown on the right.

What this procedure does is to produce a new image which has exactly the same power spectrum as the first. You might be surprised by how little the pattern on the right resembles that on the left, given that they share this property; the distribution on the right is much fuzzier. In fact, the sharply delineated features  are produced by mode-mode correlations and are therefore not well described by the power spectrum, which involves only the amplitude of each separate mode. In effect, the power spectrum is insensitive to the part of the Fourier description of the pattern that is responsible for delineating the cosmic web.
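
The procedure is easy to reproduce on a toy field; here is a minimal sketch using NumPy FFTs (the “filament” and “cluster” are of course just cartoon stand-ins for a simulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "density field" with sharp features, standing in for the cosmic web
field = np.zeros((64, 64))
field[20:24, :] += 5.0          # a "filament"
field[40:44, 40:44] += 10.0     # a "cluster"
field += rng.normal(0, 0.1, field.shape)

f = np.fft.fft2(field)
amplitude = np.abs(f)

# Random phases, taken from the FFT of a real random field so that the
# phase-shuffled field comes out real as well
random_phase = np.angle(np.fft.fft2(rng.normal(size=field.shape)))

shuffled = np.fft.ifft2(amplitude * np.exp(1j * random_phase)).real

# Identical power spectra, yet very different-looking fields
same_power = np.allclose(np.abs(np.fft.fft2(shuffled)), amplitude)
print(same_power)   # True
```

Plotting `field` and `shuffled` side by side gives exactly the sharp-versus-fuzzy contrast described above, even though the two arrays are indistinguishable to the power spectrum.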

If you’re confused by this, consider the Fourier transforms of (a) white noise and (b) a Dirac delta-function. Both produce flat power-spectra, but they look very different in real space because in (b) all the Fourier modes are correlated in such a way that they are in phase at the one location where the pattern is not zero; everywhere else they interfere destructively. In (a) the phases are distributed randomly.
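
Here is that comparison in code (one dimension, NumPy FFT; the grid size and spike position are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 256

# (a) white noise: random amplitudes AND random phases
white = rng.normal(size=n)

# (b) a delta function: every mode in phase at one point
delta = np.zeros(n)
delta[100] = 1.0

P_white = np.abs(np.fft.fft(white)) ** 2
P_delta = np.abs(np.fft.fft(delta)) ** 2

# The delta function's power spectrum is exactly flat...
print(np.allclose(P_delta, 1.0))        # True
# ...while the white-noise spectrum is flat only on average
print(P_white.std() / P_white.mean())   # order-unity scatter about a flat mean
```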

The moral of this is that there is much more to the pattern of galaxy clustering than meets the power spectrum…

The Zel’dovich Universe – Day 1 Summary

Posted in Biographical, History, The Universe and Stuff with tags , on June 24, 2014 by telescoper

I’m up possibly bright but definitely early to get ready for day two of IAU Symposium No. 308 The Zel’dovich Universe. The weather was a bit iffy yesterday, with showers throughout the day, but that didn’t matter much in practice as I was indoors most of the day attending the talks. I have to deliver the conference summary on Saturday afternoon so I feel I should make an effort to attend as much as I can in order to help me pretend that I didn’t write my concluding talk in advance of the conference.

Day One began with some reflections on the work and personality of the great Zel’dovich by two of his former students, Sergei Shandarin and Varun Sahni, both of whom I’ve worked with in the past.
[Image: zeldovich]

Zel’dovich (left) was born on March 8th 1914. To us cosmologists Zel’dovich is best known for his work on the large-scale structure of the Universe, but he only started to work on that subject relatively late in his career during the 1960s. He in fact began his life in research as a physical chemist and arguably his greatest contribution to science was that he developed the first completely physically based theory of flame propagation (together with Frank-Kamenetskii). No doubt he also used insights gained from this work, together with his studies of detonation and shock waves, in the Soviet nuclear bomb programme in which he was a central figure, and which no doubt led to the chestful of medals he’s wearing in the photograph. In fact he was awarded the title of Hero of Socialist Labour no less than three times.

My own connection with Zel’dovich is primarily through his scientific descendants, principally his former student Sergei Shandarin, who has a faculty position at the University of Kansas, but his work has had a very strong influence on my scientific career. For example, I visited Kansas back in 1992 and worked on a project with Sergei and Adrian Melott which led to a paper published in 1993, the abstract of which makes clear the debt it owed to the work of Zel’dovich.

The accuracy of various analytic approximations for following the evolution of cosmological density fluctuations into the nonlinear regime is investigated. The Zel’dovich approximation is found to be consistently the best approximation scheme. It is extremely accurate for power spectra characterized by n = -1 or less; when the approximation is ‘enhanced’ by truncating highly nonlinear Fourier modes the approximation is excellent even for n = +1. The performance of linear theory is less spectrum-dependent, but this approximation is less accurate than the Zel’dovich one for all cases because of the failure to treat dynamics. The lognormal approximation generally provides a very poor fit to the spatial pattern.

The Zel’dovich Approximation referred to in this abstract is based on an extremely simple idea which, as we showed in the above paper, turns out to be extremely accurate at reproducing the morphology of the “cosmic web” of large-scale structure.
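
For anyone curious to see the approximation in action, here is a one-dimensional toy version (a single sine-wave displacement, entirely my own invented example, not anything from the paper):

```python
import numpy as np

# One-dimensional Zel'dovich approximation: particles follow straight-line
# trajectories x(q, t) = q + D(t) * psi(q), where q is the initial (Lagrangian)
# position, psi the displacement field and D(t) the linear growth factor.
# A single sine-wave displacement is enough to form a "pancake" (caustic).

q = np.linspace(0.0, 2.0 * np.pi, 2000, endpoint=False)
psi = -np.sin(q)

def positions(D):
    return q + D * psi

# Shell crossing (formally infinite density) first occurs where
# 1 + D * dpsi/dq = 0; here dpsi/dq = -cos(q), so the first caustic
# forms at q = 0 when the growth factor reaches D = 1.
for D in (0.5, 1.0):
    spacing = np.diff(np.sort(positions(D)))
    print(D, spacing.min())   # the minimum particle spacing collapses towards zero at D = 1
```

The density is proportional to 1/|dx/dq|, so the vanishing particle spacing at D = 1 is exactly the formation of a Zel’dovich pancake.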

Zel’dovich passed away in 1987. I was a graduate student at that time and had never had the opportunity to meet him. If I had done so I’m sure I would have found him fascinating and intimidating in equal measure, as I admired his work enormously as did everyone I knew in the field of cosmology. Anyway, a couple of years after his death a review paper written by himself and Sergei Shandarin was published, along with the note:

The Russian version of this review was finished in the summer of 1987. By the tragic death of Ya. B.Zeldovich on December 2, 1987, about four-fifths of the paper had been translated into English. Professor Zeldovich would have been 75 years old on March 8, 1989 and was vivid and creative until his last day. The theory of the structure of the universe was one of his favorite subjects, to which he made many note-worthy contributions over the last 20 years.

As one does if one is vain I looked down the reference list to see if any of my papers were cited. I’d only published one paper before Zel’dovich died so my hopes weren’t high. As it happens, though, my very first paper (Coles 1986) was there in the list. That’s still the proudest moment of my life!

[Image: reference]

We then went into a Dick Bond Special, with a talk entitled: From Superweb Simplicity to Complex Intermittency in the Cosmic Web. The following pic will give you a flavour:

[Image: IMG-20140623-00347]

It’s all very straightforward, really. Um…

The rest of the day consisted of a number of talks about the Cosmic Web of large-scale structure using techniques inspired by the work of Zel’dovich, particularly the Zel’dovich approximation which I’ve mentioned already. There were many fascinating talks but I have to single out Johan Hidding of Groningen for the best use of graphics. Here’s a video of his from Youtube as an example:

Well, I must get going for the start of Day Two. The first session starts at 9am (7am UK time) and the day ends at 19.30. Conferences like this are hard work!

PS. If anyone reading this either at the conference or elsewhere has any questions or issues they would like me to raise during the summary talk on Saturday please don’t hesitate to leave a comment below or via Twitter using the hashtag #IAU308.

 

Welcome Reception

Posted in Biographical, The Universe and Stuff with tags , on June 22, 2014 by telescoper

So I made it to Tallinn, where it is fairly cold and rainy, for the IAU Symposium No 308 on the Zel’dovich Universe. Here is the description of the conference from the website:

It will be 100 years since the birth of Yakov Zeldovich, whose seminal work paved the way towards a theoretical understanding of the complex weblike patterns that have been observed in our Universe.

Impressive progress of observational studies, of modelling and simulations and of analytical work has led to revolutionary new insights into the structure and emergence of the Cosmic Web. With the coming years marked by major observational developments – in terms of large new telescopes, instruments and corresponding versatile surveys – and with the continuing growth of computational resources, the window will be opened towards understanding the dynamics and observing the evolution of cosmic structure.

The symposium will focus on the subject of the structure, constituents, properties, dynamics and analysis of the cosmic web in the large-scale cosmic matter and galaxy distribution. The symposium will synthesize the insights obtained from many different observational and theoretical studies and set out the lines for the major upcoming scientific programs that will not only extend our view over a far larger fraction of the visible Universe but also allow the systematic investigation of the evolution of cosmic structure.

I’m looking forward to the meeting, which starts properly tomorrow morning, but it was nice to have a reception event this evening to welcome those of us who made it to Estonia in time. There was plenty of wine on offer, and I had the chance to meet up with quite a few people I haven’t seen for ages:

[Image: IMG-20140622-00345]

First impressions of Estonia are that the word for “Taxi” is “Tacso” and the word for “Big Bang” is also slightly different:

[Image: IMG-20140622-00346]

Other than that the natives seem friendly and my hotel, though inexpensive, is positively luxurious. The crucial challenge, however, is the quality of the breakfast, which will have to wait until tomorrow morning!

Knit your own Neutralino

Posted in The Universe and Stuff with tags , , , , on June 21, 2014 by telescoper

I thought I’d give you a sneak preview of something soon to feature at the forthcoming Royal Society Summer Science Exhibition. With input from particle physicists from the Department of Physics & Astronomy at the University of Sussex, the inestimable Dorothy Lamb has designed a “Knit your own Neutralino” pack, which contains a knitting pattern and embellishments (wool not included), that can be used to construct a plushie representing the lightest neutralino, χ01, a candidate for the dark matter that pervades the Universe.

[Images: IMG_2540, IMG_2541]

Here are some examples, as produced by Dorothy herself:

[Image: IMG_2536]

Here are some more elaborate variations, representing (I think) different types of chargino.

[Photo: knitted charginos]

Whatever they are, they’re a lot of fun and in my opinion more than a little bit camp!

I think we should introduce knitting as part of the “transferable skills” element of our physics courses. If we did, Dorothy would definitely graduate with first class honours!

Published BICEP2 paper admits “Unquantifiable Uncertainty”…

Posted in Bad Statistics, The Universe and Stuff with tags , , , , , , on June 20, 2014 by telescoper

Just a quick post to pass on the news that the BICEP2 results that excited so much press coverage earlier this year have now been published in Physical Review Letters. A free PDF version of the paper can be found here. The published version incorporates a couple of important caveats that have arisen since the original release of the results prior to peer review. In particular, in the abstract (discussing models of the dust foreground emission):

However, these models are not sufficiently constrained by external public data to exclude the possibility of dust emission bright enough to explain the entire excess signal. Cross correlating BICEP2 against 100 GHz maps from the BICEP1 experiment, the excess signal is confirmed with 3σ significance and its spectral index is found to be consistent with that of the CMB, disfavoring dust at 1.7 σ.

Since the primary question-mark over the original result was whether the signal was due to dust or CMB, this corresponds to an admission that the detection is really at very low significance. I’ll set aside my objection to the frequentist language used in this statement!
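To make the frequentist language a little more concrete (this is my illustration, not anything from the paper): a significance quoted in σ corresponds to a Gaussian tail probability, and converting the two figures above shows just how different “3σ” and “1.7σ” are. A minimal sketch using only the standard library:

```python
from math import erfc, sqrt

def two_sided_p(sigma):
    """Two-sided Gaussian tail probability for a significance of `sigma` standard deviations."""
    return erfc(sigma / sqrt(2))

# The excess signal is confirmed at 3 sigma...
print(f"3.0 sigma -> p = {two_sided_p(3.0):.4f}")  # ~0.0027
# ...but dust is disfavoured at only 1.7 sigma:
print(f"1.7 sigma -> p = {two_sided_p(1.7):.4f}")  # ~0.089
```

In other words, under the dust hypothesis one would see a spectral index this consistent with the CMB roughly 9% of the time by chance alone, which is why a 1.7σ result hardly counts as a detection.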

There is an interesting comment in the footnotes too:

In the preprint version of this paper an additional DDM2 model was included based on information taken from Planck conference talks. We noted the large uncertainties on this and the other dust models presented. In the Planck dust polarization paper [96] which has since appeared the maps have been masked to include only regions “where the systematic uncertainties are small, and where the dust signal dominates total emission.” This mask excludes our field. We have concluded the information used for the DDM2 model has unquantifiable uncertainty. We look forward to performing a cross-correlation analysis against the Planck 353 GHz polarized maps in a future publication.

The emphasis is mine. The phrase made me think of this:

hazards

The paper concludes:

More data are clearly required to resolve the situation. We note that cross-correlation of our maps with the Planck 353 GHz maps will be more powerful than use of those maps alone in our field. Additional data are also expected from many other experiments, including Keck Array observations at 100 GHz in the 2014 season.

In other words, what I’ve been saying from the outset.


The Logistics of Scientific Growth in the 21st Century

Posted in Science Politics, The Universe and Stuff with tags , , on June 8, 2014 by telescoper

An interesting piece arguing that the recent growth in STEM PhDs and postdocs is not sustainable.

Reblogged from An Assembly of Fragments:


Over the last few months, I’ve noticed a growing number of reports about declining opportunities and increasing pressure for early stage academic researchers (Ph.D. students, post-docs and junior faculty). For example, the Washington Post published an article in early July about trends in the U.S. scientific job market entitled “U.S. pushes for more scientists, but the jobs aren’t there.” This post generated over 3,500 comments on the WaPo website alone and was highly discussed in the twittersphere. In mid July, Inside Higher Ed reported that an ongoing study revealed a recent, precipitous drop in the interest of STEM (Science/Technology/Engineering/Mathematics) Ph.D. students wishing to pursue an academic tenure-track career. These results confirmed those published in PLoS ONE in May, which showed that, among STEM students surveyed in 2010, interest in pursuing an academic career declined over the course of Ph.D. studies:

Figure 1. Percent of…

View original post 1,620 more words