I just saw a press release about new results from the Dark Energy Survey relating to measurements of baryon acoustic oscillations. These are basically the residue of the oscillations seen in the power spectrum of the cosmic microwave background (CMB) temperature distribution, imprinted on the galaxy distribution. They are somewhat less obvious than the primordial temperature fluctuations, because the growth of structure produces a much larger background, but they are measurable (and indeed are one of the things Euclid will measure).
Anyway, there is a very nice detailed description in the press release and you can find the preprint of the work in full on arXiv here, so I’ll just show the key figure:
The effective redshift of this measurement is about 0.85; in the CMB the redshift is about 1000. You can see that there is a characteristic scale but it is slightly offset from that predicted using the standard ΛCDM model based on the Planck determination of cosmological parameters. One has to be careful in interpreting this diagram because it is determined using autocorrelation functions; the errors on different bins are therefore correlated, not statistically independent. They are also, as you can see, quite large. Nonetheless, it’s a tantalizing result…
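Incidentally, the practical consequence of correlated bins is that any quantitative fit to such a measurement has to use the full covariance matrix of the data rather than just the diagonal error bars. Here is a minimal sketch of the difference, with made-up numbers rather than the DES measurements:

```python
import numpy as np

# Toy illustration of why correlated bins matter. The data vector, model
# and covariance matrix below are invented for this example; they are NOT
# the DES measurements.
data  = np.array([0.050, 0.040, 0.020, 0.010])
model = np.array([0.045, 0.035, 0.025, 0.012])

# A covariance with off-diagonal terms, as arises when the same galaxies
# contribute to neighbouring separation bins of a correlation function.
cov = 1e-5 * np.array([[4.0, 2.0, 1.0, 0.5],
                       [2.0, 4.0, 2.0, 1.0],
                       [1.0, 2.0, 4.0, 2.0],
                       [0.5, 1.0, 2.0, 4.0]])

resid = data - model

# The correct chi-squared uses the inverse of the full covariance...
chi2_full = resid @ np.linalg.inv(cov) @ resid

# ...whereas pretending the bins are independent keeps only the diagonal.
chi2_diag = np.sum(resid**2 / np.diag(cov))

print(f"chi-squared with full covariance: {chi2_full:.2f}")
print(f"chi-squared ignoring correlations: {chi2_diag:.2f}")
```

With positive correlations between bins, ignoring the off-diagonal terms generally overstates the information content of the data, making a fit look more significant than it really is.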
Some important cosmological results have just been announced by the Dark Energy Survey Collaboration. I haven’t had time to go through them in detail but I thought it was worth doing a quick post here to draw attention to them. The results concern a sample of Type Ia supernovae (SN Ia) discovered during the full five years of the Dark Energy Survey (DES) Supernova Program, which contains about 1500 new Type Ia supernovae that can be used for cosmological analysis. The paper is available on the arXiv here; the abstract is:
The key numerical result of interest is the equation-of-state parameter for dark energy, designated by w, which occurs in the (assumed) relationship between pressure p and effective mass density ρ of the form p = wρc². A cosmological constant – which for many cosmologists is the default assumption for the form of dark energy – has w = −1, as I explained here. This parameter is one of the things Euclid is going to try to measure, using different methods. Interestingly, the DES results are offset a bit from the value of −1, but with quite a large uncertainty.
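As a reminder of why this parameter matters (this is standard textbook material, nothing specific to the DES analysis), the Friedmann acceleration equation reads

ä/a = −(4πG/3)(ρ + 3p/c²) = −(4πG/3)ρ(1 + 3w),

so accelerated expansion (ä > 0) requires w < −1/3. The continuity equation meanwhile gives ρ ∝ a^(−3(1+w)), so for w = −1 the energy density stays constant as the Universe expands – exactly the behaviour of a cosmological constant.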
While the results for the equation-of-state parameter are somewhat equivocal, one thing that is clear is that the new SNIa measurements do confirm the existence of dark energy, in that the data can only be described by models with accelerating expansion, as dramatically demonstrated in this Figure:
I think this figure – or versions of it – will very rapidly appear in public talks on cosmology, including my own!
So today’s the day. The first science-quality observations from Euclid have now been released to the public. The official press release is here, and the press conference showcasing the new observations can be viewed here:
The images themselves can be found in this repository. In summary they are (in no particular order):
IC 342, NGC 6822, the Horsehead Nebula, NGC 6397, and the Perseus Cluster.
And here they are – you can click on them to make them bigger:
A few points of my own.
First, it is important to realise that these observations are not part of the full Euclid survey, which will start in early 2024, but were produced during the process of verifying the capabilities of the telescope and detectors. They are all very short exposures – it took less than a day to make all the images – but they demonstrate that Euclid is performing very well indeed!
Euclid is designed to achieve very sharp optical quality across a very wide field of view, so its strength is that it will produce beautiful images like these not just of a handful of objects but of billions. We need to map very large numbers of galaxies to perform the careful analysis needed to extract information about dark matter and dark energy, which is the main goal of the mission.
While these images are, in a sense, by-products of the Euclid mission, not specifically related to the main aims of the mission, they are interesting in their own right and there are proper scientific papers related to each of the five sets of observations released today. We expect many more non-cosmological spinoffs like these as the mission goes on.
There were some problems during the commissioning of the instruments carried by Euclid, the most serious of which was an issue with the Fine Guidance Sensor used to control the pointing of the telescope. This has been fixed by a software update and everything is now functioning well, as today’s new results confirm!
Here’s a little video update to accompany the news that, as of yesterday (28th July), the European Space Agency’s Euclid spacecraft has reached its orbit around L2, the second Lagrange Point of the Earth-Sun system:
More news is on the way. Commissioning of the instruments is now complete and the telescope is in focus. On Monday 31st July, ESA will release the first actual images from the Euclid telescope!
With the launch of the Euclid spacecraft due next month, and the last Euclid Consortium meeting before the launch coming up in just over a week, I thought I’d share another one of the nice little taster videos prepared by the European Space Agency:
The Euclid Mission has long been “sold” as a mission to probe the nature of Dark Energy, in much the same way that the Large Hadron Collider was often portrayed as an experiment designed to find the Higgs boson. But as this video makes clear, testing theories of dark energy is just one of the tasks Euclid will undertake, and it may well be the case that in years to come the mission is remembered for something other than dark energy. On the other hand, big science like this needs big money, and making the specific case for a single big-ticket item is an easier way to persuade funding agencies to cough up the dosh than a general “let’s do a lot of things and we’re sure we’ll find something” approach. These thoughts triggered a memory of an old post of mine about Alfred Hitchcock so, with apologies for repeating something I have blogged about before, here’s an updated version.
Unpick the plot of any thriller or suspense movie and the chances are that somewhere within it you will find lurking at least one MacGuffin. This might be a tangible thing, such as the eponymous sculpture of a falcon in the archetypal noir classic The Maltese Falcon, or it may be rather nebulous, like the “top secret plans” in Hitchcock’s The Thirty-Nine Steps. Its true character may never be fully revealed, as in the case of the glowing contents of the briefcase in Pulp Fiction, which is a classic example of the “undisclosed object” type of MacGuffin, or it may be scarily obvious, like a doomsday machine or some other “Big Dumb Object” you might find in a science fiction thriller. It may not even be a real thing at all: it could be an event or an idea, or even something that doesn’t exist in any real sense, such as the fictitious decoy character George Kaplan in North by Northwest. In fact North by Northwest is an example of a movie with more than one MacGuffin. Its convoluted plot involves espionage and the smuggling of what is only cursorily described as “government secrets”. These secrets are the main MacGuffin; George Kaplan is a sort of sub-MacGuffin. But although this is behind the whole story, it is the emerging romance, accidental betrayal and frantic rescue involving the lead characters played by Cary Grant and Eva Marie Saint that really engage the audience as the film gathers pace. The MacGuffin is a trigger, but it soon fades into the background as other factors take over.
Whether it is real or not, the MacGuffin is the thing responsible for kick-starting the plot. It makes the characters embark upon the course of action they take as the tale begins to unfold. This plot device was particularly beloved of Alfred Hitchcock (who was responsible for introducing the word to the film industry). Hitchcock was, however, always at pains to ensure that the MacGuffin never played as important a role in the mind of the audience as it did for the protagonists. As the plot twists and turns – as it usually does in such films – and its own momentum carries the story forward, the importance of the MacGuffin tends to fade, and by the end we have usually forgotten all about it. Hitchcock’s movies rarely bother to explain their MacGuffin(s) in much detail, and they often confuse the issue even further by mixing genuine MacGuffins with mere red herrings.
Here is the man himself explaining the concept at the beginning of this clip. (The rest of the interview is also enjoyable, covering such diverse topics as laxatives, ravens and nudity…)
There’s nothing particularly new about the idea of a MacGuffin. I suppose the ultimate example is the Holy Grail, in the tales of King Arthur and the Knights of the Round Table and, much more recently, The Da Vinci Code. The Grail itself is basically a peg on which to hang a series of otherwise disconnected stories. It is barely mentioned once each individual story has started and, of course, is never found.
Physicists are fond of describing things as “The Holy Grail” of their subject, such as the Higgs Boson or gravitational waves. This always seemed to me to be an unfortunate description, as the Grail quest consumed a huge amount of resources in a predictably fruitless hunt for something whose significance could be seen to be dubious at the outset. The MacGuffin Effect nevertheless continues to reveal itself in science, although in different forms to those found in Hollywood.
The Large Hadron Collider (LHC), switched on to the accompaniment of great fanfares a few years ago, provides a nice example of how the MacGuffin actually works pretty much backwards in the world of Big Science. To the public, the LHC was built to detect the Higgs Boson, a hypothetical beastie introduced to account for the masses of other particles. If it exists, the high-energy collisions engineered by the LHC should (and did) reveal its presence. The Higgs Boson is thus the LHC’s own MacGuffin. Or at least it would be if it were really the reason why the LHC was built. In fact there are dozens of experiments at CERN and many of them have very different motivations from the quest for the Higgs, such as the search for evidence of supersymmetry.
Particle physicists are not daft, however, and they realized that the public and, perhaps more importantly, government funding agencies need to have a really big hook to hang such a big bag of money on. Hence the emergence of the Higgs as a sort of master MacGuffin, concocted specifically for public consumption, which is much more effective politically than the plethora of mini-MacGuffins which, to be honest, would be a fairer description of the real state of affairs.
While particle physicists might pretend to be doing cosmology, we astrophysicists have to contend with MacGuffins of our own. One of the most important discoveries we have made about the Universe in the last decade is that its expansion seems to be accelerating. Since gravity usually tugs on things and makes them slow down, the only explanation that we’ve thought of for this perverse situation is that there is something out there in empty space that pushes rather than pulls. This has various possible names, but Dark Energy is probably the most popular, adding an appropriately noirish edge to this particular MacGuffin. It has even taken over in prominence from its much older relative, Dark Matter, although that one is still very much around.
We have very little idea what Dark Energy is, where it comes from, or how it relates to other forms of energy with which we are more familiar, so observational astronomers have jumped in with various grandiose strategies to find out more about it. This has spawned a booming industry in surveys of the distant Universe, all aimed ostensibly at unravelling the mystery of the Dark Energy. It seems that to get any funding at all for cosmology these days you have to sprinkle the phrase “Dark Energy” liberally throughout your grant applications.
The old-fashioned “observational” way of doing astronomy – by looking at things hard enough and long enough until something exciting appears (which it does with surprising regularity) – has been replaced by a more “experimental” approach, more like that of the LHC. We can no longer do deep surveys of galaxies to find out what’s out there. We have to do it “to constrain models of Dark Energy”. This is just one example of the (not entirely positive) influence that particle physics has had on astronomy in recent times.
Whatever the motivation for doing these projects now, they will undoubtedly lead to many new discoveries, so I’m not for one minute arguing that the case for, e.g., the Euclid mission is misguided. I’m just saying that in my opinion there will never be a real solution of the Dark Energy problem until it is understood much better at a conceptual level, and that will probably mean major revisions of our theories of both gravity and matter. I venture to speculate that in twenty years or so people will look back on the obsession with Dark Energy with some amusement, as our theoretical language will have moved on sufficiently to make it seem irrelevant. That’s how it goes with MacGuffins. In the end, even the Maltese Falcon turned out to be a fake, but what an adventure it was along the way!
It’s time once more to announce a new paper at the Open Journal of Astrophysics. The latest paper is the 13th so far in Volume 6 (2023) and the 78th in all. This one is another for the folder marked Cosmology and Nongalactic Astrophysics and its title is “The catalog-to-cosmology framework for weak lensing and galaxy clustering for LSST”.
The lead author is Judit Prat of the University of Chicago (Illinois, USA) and there are 21 co-authors from elsewhere in the USA and in the UK. The paper is written on behalf of the LSST Dark Energy Science Collaboration (LSST DESC), which is the international science collaboration that will make high accuracy measurements of fundamental cosmological parameters using data from the Rubin Observatory Legacy Survey of Space and Time (LSST). The OJAp has published a number of papers involving LSST DESC, and I’m very happy that such an important consortium has chosen to publish with us.
Here is a screen grab of the overlay which includes the abstract:
You can click on the image of the overlay to make it larger should you wish to do so. You can find the officially accepted version of the paper on the arXiv here.
A couple of papers were published recently that attracted quite a lot of media interest so I thought I’d mention the work here.
The researchers detail the theory in two papers, published in The Astrophysical Journal and The Astrophysical Journal Letters, with both laying out different aspects of the cosmological connection and providing the first “astrophysical explanation of dark energy”. The lead author of both papers is Duncan Farrah of the University of Hawaii. Both are available on the arXiv, where all papers worth reading in astrophysics can be found.
The first paper, available on the arXiv here, is entitled Preferential Growth Channel for Supermassive Black Holes in Elliptical Galaxies at z<2; it makes the argument that observations imply that supermassive black holes grow preferentially in elliptical galaxies:
The assembly of stellar and supermassive black hole (SMBH) mass in elliptical galaxies since z∼1 can help to diagnose the origins of locally-observed correlations between SMBH mass and stellar mass. We therefore construct three samples of elliptical galaxies, one at z∼0 and two at 0.7≲z≲2.5, and quantify their relative positions in the M_BH−M_∗ plane. Using a Bayesian analysis framework, we find evidence for translational offsets in both stellar mass and SMBH mass between the local sample and both higher redshift samples. The offsets in stellar mass are small, and consistent with measurement bias, but the offsets in SMBH mass are much larger, reaching a factor of seven between z∼1 and z∼0. The magnitude of the SMBH offset may also depend on redshift, reaching a factor of ∼20 at z∼2. The result is robust against variation in the high and low redshift samples and changes in the analysis approach. The magnitude and redshift evolution of the offset are challenging to explain in terms of selection and measurement biases. We conclude that either there is a physical mechanism that preferentially grows SMBHs in elliptical galaxies at z≲2, or that selection and measurement biases are both underestimated, and depend on redshift.
arXiv: 2212.06854
Note the important caveats at the end. I gather from people who work on this topic that it’s a rather controversial claim.
The second paper, entitled Observational evidence for cosmological coupling of black holes and its implications for an astrophysical source of dark energy and available on the arXiv here, discusses a mechanism by which it is claimed that the formation of black holes actually creates dark energy:
Observations have found black holes spanning ten orders of magnitude in mass across most of cosmic history. The Kerr black hole solution is however provisional as its behavior at infinity is incompatible with an expanding universe. Black hole models with realistic behavior at infinity predict that the gravitating mass of a black hole can increase with the expansion of the universe independently of accretion or mergers, in a manner that depends on the black hole’s interior solution. We test this prediction by considering the growth of supermassive black holes in elliptical galaxies over 0<z≲2.5. We find evidence for cosmologically coupled mass growth among these black holes, with zero cosmological coupling excluded at 99.98% confidence. The redshift dependence of the mass growth implies that, at z≲7, black holes contribute an effectively constant cosmological energy density to Friedmann’s equations. The continuity equation then requires that black holes contribute cosmologically as vacuum energy. We further show that black hole production from the cosmic star formation history gives the value of ΩΛ measured by Planck while being consistent with constraints from massive compact halo objects. We thus propose that stellar remnant black holes are the astrophysical origin of dark energy, explaining the onset of accelerating expansion at z∼0.7.
arXiv:2302.07878
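The essence of the claimed mechanism can be sketched in a couple of lines (my own paraphrase of the abstract, not the paper’s full argument). If individual black hole masses are “cosmologically coupled” so that they grow with the scale factor a as

M(a) = M₀ (a/a₀)^k,

while their comoving number is conserved, so that the number density dilutes as n ∝ a^(−3), then their total energy density scales as

ρ_BH = nMc² ∝ a^(k−3),

and the special case k = 3 gives a constant energy density – precisely the vacuum-energy-like behaviour (w = −1) to which the abstract refers.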
The first I saw of these papers was in a shockingly poor write-up in the Guardian, which was so garbled that I dismissed the story out of hand. I recently saw it taken up in Physics World, though, so maybe there is something in it. Having scanned it quickly, I can say it doesn’t look as trivially wrong as I had feared it would be.
I haven’t had much time to read papers over the last few weeks but I’ve decided to present the second paper – the more theoretical one – next time I do our cosmology journal club at Maynooth, which means I’ll have to read it! I’ll add my summary after I’ve done the Journal club on Monday afternoon.
In the meantime I was wondering what the general reaction in the cosmological community is to these papers, especially the second one. If anyone has strong views please feel free to put them in the comments box!
ESA’s Euclid mission is designed to explore the composition and evolution of the dark Universe. The space telescope will create a great map of the large-scale structure of the Universe across space and time by observing billions of galaxies out to 10 billion light-years, across more than a third of the sky. Euclid will explore how the Universe has expanded and how structure has formed over cosmic history, revealing more about the role of gravity and the nature of dark energy and dark matter.
The public website can be found here. Check it out. Many more stories, pictures and videos will be added over the forthcoming weeks but in the meantime here is a taster: an animated movie that shows various elements of the Euclid spacecraft, including the telescope, payload module and solar panels.
Even more information about the science to be done with Euclid can be found on the Euclid Consortium website, which is being revamped ahead of the launch.
I’ve posted quite a few times recently about the Hubble Tension and possible resolutions thereof. I also had polls to gauge the level of tension among my readers, like this one
and this one:
I’m not sure if these are still working, though, as I think I’ve reached the number of votes allowed on the basic free version of crowdsignal that comes with the free version of WordPress. I refuse to pay for the enhanced version. I’m nothing if not cheap. You can however still see the votes so far.
Anyway, there is a new(ish) paper on the arXiv by Mark Kamionkowski and Adam Riess that presents a nice readable introduction to this topic. I’m still not convinced that the Hubble Tension is anything more than an observational systematic, but I think this is a good discussion of what it might be if it is more than that.
Here is the abstract:
Over the past decade, the disparity between the value of the cosmic expansion rate directly determined from measurements of distance and redshift or instead from the standard ΛCDM cosmological model calibrated by measurements from the early Universe, has grown to a level of significance requiring a solution. Proposed systematic errors are not supported by the breadth of available data (and “unknown errors” untestable by lack of definition). Simple theoretical explanations for this “Hubble tension” that are consistent with the majority of the data have been surprisingly hard to come by, but in recent years, attention has focused increasingly on models that alter the early or pre-recombination physics of ΛCDM as the most feasible. Here, we describe the nature of this tension, emphasizing recent developments on the observational side. We then explain why early-Universe solutions are currently favored and the constraints that any such model must satisfy. We discuss one workable example, early dark energy, and describe how it can be tested with future measurements. Given an assortment of more extended recent reviews on specific aspects of the problem, the discussion is intended to be fairly general and understandable to a broad audience.
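A rough way to see why early-Universe solutions are favoured (this is my own back-of-the-envelope illustration, not a calculation from the paper): the CMB pins down the acoustic angular scale θ∗ ≈ r_s/D_A very precisely, and D_A scales roughly as 1/H0 at fixed late-time physics, so at fixed θ∗ the inferred Hubble constant goes approximately as 1/r_s. In very crude terms:

```python
# Back-of-the-envelope scaling, not a real fit: the CMB fixes the acoustic
# angular scale theta_* = r_s / D_A very precisely, and D_A ~ 1/H0 at fixed
# late-time physics, so the inferred H0 scales roughly as 1/r_s.

h0_early = 67.4   # km/s/Mpc, the CMB-inferred (Planck) value
h0_late  = 73.0   # km/s/Mpc, a representative distance-ladder value

# Fractional reduction in the sound horizon r_s needed to reconcile the two,
# under the crude approximation H0 * r_s = constant:
reduction = 1.0 - h0_early / h0_late
print(f"r_s would need to shrink by about {100 * reduction:.0f}%")
# -> roughly 8%, which is the level at which early dark energy models
#    try to reduce the sound horizon before recombination.
```

This is why models like early dark energy aim to inject extra energy before recombination: doing so boosts the expansion rate at that epoch and shrinks the sound horizon by roughly the required amount.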
The standard model of cosmology is based on Einstein’s theory of general relativity. In order to account for cosmological observations this has required the introduction of dark matter – which also helps explain the properties of individual galaxies – and dark energy. The resulting model, which I would describe as a working hypothesis, is rather successful, but it is reasonable to question whether either or both of the dark components can be avoided by adopting an alternative theory of gravity instead of Einstein’s.
There is an interesting paper by Kris Pardo and David Spergel on arXiv that argues that none of the modifications of Einstein’s theory currently on the market is able to eliminate the need for dark matter. Here is the abstract of this paper:
It’s a more sophisticated version of an argument that has been going around, at least in qualitative form, for some time. The gist of it is that the distinctive pattern of fluctuations in the cosmic microwave background, observed by e.g. the Planck experiment, arises from coupling between baryons and photons in the early Universe. Similar features can be observed in the distribution of galaxies at a more recent cosmic epoch – where they are called Baryon Acoustic Oscillations (BAO) – but they are much weaker. This is easily explicable if there is a dark matter component that dominates gravitational instability at late times but does not couple to photons via electromagnetic interactions. This is summed up in the following graphic (which I think I stole from a talk by John Peacock) based on data from about 20 years ago:
If there were no dark matter the coherent features seen in the power spectrum of the galaxy distribution would be much stronger; with dark matter dominating they are masked by the general growth of the collisionless component so their relative amplitude decreases.
The graphic shows how increasing the dark matter component from 0.1 to 0.3, while keeping the baryon component fixed, suppresses the wiggles corresponding to BAOs. The data suggest a dark matter contribution at the upper end of that range, consistent with the standard cosmology.
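As a cartoon of what is going on in the graphic (an invented toy scaling, not the proper transfer-function calculation used to make the real curves), the relative amplitude of the wiggles tracks the baryon fraction Ω_b/Ω_m, so adding collisionless dark matter at fixed baryon density dilutes them:

```python
# Cartoon of the wiggle suppression in the graphic. The scaling used here
# (wiggle amplitude proportional to the baryon fraction) is an invented
# toy, not the proper transfer-function calculation.
omega_b = 0.05  # baryon density parameter, held fixed

for omega_dm in (0.1, 0.2, 0.3):
    omega_m = omega_b + omega_dm          # total matter density
    baryon_fraction = omega_b / omega_m   # crude proxy for wiggle strength
    print(f"Omega_dm = {omega_dm:.1f}: "
          f"relative wiggle amplitude ~ {baryon_fraction:.2f}")

# The baryon fraction falls from ~0.33 to ~0.14 as Omega_dm rises from
# 0.1 to 0.3: the acoustic features are increasingly masked by the smooth
# collisionless component, as in the figure.
```

The real calculation involves the full matter transfer function, but the trend – more dark matter, weaker wiggles – is the same.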
Of course, if there were no baryons at all there wouldn’t be acoustic oscillations in either the CMB or the galaxy distribution, so both spectra would be smooth as shown in the graphic, but in that case there wouldn’t be anyone around to write about them, as people are made of baryons.
This general conclusion is confirmed by the Pardo & Spergel paper, though it must be said that the argument doesn’t mean that modified gravity is impossible. It’s just that it seems nobody has yet thought of a specific model that satisfies all the constraints. That may change.