After a very busy and unusual week, it’s time to get back to normal with the usual Saturday roundup of business at the Open Journal of Astrophysics. If you want to know how many papers we have published so far this year (Vol. 7), the answer is 42. The total published by OJAp is now 157. We’re still on track to publish around 100 papers this year, possibly more, compared to last year’s 50.
All the members of this week’s trio are in the folder marked Cosmology and Nongalactic Astrophysics, and indeed all three relate in one way or another to the topic of weak gravitational lensing. All three were published on Wednesday 22nd May 2024.
The second paper to present is “A unified linear intrinsic alignment model for elliptical and disc galaxies and the resulting ellipticity spectra” by Basundhara Ghosh (Bangalore, India), with Kai Nussbaumer, Eileen Sophie Giesel & Björn Malte Schäfer (Heidelberg, Germany). It presents a discussion of the physical origin of intrinsic alignments of both elliptical and disc galaxies and the implications for cosmological studies.
The overlay looks like this:
You can read this paper directly on the arXiv here.
This morning’s arXiv update brought the expected deluge of preprints from Euclid. You can find details of all fifteen of the new articles here. Ten of them relate to the Early Release Observations of which five were announced yesterday and five last November. These are essentially byproducts of the testing and calibration phase of the Euclid mission rather than the main cosmological survey. ESA is making a series of short videos about these results which I will share on here from time to time.
Of more direct relevance to cosmologists such as myself are the following five reference papers:
The overview paper, led by Yannick Mellier (Euclid Consortium Lead), giving a general description of the mission capabilities and science goals, will be the main reference paper and just about every active member of the Euclid Consortium is on the author list (including myself). That’s over a thousand people, not quite at the level of the Large Hadron Collider but getting there. I do think we need to find a better way of giving credit to work in large collaborations than through authorship, but until someone comes up with a workable scheme, and people responsible for hiring researchers adopt it, we’re stuck with what we’ve got. At least I can say that I’ve read that paper (which is 94 pages long, including the author list).
Papers II-IV are technical articles relating to Euclid’s instruments and their calibration, which will also be important references for the survey part of the Euclid mission. Paper V is about the Flagship simulations and the mock catalogues produced therefrom; I discussed these a while ago here. It is led by Francisco Castander of the Institut de Ciències de l’Espai, who organized the meeting I attended recently here in Barcelona.
These papers will now be peer-reviewed and, assuming they are accepted, published in a special issue of Astronomy & Astrophysics (A&A).
Together with the five images released last November that makes a total of ten Early Release Observations from the pre-survey phase of Euclid. It’s not all about the pictures, however. Today also saw the release of ten scientific papers to go with these images, as well as five reference papers for the main survey. You can find them all, with accompanying information here. They will be announced on arXiv tomorrow.
You might also be interested to read my Euclid piece on RTÉ Brainstorm which has just appeared. This is not just about the new images, but gives an update on what Euclid has been up to since launch, and what we can expect in the future. There’s also a version adapted for Maynooth University PR purposes here. It includes this quote:
Today’s release of new data and technical papers from Euclid is exciting in itself but also marks the start, after months of painstaking calibration and testing of the instruments, of Euclid’s main cosmological survey. We are on the threshold of a new era in cosmology. Maynooth is the only University in Ireland to be involved in this mission and it is very exciting to be at the forefront of such an important scientific development.
I’m also quoted in a piece in the Irish Times. You’ll probably find the article blocked by a paywall but my bit is:
It’s Saturday morning in Barcelona, and time to post another update relating to the Open Journal of Astrophysics. Since the last update we have published two more papers, taking the count in Volume 7 (2024) up to 32 and the total published by OJAp up to 147. There’s every chance we will reach 150 next week.
The first paper of the most recent pair – published on Monday 29th April – is “Supernovae in 2023 (review): possible breakthroughs by late observations” by Noam Soker of Technion in Haifa, Israel. It presents a discussion of observations of the aftermath of supernova explosions, such as supernova remnants, and how these may shed light on the explosion mechanism. This one is in the folder marked High-Energy Astrophysical Phenomena.
Here is a screen grab of the overlay which includes the abstract:
You can click on the image of the overlay to make it larger should you wish to do so. You can find the officially accepted version of the paper on the arXiv here.
The second paper was published on Thursday 2nd May and has the title “ΛCDM is alive and well”. The authors are: Alain Blanchard (Université de Toulouse, France), Jean-Yves Héloret (Université de Toulouse, France), Stéphane Ilić (Université Paris-Saclay, France), Brahim Lamine (Université de Toulouse, France) and Isaac Tutusaus (Université de Genève, Switzerland). This one, which is in the folder marked Cosmology and Nongalactic Astrophysics, presents a review of the alleged tensions between observations and the standard cosmological model.
Here is a screen grab of the overlay which includes the abstract:
You can click on the image of the overlay to make it larger should you wish to do so. You can find the officially accepted version of the paper on the arXiv here.
And that concludes this week’s update. More next week!
It’s Saturday, and it’s time to post another update relating to the Open Journal of Astrophysics. Since the last update we have published two more papers, taking the count in Volume 7 (2024) up to 27 and the total published by OJAp up to 142.
The first paper of the most recent pair – published on Tuesday 16th April – is “An Enhanced Massive Black Hole Occupation Fraction Predicted in Cluster Dwarf Galaxies” by Michael Tremmel (UCC, Ireland), Angelo Ricarte (Harvard, USA), Priyamvada Natarajan (Yale, USA), Jillian Bellovary (American Museum of Natural History, New York, USA), Ray Sharma (Rutgers, USA) and Thomas R. Quinn (University of Washington, USA). It presents a study, based on the Romulus cosmological simulations, of the impact of environment on the occupation fraction of massive black holes in low mass galaxies. This one is in the folder marked “Astrophysics of Galaxies”.
Here is a screen grab of the overlay which includes the abstract:
You can click on the image of the overlay to make it larger should you wish to do so. You can find the officially accepted version of the paper on the arXiv here.
The second paper was published on Wednesday 17th April and has the title “A 1.9 solar-mass neutron star candidate in a 2-year orbit” and the authors are: Kareem El-Badry (Caltech, USA), Joshua D. Simon (Carnegie Observatories, USA), Henrique Reggiani (Gemini Observatory, Chile), Hans-Walter Rix (Heidelberg, Germany), David W. Latham (Harvard, USA), Allyson Bieryla (Harvard, USA), Lars A. Buchhave (Technical University of Denmark, Denmark), Sahar Shahaf (Weizmann Institute of Science, Israel), Tsevi Mazeh (Tel Aviv University, Israel), Sukanya Chakrabarti (University of Alabama, USA), Puragra Guhathakurta (University of California Santa Cruz, USA), Ilya V. Ilyin (Potsdam, Germany), and Thomas M. Tauris (Aalborg University, Denmark).
This one, which is in the folder marked Solar and Stellar Astrophysics, presents a discussion of the discovery of a 1.9 solar mass neutron star candidate using Gaia astrometric data, together with the implications of its orbital parameters for the formation mechanism.
Here is a screen grab of the overlay which includes the abstract:
You can click on the image of the overlay to make it larger should you wish to do so. You can find the officially accepted version of the paper on the arXiv here.
Is the universe simple enough to be adequately described by the standard ΛCDM cosmological model which assumes the isotropic and homogeneous Friedmann-Lemaître-Robertson-Walker metric? Tensions have emerged between the values of cosmological parameters estimated in different ways. Do these tensions signal that our model is too simple? Could a more sophisticated model account for the data without invoking a Cosmological Constant?
That conference is actually taking place this week (on 15th and 16th April, i.e. yesterday and today). I can’t be there, of course, because I’m here, but I can share the recording of the talks. Here is the first day’s worth. The recording is about 8 hours long so you probably won’t want to watch it all in one sitting. Let me point out the talk by Wendy Freedman, which starts at around 2:13:30, discussing the Hubble Tension largely from the point of view of stellar distance indicators and suggesting an answer of 69.1 ± km s⁻¹ Mpc⁻¹, which reduces the tension with Planck significantly.
And here is Day 2:
You can find more information about the meeting, including a full list of the talks here.
Here’s another video in the Cosmology Talks series curated by Shaun Hotchkiss. This one is very timely after yesterday’s announcement. Here is the description on the YouTube page:
The Dark Energy Spectroscopic Instrument (DESI) has produced cosmological constraints! And it is living up to its name. Two researchers from DESI, Seshadri Nadathur and Andreu Font-Ribera, tell us about DESI’s measurements of the Baryon Acoustic Oscillations (BAO) released today. These results use one full year of DESI data and are the first cosmological constraints from the telescope that have been released. Mostly, it is what you might expect: tighter constraints. However, in the realm of the equation of state of dark energy, they find, even with BAO alone, that there is a hint of evidence for evolving dark energy. When they combine their data with CMB and Supernovae, who both also find small hints of evolving dark energy on their own, the evidence for dark energy not being a cosmological constant jumps as high as 3.9σ with one combination of the datasets. It seems there still is “concordance cosmology”, it’s just not ΛCDM for these datasets. The fact that all three probes are tentatively favouring this is intriguing, as it makes it unlikely to be due to systematic errors in one measurement pipeline.
My own take is that the results are very interesting but I think we need to know a lot more about possible systematics before jumping to conclusions about time-varying dark energy. Am I getting conservative in my old age? These results from DESI do of course further underline the motivation for Euclid (another Stage IV survey), which may have an even better capability to identify departures from the standard model.
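For readers wondering what “evolving dark energy” means quantitatively here: analyses of this kind are conventionally phrased in terms of the standard Chevallier-Polarski-Linder (CPL) parametrization of the dark energy equation of state as a function of scale factor $a$:

```latex
w(a) = w_0 + w_a (1 - a),
```

with a cosmological constant (ΛCDM) corresponding to $w_0 = -1$, $w_a = 0$. The quoted 3.9σ figure refers to the preference, in one combination of datasets, for a point in the $(w_0, w_a)$ plane away from that value.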
P.S. Here’s a nice graphic showing the cosmic web revealed by the DESI survey:
There has been a lot of excitement around the ICCUB today – the press have been here and everything – ahead of the release of the Year 1 results from the Dark Energy Spectroscopic Instrument (DESI). The press release from the Lawrence Berkeley Laboratory in California can be found here.
The papers were just released at 5pm CEST and can be found here. The key results pertain to Baryon Acoustic Oscillations (BAOs) which can be used to track the expansion rate and geometry of the Universe. This is one of the techniques that will be used by Euclid.
There’s a lot of technical information to go through and I have to leave fairly soon. Fortunately we have a seminar tomorrow that will explain everything at a level I can understand:
I will update this post with a bit more after the talk, but for the time being I direct you to this paper (which is Paper VI from DESI), in which the high-level cosmological implications are discussed.
If your main interest is in the Hubble Tension then I direct you to this Figure:
Depending on the other data sets included, the value obtained is around 68.5 ± 0.7 in the usual units, closer to the (lower) Planck CMB value than the (higher) Supernovae values but not exactly in agreement; the error bars are quite small too.
You might want to read my thoughts about distances estimated from angular diameters compared with distances measured using luminosity distances here.
If you’re wondering whether there is any evidence for departures from the standard cosmology, another pertinent comment is:
In summary, DESI data, both alone and in combination with other cosmological probes, do not show any evidence for a constant equation of state parameter different from −1 when a flat wCDM model is assumed.
DESI 2024 VI: Cosmological Constraints from the Measurements of Baryon Acoustic Oscillations
More complicated models of time-varying dark energy might work, but there’s no strong evidence from the current data.
That’s all from me for now, but feel free to comment through the box below with any hot takes!
UPDATE: As expected there has been quite a lot of press coverage about this – see the examples below – mostly concentrating on the alleged evidence for “new physics”. Personally I think the old physics is fine!
Here’s an interestingly different talk in the series of Cosmology Talks curated by Shaun Hotchkiss. The speaker, Sylvia Wenmackers, is a philosopher of science. According to the blurb on YouTube:
Her focus is probability and she has worked on a few theories that aim to extend and modify the standard axioms of probability in order to tackle paradoxes related to infinite spaces. In particular there is a paradox of the “infinite fair lottery” where within standard probability it seems impossible to write down a “fair” probability function on the integers. If you give the integers any non-zero probability, the total probability of all integers is unbounded, so the function is not normalisable. If you give the integers zero probability, the total probability of all integers is also zero. No other option seems viable for a fair distribution. This paradox arises in a number of places within cosmology, especially in the context of eternal inflation and a possible multiverse of big bangs bubbling off. If every bubble is to be treated fairly, and there will ultimately be an unbounded number of them, how do we assign probability? The proposed solutions involve hyper-real numbers, such as infinitesimals and infinities with different relative sizes, (reflecting how quickly things converge or diverge respectively). The multiverse has other problems, and other areas of cosmology where this issue arises also have their own problems (e.g. the initial conditions of inflation); however this could very well be part of the way towards fixing the cosmological multiverse.
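The normalisation problem described in that blurb can be stated in one line. For a “fair” lottery on the positive integers, fairness demands the same probability $c$ for every integer, but then

```latex
\sum_{n=1}^{\infty} P(n) \;=\; \sum_{n=1}^{\infty} c \;=\;
\begin{cases}
\infty & \text{if } c > 0,\\[2pt]
0 & \text{if } c = 0,
\end{cases}
```

so no choice of $c$ yields a total probability of one: fairness and countable additivity cannot both hold on an infinite state space, which is exactly the gap the hyperreal-valued proposals aim to fill.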
The paper referred to in the presentation can be found here. There is a lot to digest in this thought-provoking talk, from the starting point on Kolmogorov’s axioms to the application to the multiverse, but this video gives me an excuse to repeat my thoughts on infinities in cosmology.
Most of us – whether scientists or not – have an uncomfortable time coping with the concept of infinity. Physicists have had a particularly difficult relationship with the notion of boundlessness, as various kinds of pesky infinities keep cropping up in calculations. In most cases this is symptomatic of deficiencies in the theoretical foundations of the subject. Think of the ‘ultraviolet catastrophe‘ of classical statistical mechanics, in which the electromagnetic radiation produced by a black body at a finite temperature is calculated to be infinitely intense at infinitely short wavelengths; this signalled the failure of classical statistical mechanics and ushered in the era of quantum mechanics about a hundred years ago. Quantum field theories have other forms of pathological behaviour, with mathematical components of the theory tending to run out of control to infinity unless they are healed using the technique of renormalization. The general theory of relativity predicts that singularities in which physical properties become infinite occur in the centre of black holes and in the Big Bang that kicked our Universe into existence. But even these are regarded as indications that we are missing a piece of the puzzle, rather than implying that somehow infinity is a part of nature itself.
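To make the ultraviolet catastrophe concrete: the classical Rayleigh-Jeans law gives the spectral energy density of black-body radiation at temperature $T$ as

```latex
u_\nu(T) = \frac{8\pi\nu^2}{c^3}\,k_B T,
```

which grows without bound as $\nu \to \infty$, so the total energy $\int_0^\infty u_\nu \,d\nu$ diverges. Planck’s quantum formula replaces the factor $k_B T$ with $h\nu/(e^{h\nu/k_B T}-1)$, which agrees with the classical result at low frequencies but cuts off exponentially at high ones, rendering the integral finite.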
The exception to this rule is the field of cosmology. Somehow it seems natural at least to consider the possibility that our cosmos might be infinite, either in extent or duration, or both, or perhaps even be a multiverse comprising an infinite collection of sub-universes. If the Universe is defined as everything that exists, why should it necessarily be finite? Why should there be some underlying principle that restricts it to a size our human brains can cope with?
On the other hand, there are cosmologists who won’t allow infinity into their view of the Universe. A prominent example is George Ellis, a strong critic of the multiverse idea in particular, who frequently quotes David Hilbert:
The final result then is: nowhere is the infinite realized; it is neither present in nature nor admissible as a foundation in our rational thinking—a remarkable harmony between being and thought
But to every Hilbert there’s an equal and opposite Leibniz:
I am so in favor of the actual infinite that instead of admitting that Nature abhors it, as is commonly said, I hold that Nature makes frequent use of it everywhere, in order to show more effectively the perfections of its Author.
You see that it’s an argument with quite a long pedigree!
Many years ago I attended a lecture by Alex Vilenkin, entitled The Principle of Mediocrity. This was a talk based on some ideas from his book Many Worlds in One: The Search for Other Universes, in which he discusses some of the consequences of the so-called eternal inflation scenario, which leads to a variation of the multiverse idea in which the universe comprises an infinite collection of causally-disconnected “bubbles” with different laws of low-energy physics applying in each. Indeed, in Vilenkin’s vision, all possible configurations of all possible things are realised somewhere in this ensemble of mini-universes.
One of the features of this scenario is that it brings the anthropic principle into play as a potential “explanation” for the apparent fine-tuning of our Universe that enables life to be sustained within it. We can only live in a domain wherein the laws of physics are compatible with life so it should be no surprise that’s what we find. There is an infinity of dead universes, but we don’t live there.
I’m not going to go on about the anthropic principle here, although it’s a subject that’s quite fun to write or, better still, give a talk about, especially if you enjoy winding people up! What I did want to mention, though, is that Vilenkin correctly pointed out that three ingredients are needed to make this work:
An infinite ensemble of realizations
A discretizer
A randomizer
Item 2 involves some sort of principle that ensures that the number of possible states of the system we’re talking about is not infinite. A very simple example from quantum physics might be the two spin states of an electron, up (↑) or down (↓). No “in-between” states are allowed, according to our tried-and-tested theories of quantum physics, so the state space is discrete. In the more general context required for cosmology, the states are the allowed “laws of physics” (i.e. possible false vacuum configurations). The space of possible states is very much larger here, of course, and the theory that makes it discrete is much less secure. In string theory, the number of false vacua is estimated at 10⁵⁰⁰. That’s certainly a very big number, but it’s not infinite so it will do the job needed.
Item 3 requires a process that realizes every possible configuration across the ensemble in a “random” fashion. The word “random” is a bit problematic for me because I don’t really know what it’s supposed to mean. It’s a word that far too many scientists are content to hide behind, in my opinion. In this context, however, “random” really means that the assigning of states to elements in the ensemble must be ergodic, meaning that it must visit the entire state space with some probability. This is the kind of process that’s needed if an infinite collection of monkeys is indeed to type the (large but finite) complete works of Shakespeare. It’s not enough that there be an infinite number of monkeys and that the works of Shakespeare be finite. The process of typing must also be ergodic.
Now it’s by no means obvious that monkeys would type ergodically. If, for example, they always hit two adjoining keys at the same time then the process would not be ergodic. Likewise it is by no means clear to me that the process of realizing the ensemble is ergodic. In fact I’m not even sure that there’s any process at all that “realizes” the string landscape. There’s a long and dangerous road from the (hypothetical) ensembles that exist even in standard quantum field theory to an actually existing “random” collection of observed things…
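The monkey argument can be made concrete with a toy sketch in Python. The six-key alphabet and the particular defect (always hitting a key together with its right-hand neighbour) are invented purely for illustration, but they show the point: a non-ergodic process, however long it runs, simply never visits some parts of the state space.

```python
import itertools

KEYS = "abcdef"  # a toy keyboard

def adjacent(key):
    """The defective monkey always hits a key together with its right-hand neighbour."""
    i = KEYS.index(key)
    return KEYS[(i + 1) % len(KEYS)]

# Every 4-character output of the defective monkey is two "adjacent pair" digraphs,
# so its reachable set is a tiny subset of the full state space of 4-letter strings.
defective = {a + adjacent(a) + b + adjacent(b) for a in KEYS for b in KEYS}
all_strings = {"".join(s) for s in itertools.product(KEYS, repeat=4)}

print(len(defective), len(all_strings))  # 36 1296
print("adca" in defective)    # False: unreachable no matter how long the monkey types
print("adca" in all_strings)  # True: an ergodic typist produces it with positive probability
```

An ergodic monkey, pressing each key independently and uniformly, assigns positive probability to every one of the 1296 strings; the defective one reaches only 36 of them, so even an infinite troop of such monkeys would never type “adca”, let alone Shakespeare.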
More generally, the mere fact that a mathematical solution of an equation can be derived does not mean that that equation describes anything that actually exists in nature. In this respect I agree with Alfred North Whitehead:
There is no more common error than to assume that, because prolonged and accurate mathematical calculations have been made, the application of the result to some fact of nature is absolutely certain.
It’s a quote I think some string theorists might benefit from reading!
Items 1, 2 and 3 are all needed to ensure that each particular configuration of the system is actually realized in nature. If we had an infinite number of realizations but with either an infinite number of possible configurations or a non-ergodic selection mechanism, then there’s no guarantee that each possibility would actually happen. The success of this explanation consequently rests on quite stringent assumptions.
I’m a sceptic about this whole scheme for many reasons. First, I’m uncomfortable with infinity – that’s what you get for working with George Ellis, I guess. Second, and more importantly, I don’t understand string theory and am in any case unsure of the ontological status of the string landscape. Finally, although a large number of prominent cosmologists have waved their hands with commendable vigour, I have never seen anything even approaching a rigorous proof that eternal inflation does lead to realized infinity of false vacua. If such a thing exists, I’d really like to hear about it!