Archive for Dark Energy

Cosmology on its beam-ends?

Posted in Cosmic Anomalies, The Universe and Stuff on June 14, 2010 by telescoper

Interesting press release today from the Royal Astronomical Society about a paper (preprint version here) which casts doubt on whether the Wilkinson Microwave Anisotropy Probe supports the standard cosmological model to the extent that is generally claimed. Apologies if this is a bit more technical than my usual posts (but I like occasionally to pretend that it’s a science blog).

The abstract of the paper (by Sawangwit & Shanks) reads

Using the published WMAP 5-year data, we first show how sensitive the WMAP power spectra are to the form of the WMAP beam. It is well known that the beam profile derived from observations of Jupiter is non-Gaussian and indeed extends, in the W band for example, well beyond its 12.6 arcmin FWHM core out to more than 1 degree in radius. This means that even though the core width corresponds to wavenumber l ~ 1800, the form of the beam still significantly affects the WMAP results even at l ~ 200, which is the scale of the first acoustic peak. The difference between the beam-convolved Cl and the final Cl is ~ 70% at the scale of the first peak, rising to ~ 400% at the scale of the second. New estimates of the Q, V and W-band beam profiles are then presented, based on a stacking analysis of the WMAP5 radio source catalogue and temperature maps. The radio sources show a significantly (3-4σ) broader beam profile on scales of 10′-30′ than that found by the WMAP team, whose beam analysis is based on measurements of Jupiter. Beyond these scales the beam profiles from the radio sources are too noisy to give useful information. Furthermore, we find tentative evidence for a non-linear relation between WMAP and ATCA/IRAM 95 GHz source fluxes. We discuss whether the wide beam profiles could be caused either by radio source extension or clustering and find that neither explanation is likely. We also argue against the possibility that Eddington bias is affecting our results. The reasons for the difference between the radio source and the Jupiter beam profiles are therefore still unclear. If the radio source profiles were then used to define the WMAP beam, there could be a significant change in the amplitude and position of even the first acoustic peak. It is therefore important to identify the reasons for the differences between these two beam profile estimates.

The press release puts it somewhat more dramatically

New research by astronomers in the Physics Department at Durham University suggests that the conventional wisdom about the content of the Universe may be wrong. Graduate student Utane Sawangwit and Professor Tom Shanks looked at observations from the Wilkinson Microwave Anisotropy Probe (WMAP) satellite to study the remnant heat from the Big Bang. The two scientists find evidence that the errors in its data may be much larger than previously thought, which in turn makes the standard model of the Universe open to question. The team publish their results in a letter to the journal Monthly Notices of the Royal Astronomical Society.

I dare say the WMAP team will respond in due course, but this paper spurred me to mention some work on this topic that was done by my friend (and former student) Lung-Yih Chiang. During his last visit to Cardiff we discussed this at great length and got very excited at one point when we thought we had discovered an error along the lines that the present paper claims. However, looking more carefully into it we decided that this wasn’t the case and we abandoned our plans to publish a paper on it.

Let me show you a few slides from a presentation that Lung-Yih gave to me a while ago. For a start here is the famous power-spectrum of the temperature fluctuations of the cosmic microwave background which plays an essential role in determining the parameters of the standard cosmology:

The position of the so-called “acoustic peak” plays an important role in determining the overall curvature of space-time on cosmological scales and the higher-order peaks pin down other parameters. However, it must be remembered that WMAP doesn’t just observe the cosmic microwave background. The signal it receives is heavily polluted by contamination from within our Galaxy and there is also significant instrumental noise.  To deal with this problem, the WMAP team exploit the five different frequency channels with which the probe is equipped, as shown in the picture below.

The CMB, being described by a black-body spectrum, has a sky temperature that doesn’t vary with frequency. Foreground emission, on the other hand, has an effective temperature that varies with frequency in a way that is fairly well understood. The five available channels can therefore be used to model and subtract the foreground contribution to the overall signal. However, the different channels have different angular resolution (because they correspond to different wavelengths of radiation). Here are some sample patches of sky illustrating this

At each frequency the sky is blurred out by the “beam” of the WMAP optical system; the blurring is worse at low frequencies than at high frequencies. In order to do the foreground subtraction, the WMAP team therefore smooth all the frequency maps to the same resolution, i.e. so that the net effect of optical resolution and artificial smoothing produces the same overall blurring (actually 1 degree). Doing this accurately requires precise knowledge of the form of the beam response of the experiment. A rough example (for illustration only) is given in the caption above.
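The rule behind this resolution matching is simple if, purely for illustration, the beams are approximated as Gaussians (the real WMAP beams are not): convolving two Gaussians adds their widths in quadrature. A minimal sketch, using band resolutions that are only rough approximations to the published values:

```python
import numpy as np

def extra_smoothing_fwhm(beam_fwhm, target_fwhm=60.0):
    """FWHM (arcmin) of the additional Gaussian kernel needed so that
    beam * kernel reaches the target resolution; Gaussian widths add
    in quadrature: target^2 = beam^2 + extra^2."""
    if beam_fwhm >= target_fwhm:
        raise ValueError("beam already broader than target resolution")
    return np.sqrt(target_fwhm**2 - beam_fwhm**2)

# Approximate WMAP band resolutions in arcmin (illustrative values only)
for band, fwhm in [("K", 49.2), ("Ka", 37.2), ("Q", 29.4), ("V", 19.8), ("W", 12.6)]:
    print(f"{band}-band: smooth by an extra {extra_smoothing_fwhm(fwhm):.1f} arcmin")
```

Note that the sharpest (W-band) maps need the most artificial smoothing to reach the common 1-degree resolution, which is why an accurate beam model matters most there.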

Now, here are the power spectra of the maps in each frequency channel

Note this is Cl, not l(l+1)Cl as in the first plot of the spectrum. Now you see how much foreground there is in the data: the curves would lie on top of each other if the signal were pure CMB, i.e. if it did not vary with frequency. The equation at the bottom basically just says that the overall spectrum is a smoothed version of the CMB plus the foregrounds plus noise. Note, crucially, that the smoothing suppresses the interesting high-l wiggles.

I haven’t got space-time enough to go into how the foreground subtraction is carried out, but once it is done it is necessary to “unblur” the maps in order to see the structure at small angular scales, i.e. at large spherical harmonic numbers l. The initial process of convolving the sky pattern with a filter corresponds to multiplying the power-spectrum with a “window function” that decreases sharply at high l, so to deconvolve the spectrum one essentially has to divide by this window function to reinstate the power removed at high harmonics.

This is where it all gets very tricky. The smoothing applied is very close to the scale of the acoustic peaks so you have to do it very carefully to avoid introducing artificial structure in Cl or obliterating structure that you want to see. Moreover, a small error in the beam gets blown up in the deconvolution so one can go badly wrong in recovering the final spectrum. In other words, you need to know the beam very well to have any chance of getting close to the right answer!
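Here is a toy numerical version of that point, using Gaussian beams only (nothing like the real WMAP profile) and a hypothetical 3% error in the beam width. Dividing the observed spectrum by the wrong window function multiplies the recovered Cl by a factor that departs from unity ever faster as l increases:

```python
import numpy as np

def beam_window(ell, fwhm_arcmin):
    """Gaussian beam window function b_l = exp(-l(l+1) sigma^2 / 2)."""
    sigma = np.radians(fwhm_arcmin / 60.0) / np.sqrt(8.0 * np.log(2.0))
    return np.exp(-0.5 * ell * (ell + 1) * sigma**2)

ell = np.arange(2, 1001)
true_fwhm = 12.6                # W-band core width, arcmin
wrong_fwhm = true_fwhm * 1.03   # a hypothetical 3% beam-width error

# Observed C_l = b_true^2 * C_true; deconvolving with the wrong beam
# multiplies the recovered spectrum by (b_true / b_wrong)^2, an error
# that grows exponentially with l(l+1).
error = (beam_window(ell, true_fwhm) / beam_window(ell, wrong_fwhm))**2
print(f"multiplicative error at l=200: {error[ell == 200][0]:.4f}")
print(f"multiplicative error at l=800: {error[ell == 800][0]:.4f}")
```

The numbers themselves mean nothing; the monotonic growth of the error with l is the point, and a non-Gaussian beam with broad tails makes matters worse still.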

The next picture gives a rough model, assuming for illustration only that the beam is Gaussian, of the error in the “recovered” spectrum produced by even a small error in the beam profile. It shows how sensitive the shape of the deconvolved spectrum is to such errors.

Incidentally, the ratty blue line shows the spectrum obtained from a small patch of the sky rather than the whole sky. We were interested to see how much the spectrum varied across the sky, so we broke it up into square patches about the same size as those analysed by the Boomerang experiment. This turns out to be a pretty good way of getting the acoustic peak position but, as you can see, you lose information at low l (i.e. on scales larger than the patch).

The WMAP beam isn’t actually Gaussian – it differs quite markedly in its tails, which means that there’s even more cross-talk between different harmonic modes than in this example – but I hope you get the basic point. As Sawangwit & Shanks say, you need to know the beam very well to get the right fluctuation spectrum out. Move the acoustic peak around only slightly and all bets are off about the cosmological parameters and, perhaps, the evidence for dark energy and dark matter. Lung-Yih looked at the way the WMAP team had done it and concluded that if their published beam shape was right then they had done a good job and there’s nothing substantially wrong with the results shown in the first graph.

Sawangwit & Shanks suggest the beam isn’t right so the recovered angular spectrum is suspect. I’ll need to look a bit more at the evidence they consider before commenting on that, although if anyone else has worked through it I’d be happy to hear from them through the comments box!

Skepsis

Posted in Politics, The Universe and Stuff on May 1, 2010 by telescoper

This past week was the final week of proper teaching at Cardiff University, so I’ve done my last full lectures, tutorials and exercise classes of the academic year. Yesterday I assessed a bunch of 3rd-year project talks, and soon those students will be handing in their written reports for marking. Next week will be a revision week, and shortly after that the examinations begin. And so the cycle of academic life continues, in a curious parallel to the football league season – the other routine that provides me with important markers for the passage of the year.

Anyway, this week I gave the last lecture to my first-year class on Astrophysical Concepts. This is a beginning-level course that tries to introduce some of the theory behind astronomy, focussing on the role of gravity. I cover orbits in newtonian gravity, gravity and hydrostatic equilibrium in extended bodies, a bit about stellar structure, gravitational collapse, and so on. In the last part I do a bit of cosmology. I decided to end this time with a lecture about dark energy as, according to the standard model, this accounts for about 75% of the energy budget of the Universe. It’s also something we don’t understand very well at all.

To make a point, I usually show the following picture (credit to the High-z supernova search team).

What is plotted is the redshift of each supernova (along the x-axis), which relates to the factor by which the universe has expanded since light set out from it. A redshift of 0.5 means the universe was compressed by a factor of 1.5 in all dimensions at the time when that particular supernova went bang. The y-axis shows the really hard bit to get right: the estimated distance (in terms of distance modulus) of the supernovae. In effect, this is a measure of how faint the sources are. The theoretical curves show the faintness expected of a standard source observed at a given redshift in various cosmological models. The bottom panel shows these plotted with a reference curve taken out so the trend is easier to see.

The argument from these data is that the high-redshift supernovae are fainter than one would expect in models without dark energy (represented by \Omega_{\Lambda} in the diagram). If this is true then it means the luminosity distance of these sources is greater than it would be in a decelerating universe. They can be accounted for, however, if the universe’s expansion rate has been accelerating since light set out from the supernovae. In the bog-standard cosmological models we all like to work with, acceleration requires that \rho + 3p/c^2 be negative. The “vacuum” equation of state p=-\rho c^2 provides a simple way of achieving this, but there are many other forms of energy that could do it also, and we don’t know which one is present or why…
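The faintness argument can be sketched numerically. Everything below is illustrative: flat models only, H0 = 70 km/s/Mpc assumed, and the density parameters are round numbers rather than fits.

```python
import numpy as np

H0_KM_S_MPC = 70.0                     # assumed Hubble constant
D_H = 299792.458 / H0_KM_S_MPC         # Hubble distance c/H0 in Mpc

def distance_modulus(z, omega_m, omega_lambda):
    """m - M for a standard candle at redshift z in a flat model
    (omega_m + omega_lambda = 1), via the luminosity distance."""
    zs = np.linspace(0.0, z, 2001)
    integrand = 1.0 / np.sqrt(omega_m * (1 + zs)**3 + omega_lambda)
    # comoving distance by trapezoidal integration of c dz / H(z)
    d_c = D_H * float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(zs)))
    d_l = (1 + z) * d_c                # luminosity distance, Mpc
    return 5.0 * np.log10(d_l * 1.0e5) # 1 Mpc = 1e5 * 10 pc

z = 0.5
mu_accel = distance_modulus(z, 0.3, 0.7)  # with dark energy
mu_decel = distance_modulus(z, 1.0, 0.0)  # matter only, decelerating
print(f"extra faintness at z=0.5: {mu_accel - mu_decel:.2f} mag")

# Acceleration needs rho + 3p/c^2 < 0; with the vacuum equation of
# state (w = -1) the deceleration parameter goes negative:
q0 = 0.3 / 2 - 0.7
print(f"q0 = {q0:.2f}")
```

The few tenths of a magnitude separating the two curves at z ~ 0.5 is exactly the size of the effect being teased out of the supernova data, which is why the error bars matter so much.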

This plot contains the principal evidence that has led to most cosmologists accepting that the Universe is accelerating.  However, when I show it to first-year undergraduates (or even to members of the public at popular talks), they tend to stare in disbelief. The errors are huge, they say, and there are so  few data points. It just doesn’t look all that convincing. Moreover, there are other possible explanations. Maybe supernovae were different beasties back when the universe was young. Maybe something has absorbed their light making them look fainter rather than being further away. Maybe we’ve got the cosmological models wrong.

The reason I show this diagram is precisely because it isn’t superficially convincing. When they see it, students probably form the opinion that all cosmologists are gullible idiots. I’m actually pleased by that.  In fact, it’s the responsibility of scientists to be skeptical about new discoveries. However, it’s not good enough just to say “it’s not convincing so I think it’s rubbish”. What you have to do is test it, combine it with other evidence, seek alternative explanations and test those. In short you subject it to rigorous scrutiny and debate. It’s called the scientific method.

Some of my colleagues express doubts about me talking about dark energy in first-year lectures when the students haven’t learned general relativity. But I stick to my guns. Too many people think science has to be taught as great stacks of received wisdom, of theories that are unquestionably “right”. Frontier sciences such as cosmology give us the chance to demonstrate the process by which we find out about the answers to big questions, not by believing everything we’re told but by questioning it.

My attitude to dark energy is that, given our limited understanding of the constituents of the universe and the laws of matter, it’s the best explanation we have of what’s going on. There is corroborating evidence of missing energy, from the cosmic microwave background and measurements of galaxy clustering, so it does have explanatory power. I’d say it was quite reasonable to believe in dark energy on the basis of what we know (or think we know) about the Universe.  In other words, as a good Bayesian, I’d say it was the most probable explanation. However, just because it’s the best explanation we have now doesn’t mean it’s a fact. It’s a credible hypothesis that deserves further work, but I wouldn’t bet much against it turning out to be wrong when we learn more.

I have to say that too many cosmologists seem to accept the reality of dark energy  with the unquestioning fervour of a religious zealot.  Influential gurus have turned the dark energy business into an industrial-sized bandwagon that sometimes makes it difficult, especially for younger scientists, to develop independent theories. On the other hand, it is clearly a question of fundamental importance to physics, so I’m not arguing that such projects should be axed. I just wish the culture of skepticism ran a little deeper.

Another context in which the word “skeptic” crops up frequently nowadays is  in connection with climate change although it has come to mean “denier” rather than “doubter”. I’m not an expert on climate change, so I’m not going to pretend that I understand all the details. However, there is an interesting point to be made in comparing climate change with cosmology. To make the point, here’s another figure.

There’s obviously a lot of noise and it’s only the relatively few points at the far right that show a clear increase (just as in the first Figure, in fact). However, looking at the graph I’d say that, assuming the historical data points are accurate,  it looks very convincing that the global mean temperature is rising with alarming rapidity. Modelling the Earth’s climate is very difficult and we have to leave it to the experts to assess the effects of human activity on this curve. There is a strong consensus from scientific experts, as monitored by the Intergovernmental Panel on Climate Change, that it is “very likely” that the increasing temperatures are due to increased atmospheric concentrations of greenhouse gas emissions.

There is, of course, a bandwagon effect going on in the field of climatology, just as there is in cosmology. This tends to stifle debate, make things difficult for dissenting views to be heard and evaluated rationally,  and generally hinders the proper progress of science. It also leads to accusations of – and no doubt temptations leading to – fiddling of the data to fit the prevailing paradigm. In both fields, though, the general consensus has been established by an honest and rational evaluation of data and theory.

I would say that any scientist worthy of the name should be skeptical about the human-based interpretation of these data and that, as in cosmology (or any scientific discipline), alternative theories should be developed and additional measurements made. However, the situation in climatology is very different from that in cosmology in one important respect. The Universe will still be here in 100 years’ time. We might not.

The big issue relating to climate change is not just whether we understand what’s going on in the Earth’s atmosphere, it’s the risk to our civilisation of not doing anything about it. This is a great example where the probability of being right isn’t the sole factor in making a decision. Sure, there’s a chance that humans aren’t responsible for global warming. But if we carry on as we are for decades until we prove conclusively that we are, then it will be too late. The penalty for being wrong will be unbearable. On the other hand, if we tackle climate change by adopting greener technologies, burning fewer fossil fuels, wasting less energy and so on, these changes may cost us a bit of money in the short term but frankly we’ll be better off anyway whether we did it for the right reasons or not. Of course those whose personal livelihoods depend on the status quo are the ones who challenge the scientific consensus most vociferously. They would, wouldn’t they? Moreover, as Andy Lawrence pointed out on his blog recently, the oil is going to run out soon anyway…

This is a good example of a decision that can be made on the basis of a judgement of the probability of being right. In that respect, the issue of how likely it is that the scientists are correct on this one is almost irrelevant. Even if you’re a complete disbeliever in science you should know how to respond to this issue, following the logic of Blaise Pascal. He argued that there’s no rational argument for the existence or non-existence of God but that the consequences of not believing if God does exist (eternal damnation) were much worse than those of behaving as if you believe in God when he doesn’t exist. For “God” read “climate change” and let Pascal’s wager be your guide….
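Pascal’s logic can be put as a toy expected-loss calculation. Every number below is invented purely for illustration; the point is only that when one outcome is catastrophic, the action with the smaller expected loss stays the same for almost any probability you care to assign:

```python
# Hypothetical, purely illustrative losses (arbitrary units)
LOSS = {
    ("act",    True):  1.0,    # mitigation costs; disaster averted
    ("act",    False): 1.0,    # mitigation costs "wasted" (greener anyway)
    ("ignore", True):  100.0,  # catastrophic climate change
    ("ignore", False): 0.0,
}

def expected_loss(action, p_warming_real):
    """Expected loss of an action given the probability the warming is real."""
    p = p_warming_real
    return p * LOSS[(action, True)] + (1 - p) * LOSS[(action, False)]

# With these made-up numbers, acting beats ignoring for any p > 1/100
for p in (0.9, 0.5, 0.05):
    print(p, expected_loss("act", p), expected_loss("ignore", p))
```

With this (deliberately lopsided) loss table the crossover probability is 1/100, so the decision is insensitive to exactly how confident you are in the science.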

Dark Horizons

Posted in Cosmic Anomalies, The Universe and Stuff on March 21, 2010 by telescoper

Last Tuesday night I gave a public lecture as part of  Cardiff University’s contribution to National Science and Engineering Week. I had an audience of about a hundred people, although more than half were students from the School of Physics & Astronomy rather than members of the public. I’d had a very full day already by the time it began (at 7pm) and I don’t mind admitting I was pretty exhausted even before I started the talk. I’m offering that as an excuse for struggling to get going, although I think I got better as I got into it. Anyway, I trotted out the usual stuff about the  Cosmic Web and it seemed to go down fairly well, although I don’t know about that because I wasn’t really paying attention.

At the end of the lecture, as usual, there was a bit of time for questions and no shortage of hands went up. One referred to something called Dark Flow which, I’ve just noticed, has actually got its own wikipedia page. It was also the subject of a recent Horizon documentary on BBC called Is Everything we Know about the Universe Wrong? I have to say I thought the programme was truly terrible, but that’s par for the course for Horizon these days I’m afraid. It used to be quite an interesting and informative series, but now it’s full of pointless special effects, portentous and sensationalising narration, and is repetitive to the point of torture. In this case, it also portrayed a very distorted view of its subject matter.

The Dark Flow is indeed quite interesting, but of all the things that might threaten the foundations of the Big Bang theory this is definitely not it. I certainly have never lost any sleep worrying about it. If it’s real and not just the result of a systematic error in the data – and that’s a very big “if” – then the worst it would do would be to tell us that the Universe was a bit more complicated than our standard model. The same is true of the other cosmic anomalies I discuss from time to time on here.  

But we know our standard model leaves many questions unanswered and, as a matter of fact, many questions unasked. The fact that Nature may present us with a few surprises doesn’t mean the whole framework is wrong. It could be wrong, of course. In fact I’d be very surprised if our standard view of cosmology survives the next few decades without major revision. A healthy dose of skepticism is good for cosmology. To some extent, therefore, it’s good to have oddities like the Dark Flow out in the open.

However, that shouldn’t divert our attention from the fact that the Big Bang model isn’t just an arbitrary hypothesis with no justification. It’s the result of almost a century of  vigorous interplay between theory and observation, using an old-fashioned thing called the scientific method. That’s probably too dull for the producers of  Horizon, who would rather portray it as a kind of battle of wills between individuals competing for the title of next Einstein.

Anyway, just to emphasize the fact that I think questioning the Big Bang model is a good thing to do, here is a list of fundamental questions that should trouble modern cosmologists. We do not have answers to any of them.

Is General Relativity right?

Virtually everything in the standard model depends on the validity of Einstein’s general theory of relativity (or theory of general relativity…). In a sense we already know that the answer to this question is “no”.

At sufficiently high energies (near the Planck scale) we expect classical relativity to be replaced by a quantum theory of gravity. For this reason, a great deal of interest is being directed at cosmological models inspired by superstring theory. These models require the existence of extra dimensions beyond the four we are used to dealing with. This is not in itself a new idea, as it dates back to the work of Kaluza and Klein in the 1920s, but in older versions of the idea the extra dimensions were assumed to be wrapped up so small as to be invisible. In “braneworld models”, the extra dimensions can be large but we are confined to a four-dimensional subset of them (a “brane”). In one version of this idea, dubbed the Ekpyrotic Universe, the origin of our observable universe lies in the collision between two branes in a higher-dimensional “bulk”. Other models are less dramatic, but do result in the modification of the Friedmann equations at early times.

It is not just in the early Universe that departures from general relativity are possible. In fact there are many different alternative theories on the market. Some are based on modifications of Newton’s gravitational mechanics, such as MOND; others on modifications of Einstein’s theory, such as the Brans-Dicke theory; and still others involve extra dimensions, such as braneworld theory.

There remain very few independent tests of the validity of Einstein’s theory, particularly in the limit of strong gravitational fields. There is very little independent evidence that the curvature of space-time on cosmological scales is related to the energy density of matter. The chain of reasoning leading to the cosmic concordance model depends entirely on this assumption. Throw it away and we have very little to go on.

What is the Dark Energy?

In the standard cosmology, about 75% of the energy density of the Universe is in a form we do not understand. Because we’re in the dark about it, we call it Dark Energy. The question here is twofold. One part is whether the dark energy is of the form of an evolving scalar field, such as quintessence, or whether it really is constant as in Einstein’s original version. This may be answered by planned observational studies, but both of these are at the mercy of funding decisions. The second part is whether dark energy can be understood in terms of fundamental theory, i.e. in understanding why “empty space” contains this vacuum energy. I think it is safe to say we are still very far from knowing how vacuum energy on a cosmological scale arises from fundamental physics. It’s just a free parameter.

 

What is the Dark Matter?

Around 25% of the mass in the Universe is thought to be in the form of dark matter, but we don’t know what form it takes. We do have some information about this, because the nature of the dark matter determines how it tends to clump together under the action of gravity. Current understanding of how galaxies form, by condensing out of the primordial explosion, suggests the dark matter particles should be relatively massive. This means that they should move relatively slowly and can consequently be described as “cold”. As far as gravity is concerned, one cold particle is much the same as another so there is no prospect for learning about the nature of cold dark matter (CDM) particles through astronomical means unless they decay into radiation or some other identifiable particles. Experimental attempts to detect the dark matter directly are pushing back the limits of technology, but it would have to be a long shot for them to succeed when we have so little idea of what we are looking for.

Did Inflation really happen?

The success of concordance cosmology is largely founded on the appearance of “Doppler peaks” in the fluctuation spectrum of the cosmic microwave background (CMB). These arise from acoustic oscillations in the primordial plasma that have statistical properties consistent with their origin as quantum fluctuations in the scalar field driving a short-lived period of rapid expansion called inflation. This is strong circumstantial evidence in favour of inflation, but perhaps not strong enough to obtain a conviction. The smoking gun for inflation is probably the existence of a stochastic gravitational wave background. The identification and extraction of this may be possible using future polarisation-sensitive CMB studies even before direct experimental probes of sufficient sensitivity become available. As far as I am concerned, the jury will be out for a considerable time.

Despite these gaps and uncertainties, the ability of the standard framework to account for such a diversity of challenging phenomena provides strong motivation for assigning it a higher probability than its competitors. Part of this is that no other theory has been developed to the point where we know what predictions it can make. Some of the alternative ideas I discussed above are new, and consequently we do not really understand them well enough to know what they say about observable situations. Others have adjustable parameters so one tends to disfavour them on grounds of Ockham’s razor unless and until some observation is made that can’t be explained in the standard framework.

Alternative ideas should always be explored. The business of cosmology, however, is not only in theory creation but also in theory testing. The great virtue of the standard model is that it allows us to make precise predictions about the behaviour of the Universe and plan observations that can test these predictions. One needs a working hypothesis to target the multi-million-pound investment that is needed to carry out such programmes. By assuming this model we can make rational decisions about how to proceed. Without it we would be wasting taxpayers’ money on futile experiments that have very little chance of improving our understanding. Reasoned belief in a plausible working hypothesis is essential to the advancement of our knowledge.

 Cosmologists may appear a bit crazy (especially when they appear on TV), but there is method in their madness. Sometimes.

Neophlogistonianism

Posted in The Universe and Stuff on May 18, 2009 by telescoper

What happens when something burns?

Ask a seventeenth century scientist that question and the chances are the answer would  have involved the word phlogiston, a name derived from the Greek  φλογιστόν, meaning “burning up”. This “fiery principle” or “element” was supposed to be present in all combustible materials and the idea was that it was released into air whenever any such stuff was ignited. The act of burning separated the phlogiston from the dephlogisticated “true” form of the material, also known as calx.

The phlogiston theory held sway until  the late 18th Century, when Antoine Lavoisier demonstrated that combustion results in an increase in weight of the material being burned. This poses a serious problem if burning also involves the loss of phlogiston unless phlogiston has negative weight. However, many serious scientists of the 18th Century, such as Georg Ernst Stahl, had already suggested that phlogiston might have negative weight or, as he put it, “levity”. Nowadays we would probably say “anti-gravity”.

Eventually, Joseph Priestley discovered what actually combines with materials during combustion: oxygen. Instead of becoming dephlogisticated, things become oxidised by fixing oxygen from air, which is why their weight increases. It’s worth mentioning, though, that the name Priestley used for oxygen was in fact “dephlogisticated air” (because it was capable of combining more extensively with phlogiston than ordinary air). He remained a phlogistonian long after making the discovery that should have killed the theory.

So why am I rambling on about a scientific theory that has been defunct for more than two centuries?

Well,  it’s because there just might be a lesson from history about the state of modern cosmology…

The standard cosmological model involves the hypothesis that about 75% of the energy budget of the Universe is in the form of “dark energy”. We don’t know much about what this is, except that in order to make our current understanding work out it has to act like a source of anti-gravity. It does this by violating the strong energy condition of general relativity.

Dark energy is needed to reconcile three basic measurements: (i) the brightness of distant supernovae, which seems to indicate the Universe is accelerating (which is where the anti-gravity comes in); (ii) the cosmic microwave background, which suggests the Universe has flat spatial sections; and (iii) direct estimates of the mass associated with galaxy clusters, which account for only about 25% of the mass needed to close the Universe.

A universe without dark energy appears not to be able to account for these three observations simultaneously within our current understanding of gravity as obtained from Einstein’s theory of general relativity.
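The bookkeeping behind the three measurements is just this, in round illustrative numbers rather than precise fits:

```python
# (ii) CMB: flat spatial sections require the total density parameter
omega_total = 1.0
# (iii) cluster mass estimates supply only about a quarter of that
omega_matter = 0.25
# the gap is assigned to dark energy
omega_lambda = omega_total - omega_matter
# (i) with that much dark energy the deceleration parameter
#     q0 = omega_matter/2 - omega_lambda comes out negative, i.e.
#     acceleration, which is what the supernova faintness indicates
q0 = omega_matter / 2 - omega_lambda
print(omega_lambda, q0)
```

Drop the dark-energy term and you can balance any two of the three books, but not all of them at once, at least not with Einstein gravity.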

I’ve blogged before, with some levity of my own, about how uncomfortable this dark energy makes me feel. It makes me even more uncomfortable that such an enormous  industry has grown up around it and that its existence is accepted unquestioningly by so many modern cosmologists.

Isn’t there a chance that, with the benefit of hindsight, future generations will look back on dark energy in the same way that we now see the phlogiston theory?

Or maybe the dark energy really is phlogiston. That’s got to be worth a paper! At least I prefer the name to quintessence.

**** Energy

Posted in Poetry, The Universe and Stuff on March 30, 2009 by telescoper

The phrase expletive deleted was made popular at the time of Watergate after the release of the expurgated tapes made by Richard Nixon in the Oval Office when he was President of the United States of America. These showed that, as well as being a complete crook, he was practically unable to speak a single sentence without including a swear word.

Nowadays the word expletive is generally taken to mean an oath or exclamation, particularly if it is obscene, but that’s not quite what it really means. Derived from the Latin verb explere (“to fill out”), whose past participle is expletus, the word in the context of English grammar means “something added to a phrase or sentence that isn’t strictly needed for the grammatical sense”. An expletive is added either to fill a syntactical role or, in a poem, simply to make a line fit some metrical rule.

Examples of the former can be found in constructions like “It takes two to Tango” or “There is lots of crime in Nottingham”; neither  “it” nor “there” should really be needed but English likes to have something before the verb.

The second kind of use is illustrated wonderfully by Alexander Pope in his Essay on Criticism, which is a kind of guide to what to avoid in writing poetry. It’s a tour de force for its perceptiveness and humour. The following excerpt is pricelessly apt

These equal syllables alone require,
Tho’ oft the ear the open vowels tire;
While expletives their feeble aid do join;
And ten low words oft creep in one dull line

Here the expletive is “do”,  and it is cleverly incorporated in the line talking about expletives, adding  the syllable needed to fit with a strict pentameter. Apparently, poets often used this construction before Pope attacked it but it quickly fell from favour afterwards.

His other prosodic targets are the “open vowels” which means initial vowels that produce an ugly glottal sound, such as in “oft” (especially ugly when following “Tho”). The last line is brilliant too, showing how using only monosyllabic “low” words makes for a line that plods along tediously just like it says.

It’s amazing how much Pope managed to fit into this poem, given the restrictions imposed by the closed couplet structure he adopted. Each idea is compressed into a unit of twenty syllables, two lines of ten syllables with a rhyme at the end of each. This is such an impressive exercise in word-play that it reminds me a lot of the skill showed by the best cryptic crossword setters. Needless to say I’m no more successful at writing poetry than I am at setting crossword clues.

After my talk in Dublin last Friday, somebody in the audience asked me what I thought about Dark Energy. There’s some discussion in the comments after my post on that too.

The Dark Energy is an ingredient added to the standard model of cosmology to reconcile  observations of a flat Universe with a matter density that seems too low to account for it.

Other than that it makes the  cosmological metric work out satisfactorily (geddit?), we don’t understand what Dark Energy means and would rather it wasn’t there.  Most people think the resulting model is inelegant or even ugly.

In other words, it’s an expletive…