Archive for Planck

Mud Wrestling and Microwaves

Posted in The Universe and Stuff on January 13, 2011 by telescoper

Reading through an interesting blog post about the new results from Planck by the ever-reliable Jonathan Amos (the BBC’s very own “spaceman”), I was reminded of a comment I heard made by Martin Rees (now Lord Rees) many years ago.

The remark concerned the difference between cosmology and astrophysics. Cosmology, said Lord Rees, especially the part of it that concerns the very early Universe, involves abstract mathematical concepts, difficult yet logical reasoning and the ability to see deep things in complicated spatial patterns. In that respect it’s rather like chess. Astrophysics, on the other hand, which is not at all elegant and has so many messy complications that it is sometimes difficult even to work out what is going on or what the rules are, is more like mud wrestling.

The following image, which I borrowed from Jonathan Amos’ piece, explains why I was reminded of this and why some cosmologists are having to abandon chess for mud wrestling, at least for the time being. The picture shows the nine individual frequency maps (spanning the range from 30 GHz to 857 GHz) obtained by Planck.

What we cosmologists really want to see is a pristine map of the cosmic microwave background, the black-body radiation that pervades the entire Universe. Its black-body form means that it would have the same brightness temperature across all frequencies, and would also be statistically homogeneous (i.e. look roughly the same all across the sky).
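As a quick numerical illustration (mine, not part of the Planck analysis), the black-body spectrum at the CMB temperature can be evaluated directly from the Planck function; here's a sketch in Python:

```python
# Illustration only: the black-body (Planck) spectrum at the CMB
# temperature, evaluated at a few of the satellite's observing
# frequencies. The spectral radiance B_nu peaks near 160 GHz.
import math

H = 6.626e-34    # Planck constant, J s
K_B = 1.381e-23  # Boltzmann constant, J/K
C = 2.998e8      # speed of light, m/s
T_CMB = 2.725    # CMB temperature, K

def planck_bnu(nu_hz, T=T_CMB):
    """Black-body spectral radiance B_nu in W m^-2 Hz^-1 sr^-1."""
    x = H * nu_hz / (K_B * T)
    return 2 * H * nu_hz**3 / C**2 / math.expm1(x)

for nu_ghz in (30, 100, 217, 545, 857):
    print(f"{nu_ghz:4d} GHz: {planck_bnu(nu_ghz * 1e9):.3e}")
```

Anything on the sky whose brightness doesn't follow this single curve across Planck's nine channels is, by definition, foreground.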

What you actually see is a mess. There are strong contributions from the disk of our own Galaxy, some of it extending quite a way above and below the plane of the Milky Way. You can also see complicated residuals produced by the way Planck scans the sky. On top of that there is radiation from individual sources within our Galaxy, other galaxies and even clusters of galaxies (which I mentioned a couple of days ago). These “contaminants” constitute valuable raw material for astronomers of various sorts, but for cosmologists they are an unwanted nuisance. Unfortunately, there is no other way to reach the jewels of the CMB than by hacking through this daunting jungle of foregrounds and instrumental artefacts.

Looking at the picture might induce one of two reactions. One would be to assume that there’s no way all the crud can be removed with sufficient accuracy and precision to do cosmology with what’s left. Another is to appreciate how well cosmologists have done with previous datasets, especially WMAP, to have confidence that they’ll solve the numerous problems associated with the Planck data, and to understand why it will take another two years of high-powered data analysis by a very large number of very bright people to extract cosmological results from Planck.

There might be gold at the end of the pipeline, but until then it’s going to be mud, glorious mud…



First Science from Planck

Posted in The Universe and Stuff on January 11, 2011 by telescoper

It’s been quite a long wait for results to emerge from the Planck satellite, which was launched in May 2009, but today the first science results have at last been released. These aren’t to do with the cosmological aspects of the mission – those will have to wait another two years – but things we cosmologists tend to think of as “foregrounds”, although they are of great astrophysical interest in themselves.

For an overview, with lots of pretty pictures, see the European Space Agency’s Planck site and the UK Planck outreach site; you can also watch this morning’s press briefing in full here.

A repository of all 25 science papers can be found here and there’ll no doubt be a deluge of them on the arXiv tomorrow.

A few of my Cardiff colleagues are currently in Paris living it up at the junket (I mean, working hard at the serious scientific conference) at which these results are being discussed. I, on the other hand, not being one of the in-crowd, am back here in Cardiff, and only have a short window in between meetings, project vivas and postgraduate lectures in which to comment on the new data. I’m also sure there’ll be a huge amount of interest in the professional media and in the blogosphere for some time to come. I’ll therefore just mention a couple of things that struck me immediately as I went quickly through the papers while eating my sandwich; the following was cobbled together from the associated ESA press release.

The first concerns the so-called ‘anomalous microwave emission’ (aka Foreground X), which is a diffuse glow most strongly associated with the dense, dusty regions of our Galaxy. Its origin has been a puzzle for decades, but data collected by Planck seem to confirm the theory that it comes from rapidly spinning dust grains. Identifying the source of this emission will help Planck scientists remove foreground contamination with much greater precision, enabling them to construct much cleaner maps of the cosmic microwave background and thus, among other things, perhaps clarify the nature of the various apparent anomalies present in current cosmological data sets.

Here’s a nice composite image of a region of anomalous emission, alongside individual maps derived from low-frequency radio observations as well as two of the Planck channels (left).

Credits: ESA/Planck Collaboration

The colour composite of the Rho Ophiuchi molecular cloud highlights the correlation between the anomalous microwave emission, most likely due to miniature spinning dust grains observed at 30 GHz (shown here in red), and the thermal dust emission, observed at 857 GHz (shown here in green). The complex structure of knots and filaments, visible in this cloud of gas and dust, represents striking evidence for the ongoing processes of star formation. The composite image (right) is based on three individual maps (left) taken at 0.4 GHz from Haslam et al. (1982) and at 30 GHz and 857 GHz by Planck, respectively. The size of the image is about 5 degrees on a side, which is about 10 times the apparent diameter of the full Moon.

The second of the many other exciting results presented today that I wanted to mention is a release of new data on clusters of galaxies – the largest structures in the Universe, each containing hundreds or even thousands of galaxies. Owing to the Sunyaev-Zel’dovich Effect these show up in the Planck data as compact regions of lower temperature in the cosmic microwave background. By surveying the whole sky, Planck stands the best chance of finding the most massive examples of these clusters. They are rare and their number is a sensitive probe of the kind of Universe we live in, how fast it is expanding, and how much matter it contains.

Credits: ESA/Planck Collaboration; XMM-Newton image: ESA

This image shows one of the newly discovered superclusters of galaxies, PLCK G214.6+37.0, detected by Planck and confirmed by XMM-Newton. This is the first supercluster to be discovered through its Sunyaev-Zel’dovich effect, the name given to the characteristic silhouette a cluster casts against the cosmic microwave background radiation. Combined with other observations, the Sunyaev-Zel’dovich effect allows astronomers to measure properties such as the temperature and density of the cluster’s hot gas, in which the galaxies are embedded. The right panel shows the X-ray image of the supercluster obtained with XMM-Newton, which reveals that three galaxy clusters comprise this supercluster. The bright orange blob in the left panel shows the Sunyaev-Zel’dovich image of the supercluster, obtained by Planck. The X-ray contours are also superimposed on the Planck image.

UPDATES: For other early perspectives on the early release results, see the blogs of Andrew Jaffe and Stuart Lowe; as usual, Jonathan Amos has done a very quick and well-written news piece for the BBC.



Blydhen Nowydh Da!

Posted in Biographical, Education, Music, Politics, Science Politics, Sport on January 1, 2011 by telescoper

I hope the blogosphere hasn’t got too bad a hangover this morning. I don’t, although I did have a nice lie-in until about 11am when the lure of the Guardian prize crossword drew me out of bed and down to the newsagents. Luckily, I remembered to get dressed first. The crossword turned out to be quite a nice one to start the year with, by the perennial Araucaria, but it didn’t take all that long to do so I’ve got time to do a bit of shopping and have a go on my exercise bike. Yes, that’s my New Year’s resolution. More shopping.

I know 2010 was a tough year for many people for many different reasons. I wouldn’t say it’s exactly been brilliant for me either, but I am looking forward to 2011 whatever it might bring. The first results from Planck will be released very soon (on 11th January, in fact), which will give me something exciting to blog about. More generally, the recent financial settlement for STFC was not as poor as many of us expected so the future doesn’t look quite as grim for UK astronomy as we feared.

There are exciting developments in store for the School of Physics & Astronomy at Cardiff University, where I work, with (hopefully) a number of new staff members joining us soon. Later on in the year we’ll be rolling out a completely redesigned set of physics courses which we’ve been working on for over a year. In addition we’ll be starting to work more closely with Swansea University in order to provide a broader range of advanced options for physics students at both institutions.

Of course behind all this there’s still considerable uncertainty about the funding situation for universities which are facing big cuts in government grants and having to increase tuition fees charged to students. Whether and to what extent this will deter students from going to university remains to be seen. The financial pressure will certainly lead to mergers and possibly to closures across the UK over the next few years, although only time will tell how many.

On the cultural side there’s a large number of concerts at St David’s Hall and a full season of Opera at WNO to look forward to, including a performance of Così fan tutte on my birthday. Cardiff plays host to the First Test match between England and Sri Lanka at the end of May, and a one-day international against India in September. I might even get myself a membership of Glamorgan Cricket Club, something I’ve toyed with doing for a couple of years now. There’s also a good chance that Cardiff City F.C. might get themselves promoted to the Premiership, something that would be great for the city of Cardiff. It wouldn’t be beyond them to fall at the last fence, as they have a habit of doing.

May 2011 will also see the Welsh Assembly elections, and there will be a referendum on further law-making powers for the WAG on 3rd March.

On the wider political scene the question is whether the governing coalition’s cuts will force the economy back into recession or not. I don’t know the answer to that, but I do know that many ordinary working people are going to lose their jobs and many less advantaged members of society will have their benefits cut. Meanwhile the people who took us to the brink of economic ruin will no doubt carry on getting their bonuses.

In spite of all that, let me end by wishing you peace and prosperity for the New Year and beyond. And if that’s not possible, just remember Nil Illegitimi Carborundum.

Hot Stuff, Looking Cool..

Posted in The Universe and Stuff on September 15, 2010 by telescoper

It’s nice for a change to have an excuse to write something about science rather than science funding, as a press release appeared today concerning the discovery of a new supercluster by Planck in collaboration with the X-ray observatory XMM-Newton.

The physics behind this new discovery concerns what happens to low-energy photons from the cosmic microwave background (CMB) when they are scattered by extremely hot plasma. Basically, incoming microwave photons collide with highly energetic electrons with the result that they gain energy and so are shifted to shorter wavelengths. The generic name given to this process is inverse Compton scattering, and it can happen in a variety of physical contexts. In cosmology, however, there is a particularly important situation where this process has observable consequences, when CMB photons travel through the extremely hot (but extremely tenuous) ionized gas in a cluster of galaxies. In this setting the process is called the Sunyaev-Zel’dovich effect.

The observational consequence is slightly paradoxical, because what happens is that the microwave background appears to have a lower temperature (at least for a certain range of wavelengths) in the direction of a galaxy cluster (in which the plasma can have a temperature of 10 million degrees or more). This is because fewer photons reach the observer in the microwave part of the spectrum than would if the cluster did not intervene; the missing ones have been kicked up to higher energies and are therefore not seen at their original wavelength, ergo the CMB looks a little cooler along the line of sight to a cluster than in other directions. To put it another way, what has actually happened is that the hot electrons have distorted the spectrum of the photons passing through them.

Here’s an example of the Sunyaev-Zel’dovich effect in action as seen by Planck in seven frequency bands:

At low frequencies (in the Rayleigh-Jeans part of the spectrum) the region where the cluster is looks cooler than average, although at high frequencies the effect is reversed.
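For the quantitatively minded, this frequency dependence is captured (in the non-relativistic limit) by the spectral function g(x) = x coth(x/2) − 4, where x = hν/k<sub>B</sub>T. The following sketch is my own illustration rather than anything from the Planck papers; it shows the low-frequency decrement turning into a high-frequency increment, with a null near 217 GHz:

```python
# Illustration only: the non-relativistic Sunyaev-Zel'dovich spectral
# function g(x) = x*coth(x/2) - 4, with x = h*nu / (k_B * T_CMB).
# g is negative (a temperature decrement) at low frequencies, positive
# (an increment) at high ones, and crosses zero near 217 GHz.
import math

H = 6.626e-34    # Planck constant, J s
K_B = 1.381e-23  # Boltzmann constant, J/K
T_CMB = 2.725    # CMB temperature, K

def sz_g(nu_hz):
    """SZ spectral function g(x) at observing frequency nu_hz."""
    x = H * nu_hz / (K_B * T_CMB)
    return x / math.tanh(x / 2) - 4  # x*coth(x/2) - 4

for nu_ghz in (30, 100, 217, 353, 545):
    print(f"{nu_ghz:4d} GHz: g = {sz_g(nu_ghz * 1e9):+.2f}")
```

Planck’s frequency channels happen to straddle the 217 GHz null nicely, which is part of why it is such a good cluster-finding machine.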

The magnitude of the temperature distortion produced by a cluster depends on the density of electrons in the plasma pervading the cluster n, the temperature of the plasma T, and the overall size of the cluster; in fact, it’s proportional to n×T integrated along the line of sight through the cluster.

Why this new result is so interesting is that it combines very sensitive measurements of the microwave background temperature pattern with sensitive measurements of the X-ray emission over the same region of the sky. Plasma hot enough to produce a Sunyaev-Zel’dovich distortion of the CMB spectrum will also generate X-rays through a process known as thermal bremsstrahlung. The power of the X-ray emission depends on the square of the electron density n² multiplied by the temperature T.

Since the Sunyaev-Zel’dovich and X-ray measurements depend on different mathematical combinations of the physical properties involved, the amalgamation of these two techniques allows astronomers to probe the internal details of the cluster quite precisely.
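As a toy illustration of that point, using the simplified proportionalities quoted above (and setting all constants of proportionality to one), two observables and two unknowns mean the density and temperature can be separated algebraically:

```python
# Illustration only, using the simplified scalings quoted in the text:
# SZ signal y ~ n*T and X-ray power x ~ n**2 * T, with all constants of
# proportionality set to 1. Two observables, two unknowns.

def gas_properties(y_sz, x_xray):
    """Invert y = n*T and x = n**2 * T for the cluster gas n and T."""
    n = x_xray / y_sz        # (n**2 * T) / (n*T) = n
    T = y_sz**2 / x_xray     # (n*T)**2 / (n**2 * T) = T
    return n, T

# Made-up numbers: n = 2, T = 5 give y = 10 and x = 20...
n, T = gas_properties(10.0, 20.0)
print(n, T)  # ...and the inversion recovers them: 2.0 5.0
```

A real analysis is far more involved (the scalings are integrals over the cluster, and the X-ray emissivity has its own temperature dependence), but the principle of breaking the degeneracy by combining the two probes is the same.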

The example shown here in the top two panels is of a familiar cluster – the Coma Cluster as mapped by Planck (in microwaves) and by an older X-ray satellite called ROSAT (in X-rays). The two distributions have very similar morphology, strongly suggesting that they have a common origin in the cluster plasma.

The bottom panels show comparisons with the distribution of galaxies as seen in the optical part of the spectrum. You can see that the hot gas I’ve been talking about extends throughout the space between the galaxies. In fact, there is at least as much matter in the hot plasma as there is in the individual galaxies in objects like this, but it’s too hot to be seen in optical light. This could reasonably be called dark matter when it comes to its lack of optical emission, but it’s certainly not dark in X-rays!

The reason why the intracluster plasma is so hot boils down to the strength of the gravitational field in the cluster. Roughly speaking, the hot matter is in virial equilibrium within the gravitational potential generated by the mass distribution within the cluster. Since this is a very deep potential well, electrons move very quickly in response to it. In fact, the galaxies in the cluster are also roughly in virial equilibrium so they too are pulled about by the gravitational field. Galaxies don’t sit around quietly in clusters, they buzz about like bees in a bottle.

Anyway, the new data arising from the combination of Planck and XMM-Newton have revealed not just one cluster, but a cluster of clusters (i.e. a “supercluster”):

It’s early days for Planck, of course, and this is no more than a taster.
The Planck team is currently analysing the data from the first all-sky survey to identify both known and new galaxy clusters for the early Sunyaev-Zel’dovich catalogue, which will be released in January of 2011 as part of the Early Release Compact Source Catalogue. The full Sunyaev-Zel’dovich catalogue may well turn out to be the most enduring legacy of the Planck mission.



Cardiff inSPIREs Willetts

Posted in Politics, Science Politics, The Universe and Stuff on July 9, 2010 by telescoper

The Minister for Universities and Science David Willetts’ important speech today at the Royal Institution in London has already attracted a considerable amount of comment and reaction. I haven’t really got time to comment on it in detail, but in between the expected warning of tough times ahead, it does contain a great deal of extremely interesting and thoughtful material, which I recommend you read if you’re interested in science policy.

Of particular interest to us here in the School of Physics & Astronomy at Cardiff University is that we get a specific mention for the wonderful work done by the Astronomical Instrumentation Group on the development of the SPIRE instrument on the Herschel Space Observatory.  Everyone’s chuffed about it, and delighted that the Minister chose to highlight this particular example of excellence.

In my speech at Birmingham University in May, I spoke of links between the academic and the vocational, the conceptual and the physical. We are not always good at this – we have world-class particle physicists at the Large Hadron Collider but sadly not many British engineers helped to build it. But there are other areas where these links between British science and technology are stronger. We not only have distinguished astronomers, but it was scientists and engineers at Cardiff University who produced the Spectral and Photometric Imaging Receiver for Herschel and Planck. This combination of scientific research and technological advance creates extraordinary dynamism, both intellectual and commercial. I see it as one of my tasks to strengthen these links.

OK, so I know SPIRE wasn’t for “Herschel and Planck” but the AIG was involved with instruments for both these missions so the point is well made anyway.

Space: The Final Frontier?

Posted in The Universe and Stuff on July 9, 2010 by telescoper

I found this on my laptop just now. Apparently I wrote it in 2003, but I can’t remember what it was for. Still, when you’ve got a hungry blog to feed, who cares about a little recycling?

It seems to be part of our nature as humans to feel the urge to understand our relationship to the Universe. In ancient times, attempts to cope with the vastness and complexity of the world were usually in terms of myth or legend, but even the most primitive civilizations knew the value of careful observation. Astronomy, the science of the heavens, began with attempts to understand the regular motions of the Sun, planets and stars across the sky. Astronomy also aided the first human explorations of our own Earth, providing accurate clocks and navigation aids. But during this age the heavens remained remote and inaccessible, their nature far from understood, and the idea that they themselves could some day be explored was unthinkable. Difficult frontiers may have been crossed on Earth, but that of space seemed impassable.

The invention of the telescope ushered in a new era of cosmic discovery, during which we learned for the first time precisely how distant the heavenly bodies were and what they were made of.  Galileo saw that Jupiter had moons going around it, just like the Earth. Why, then, should the Earth be thought of as the centre of the Universe? The later discovery, made in the 19th Century using spectroscopy, that the Sun and planets were even made of the same type of material as commonly found on Earth made it entirely reasonable to speculate that there could be other worlds just like our own. Was there any theoretical reason why we might not be able to visit them?

No theoretical reason, perhaps, but certainly practical ones. For a start, there’s the small matter of getting “up there”. Powered flying machines came on the scene about one hundred years ago, but conventional aircraft simply can’t travel fast enough to escape the pull of Earth’s gravity. This problem was eventually solved by adapting technology developed during World War II to produce rockets of increasingly large size and thrusting power. Cold-war rivalry between the USA and the USSR led to the space race of the 1960s culminating in the Apollo missions to the Moon in the late 60s and early 70s. These missions were enormously expensive and have never been repeated, although both NASA and the European Space Agency are currently attempting to gather sufficient funds to (eventually) send manned missions to Mars.

But manned spaceflights have been responsible for only a small fraction of the scientific exploration of space. Robotic probes have been dispatched all over the Solar System. Some have failed, but at a tiny fraction of the cost of manned missions. Landings have been made on the solid surfaces of Venus, Mars and Titan, and probes have flown past the gas giants Jupiter, Saturn, Uranus and Neptune, taking beautiful images of these bizarre frozen worlds.

Space is also a superb vantage point for astronomical observation. Above the Earth’s atmosphere there is no twinkling of star images, so even a relatively small telescope like the Hubble Space Telescope (HST) can resolve details that are blurred when seen from the ground. Telescopes in space can also view the entire sky, which is not possible from a point on the Earth’s surface. From space we can see different kinds of light that do not reach the ground: from gamma rays and X-rays produced by very energetic objects such as black holes, down to the microwave background which bathes the Universe in a faint afterglow of its creation in the Big Bang. Recently the Wilkinson Microwave Anisotropy Probe (WMAP) charted the properties of this cosmic radiation across the entire sky, yielding precise measurements of the size and age of the Universe. Planck and Herschel are pushing back the cosmic frontier as I write, and many more missions are planned for the future.

Over the last decade, the use of dedicated space observatories, such as HST and WMAP, in tandem with conventional terrestrial facilities, has led to a revolution in our understanding of how the Universe works. We are now convinced that the Universe began with a Big Bang, about 14 billion years ago. We know that our galaxy, the Milky Way, is just one of billions of similar objects that condensed out of the cosmic fireball as it expanded and cooled. We know that most galaxies have a black hole in their centre which gobbles up everything falling into it, even light. We know that the Universe contains a great deal of mysterious dark matter and that empty space is filled with a form of dark energy, known in the trade as the cosmological constant. We know that our own star the Sun is a few billion years old and that the planets formed from a disk of dusty debris that accompanied the infant star during its birth. We also know that planets are by no means rare: nearly two hundred exoplanets (that is, planets outside our Solar System) have so far been discovered. Most of these are giants, some even larger than Jupiter which is itself about 300 times more massive than Earth, but this may simply be because big objects are easier to find than small ones.

But there is still a lot we don’t know, especially about the details. The formation of stars and planets is a process so complicated that it makes weather forecasting look simple. We simply have no way of knowing what determines how many stars have solid planets, how many have gas giants, how many have both and how many have neither. In order to support life, a planet must be in an orbit which is neither too close to its parent star (where it would be too hot for life to exist) nor too far away (where it would be too cold). We also know very little about how life evolves from simple molecules or how robust it is to the extreme environments that might be found elsewhere in our Universe. It is safe to say that we have absolutely no idea how common life is within our own Galaxy or the Universe at large.

Within the next century it seems likely that we will know whether there is life elsewhere in our Solar System. We will probably also be able to figure out how many Earth-like exoplanets there are “out there”. But the unimaginable distances between stars in our galaxy make it very unlikely that crude rocket technology will enable us to physically explore anything beyond our own backyard for the foreseeable future.

So will space forever remain the final frontier? Will we ever explore our Galaxy in person, rather than through remote observation? The answer to these questions is that we don’t know for sure, but the laws of nature may have legal loopholes (called “wormholes”) that just might allow us to travel faster than light if we ever figure out how to exploit them. If we can do it then we could travel across our Galaxy in hours rather than aeons. This will require a revolution in our understanding not just of space, but also of time. The scientific advances of the past few years would have been unimaginable only a century ago, so who is to say that it will never happen?

Ten Facts about Space Exploration

  1. The human exploration of space began on October 4th 1957 when the Soviet Union launched Sputnik, the first man-made satellite. The first man in space was also a Russian, Yuri Gagarin, who completed one orbit of the Earth in the Vostok spacecraft in 1961. Apparently he was violently sick during the entire flight.
  2. The first man to set foot on the Moon was Neil Armstrong, on July 20th 1969. As he descended to the lunar surface, he said “That’s one small step for a man, one giant leap for mankind.”
  3. In all, six manned missions landed on the Moon (Apollo 11, 12, 14, 15, 16 and 17; Apollo 13 aborted its landing and returned to Earth after an explosion seriously damaged the spacecraft). The crew of Apollo 17 left the lunar surface on December 14th 1972, since when no human has set foot on the Moon.
  4. The first reusable space vehicle was the Space Shuttle, four of which were originally built. Columbia was the first, launched in 1981, followed by Challenger in 1983, Discovery in 1984 and Atlantis in 1985. Challenger was destroyed by an explosion shortly after launch in 1986, and was replaced by Endeavour. Columbia disintegrated over Texas while attempting to land in 2003.
  5. The Viking 1 and Viking 2 missions landed on the surface of Mars in 1976; they sent back detailed information about the Martian soil. Tests for the presence of life proved inconclusive, but there is strong evidence that Mars once had running water on its surface.
  6. The outer planets (Jupiter, Saturn, Uranus and Neptune) have been studied by numerous fly-by probes, starting with Pioneer 10 (1973) and Pioneer 11 (1974). Voyager 1 and Voyager 2 flew past Jupiter in 1979; Voyager 2 went on to visit Uranus (1986) and Neptune (1989) after receiving gravity assists from close approaches to Jupiter and Saturn. These missions revealed, among other things, that all these planets have ring systems – not just Saturn. More recently, the Huygens probe, released by the Cassini spacecraft in late 2004, descended into the atmosphere of Titan in January 2005. It survived the descent and sent back amazing images of the surface of Saturn’s largest moon.
  7. Sending a vehicle into deep space requires enough energy to escape the gravitational pull of the Earth. This means exceeding the escape velocity of our planet, which is about 11 kilometres per second (nearly 40,000 kilometres per hour). Even travelling at this speed, a spacecraft will take many months to reach Mars, and years to escape the Solar System.
  8. The nearest star to our Sun is Proxima Centauri, about 4.2 light years away. This means that, even travelling at the speed of light (300,000 kilometres per second), which is as fast as anything can go according to known physics, a spacecraft would take over four years to get there. At the Earth’s escape velocity (11 kilometres per second), it would take over a hundred thousand years.
  9. Our Sun orbits within our own galaxy – the Milky Way – at a distance of about 30,000 light years from the centre at a speed of about 200 kilometres per second, taking about 230 million years to go around. The Milky Way contains about a hundred billion stars.
  10. The observable Universe has a radius of about 14 billion light years, and it contains about as many galaxies as there are stars in the Milky Way. If every star in every galaxy has just one planet then there are approximately ten thousand million million million other places where life could exist.
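The figures in facts 7 and 8 are easy to check with a few lines of Python; this is my own back-of-the-envelope sketch using standard values for the constants:

```python
# Illustration only: checking the escape-velocity and travel-time
# figures in facts 7 and 8 above.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of the Earth, kg
R_EARTH = 6.371e6    # mean radius of the Earth, m
C = 2.998e8          # speed of light, m/s
YEAR_S = 365.25 * 24 * 3600  # seconds in a year

# Escape velocity v = sqrt(2GM/R), which comes out near 11.2 km/s
# (nearly 40,000 km/h, as stated above).
v_esc = math.sqrt(2 * G * M_EARTH / R_EARTH)
print(f"escape velocity ~ {v_esc / 1000:.1f} km/s")

# Travel time to Proxima Centauri (~4.24 light years) at that speed:
# over a hundred thousand years, as stated.
d_proxima = 4.24 * C * YEAR_S          # distance in metres
years = d_proxima / v_esc / YEAR_S
print(f"time to Proxima at escape velocity ~ {years:,.0f} years")
```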

The Planck Sky

Posted in The Universe and Stuff on July 5, 2010 by telescoper

Hot from the press today is a release of all-sky images from the European Space Agency’s Planck mission, including about a year’s worth of data. You can find a full set of high-resolution images here at the ESA website, along with a lot of explanatory text, and also here and here. Here’s a low-resolution image showing the galactic dust (blue) and radio (pink) emission concentrated in the plane of the Milky Way but extending above and below it. Only well away from the Galactic plane do you start to see an inkling of the pattern of fluctuations in the Cosmic Microwave Background that the survey is primarily intended to study.

It will take a lot of sustained effort and clever analysis to clean out the foreground contamination from the maps, so the cosmological interpretation will have to wait a while. In fact, the colour scale seems to have been chosen in such a way as to deter people from even trying to analyse the CMB component of the data contained in these images. I’m not sure that will work, however, and it’s probably just a matter of days before some ninny posts a half-baked paper on the arXiv claiming that the standard cosmological model is all wrong and that the Universe is actually the shape of a vuvuzela. (This would require only a small modification of an earlier suggestion.)

These images are of course primarily for PR purposes, but there’s nothing wrong with that. Apart from being beautiful in their own right, they demonstrate that Planck is actually working and that the results it will eventually produce should be well worth waiting for!

Oh, nearly forgot to mention that the excellent Jonathan Amos has written a nice piece about this on the BBC Website too.

Planck and the Cold Galaxy

Posted in The Universe and Stuff on March 17, 2010 by telescoper

Just a quick post to show a cool result from Planck which has just been released by the European Space Agency (ESA). It will be a while before any real cosmological results are available, but in the meantime here are a couple of glimpses into the stuff we cosmologists think of as foreground contamination but which are of course of great interest in themselves to other kinds of astronomers.

The beautiful image above (courtesy of ESA and the HFI Consortium) covers a portion of the sky about 55 degrees across. It is a three-colour combination constructed from Planck’s two shortest wavelength channels (540 and 350 micrometres, corresponding to frequencies of 545 and 857 GHz respectively), and an image at 100 micrometres obtained with the Infrared Astronomical Satellite (IRAS). This combination effectively traces the dust temperature: reddish tones correspond to temperatures as cold as 12 degrees above absolute zero, and whitish tones to significantly warmer ones (a few tens of degrees above absolute zero) in regions where massive stars are currently forming. Overall, the image shows local dust structures within 500 light years of the Sun.
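The connection between colour and temperature here is essentially Wien’s displacement law, λ_max = b/T: an idealized black body at 12 K peaks near 240 micrometres, between the wavelengths combined in this image, while warmer dust peaks closer to the IRAS 100-micrometre band. A quick sketch (my own illustration, ignoring the dust emissivity corrections a real analysis would include):

```python
# Illustration only: Wien's displacement law, which underpins using
# colour as a dust-temperature tracer. An ideal black body at
# temperature T peaks at wavelength lambda_max = b / T.
WIEN_B = 2.898e-3  # Wien displacement constant, m K

def peak_wavelength_um(T_kelvin):
    """Peak wavelength (micrometres) of a black body at T_kelvin."""
    return WIEN_B / T_kelvin * 1e6

for T in (12, 30, 100):
    print(f"T = {T:3d} K -> peak at {peak_wavelength_um(T):.0f} um")
```

So the cold 12 K filaments light up in the long-wavelength Planck channels, while the warmer star-forming regions shift towards the shorter IRAS wavelength, and combining the three maps separates the two.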

Our top man in the HFI Consortium, Professor Peter Ade, is quoted as saying:

..the HFI is living up to our most optimistic pre-flight expectations.  The wealth of the data is seen in these beautiful multicolour images exposing previously unseen detail in the cold dust components of our galaxy.  There is much to be learned from detailed interpretation of the data which will significantly enhance our understanding of the star formation processes and galactic morphology.

This Planck image was obtained during the first Planck all-sky survey, which began in mid-August 2009. By mid-March 2010 more than 98% of the sky had been observed. Because of the way Planck scans the sky, complete coverage for the first survey will not be achieved until late May 2010.

Other new results and a more detailed discussion of this one can be found here and here.

Taken for Granted

Posted in Science Politics with tags , , , , on March 10, 2010 by telescoper

It’s been a couple of weeks since the Astronomy group in the School of Physics & Astronomy at Cardiff University was informed of the result of its recent application to the Science and Technology Facilities Council (STFC) for a continuation of its rolling grant. I haven’t been able to post anything about it because it has led to some difficult personal situations and we didn’t want anyone to hear about it other than face to face from relevant members of the department.

In case you weren’t aware, a rolling grant covers a 5-year period, but a group holding one has to apply for renewal every three years, at which point the programme of research is reviewed by a panel of experts. If this review is positive, a new 5-year grant is awarded and the two years remaining on the old grant are cancelled. In the case of a negative review, however, there are two years’ grace before the funding is terminated, giving the applicants the chance to try again next time.

At least that’s what used to happen.

The previous Cardiff Astronomy roller supported 6 postdoctoral research assistants (PDRAs) as well as providing other funds for travel, equipment, infrastructure and other staff time. This time we requested an increase, primarily in order to enable us to exploit the wonderful data coming from the Herschel observatory. I joined Cardiff after the last review, so I wasn’t included in the existing funding package. However, I did succeed in getting a standard grant in last year’s round, which provides support for a 3-year period. This time, I applied to have this grant subsumed into the rolling programme when it ends in 2012, and requested an extension of the 3-year grant to tide me over until the next rolling grant and bring me into phase with the rest of the group.

That was the idea, anyway. STFC is extremely short of money, so despite what we felt was a strong case for supporting our Herschel work we weren’t particularly optimistic about the outcome, especially since additional cuts to research grants were announced last December. In fact the rolling grant application went in last year, but the process is extremely lengthy. Three of us had to go to Swindon last October to present the case to the grants panel. The panel had apparently completed its work by December, but when the new cuts were announced they had to revisit their decisions. That’s why we were only informed at the end of February of the level of support we would get from April 1st this year.

In fact we received two announcements, one detailing what we would have got had the panel’s original recommendations been followed, then another showing the result of the additional 15% cut decided in December. In the first we were cut from 6 PDRAs to 5, but in the second an additional position was cut leaving us with 4 surviving from the previous grant. Moreover, STFC has basically abandoned the rolling grant concept entirely, and refused us permission to let the previous grant roll out. We had no choice but to accept the new grant, which means that we have insufficient funds from 1st April 2010 to honour contracts already issued to two scientists. Not a pleasant situation to be presented with. We’ve managed to find a way of coping to the extent that nobody will be made redundant in the short-term, but it’s still a time of great uncertainty for those involved.

For my own part, the circumstances are a bit better. The panel did award me an extension of my grant to enable me to merge my research with the rest of the programme by the next review date. They also – unexpectedly, I must admit – gave me a small uplift in my existing funding. I’ll be OK, at least for another 3 years.

Overall, we’re disappointed. The outcome wasn’t as good as we’d hoped but, then again, it wasn’t as bad as we’d feared. Taking into account the standard grant I hold, we’ve gone down from 7 PDRAs to 5. I’ve heard rumours of much more drastic cuts elsewhere, and I’m sure other departments are feeling the pain much more than we are right now. I don’t have a clear picture of what has happened nationally, so I’d be grateful for any information people might be prepared to divulge through the comments box as long as you don’t betray any confidences!

The whole business of securing grant funding can be deeply frustrating, and sometimes the decisions seem bewildering. However, I’ve been on these panels before and I know how hard it is, so I’m never tempted to whinge. In fact, I’m going to be joining the panel again for this round. Not that I’m looking forward to it very much!

However, I can’t resist ending with a comment about the current management of STFC. It really seems quite absurd to be cutting grant funding at precisely the time that Herschel and Planck are starting to deliver huge quantities of exquisite data. I say that as a scientist of course, not a civil servant. However, the prevailing mentality at STFC – instigated by the Treasury – seems to be that the science part of its remit is much less important than the technology and the facilities. Although the Science Minister Lord Drayson recently announced a proposal that purports to fix some of STFC’s difficulties, this seems more than likely to keep grant funding at a miserably low level for the indefinite future. The STFC management’s readiness to rewrite the rules governing rolling grants, cut funding at absurdly short notice, and raid the grant budget in order to solve problems elsewhere has convinced me that there will be no improvement until there are people at the top who recognize that it’s science that matters, that science is done by people, and that the way to manage those people is not to treat them the way they are doing now.

Especially if they want people to provide free advice to their panels…

The Seven Year Itch

Posted in Bad Statistics, Cosmic Anomalies, The Universe and Stuff with tags , , , on January 27, 2010 by telescoper

I was just thinking last night that it’s been a while since I posted anything in the file marked cosmic anomalies, and this morning I woke up to find a blizzard of papers on the arXiv from the Wilkinson Microwave Anisotropy Probe (WMAP) team. These relate to an analysis of the latest data accumulated now over seven years of operation; a full list of the papers is given here.

I haven’t had time to read all of them yet, but I thought it was worth drawing attention to the particular one that relates to the issue of cosmic anomalies. I’ve taken the liberty of including the abstract here:

A simple six-parameter LCDM model provides a successful fit to WMAP data, both when the data are analyzed alone and in combination with other cosmological data. Even so, it is appropriate to search for any hints of deviations from the now standard model of cosmology, which includes inflation, dark energy, dark matter, baryons, and neutrinos. The cosmological community has subjected the WMAP data to extensive and varied analyses. While there is widespread agreement as to the overall success of the six-parameter LCDM model, various “anomalies” have been reported relative to that model. In this paper we examine potential anomalies and present analyses and assessments of their significance. In most cases we find that claimed anomalies depend on posterior selection of some aspect or subset of the data. Compared with sky simulations based on the best fit model, one can select for low probability features of the WMAP data. Low probability features are expected, but it is not usually straightforward to determine whether any particular low probability feature is the result of the a posteriori selection or of non-standard cosmology. We examine in detail the properties of the power spectrum with respect to the LCDM model. We examine several potential or previously claimed anomalies in the sky maps and power spectra, including cold spots, low quadrupole power, quadrupole-octupole alignment, hemispherical or dipole power asymmetry, and quadrupole power asymmetry. We conclude that there is no compelling evidence for deviations from the LCDM model, which is generally an acceptable statistical fit to WMAP and other cosmological data.

Since I’m one of those annoying people who have been sniffing around the WMAP data for signs of departures from the standard model, I thought I’d comment on this issue.

As the abstract says, the LCDM model does indeed provide a good fit to the data, and the fact that it does so with only 6 free parameters is particularly impressive. On the other hand, this modelling process involves the compression of an enormous amount of data into just six numbers. If we always filter everything through the standard model analysis pipeline then it is possible that some vital information about departures from this framework might be lost. My point has always been that every now and again it is worth looking in the wastebasket to see if there’s any evidence that something interesting might have been discarded.

Various potential anomalies – mentioned in the above abstract – have been identified in this way, but usually there has turned out to be less to them than meets the eye. There are two reasons not to get too carried away.

The first reason is that no experiment – not even one as brilliant as WMAP – is entirely free from systematic artefacts. Before we get too excited and start abandoning our standard model for more exotic cosmologies, we need to be absolutely sure that we’re not just seeing residual foregrounds, instrument errors, beam asymmetries or some other effect that isn’t anything to do with cosmology. Because it has performed so well, WMAP has been able to do much more science than was originally envisaged, but every experiment is ultimately limited by its own systematics and WMAP is no different. There is some (circumstantial) evidence that some of the reported anomalies may be at least partly accounted for by glitches of this sort.

The second point relates to basic statistical theory. Generally speaking, an anomaly A (some property of the data) is flagged as such because it is deemed to be improbable given a model M (in this case LCDM). In other words, the conditional probability P(A|M) is a small number. As I’ve repeatedly ranted about in my bad statistics posts, this does not necessarily mean that P(M|A) – the probability of the model being right – is small. If you look at 1000 different properties of the data, you have a good chance of finding something that happens with a probability of 1 in a thousand. This is what the abstract means by a posteriori reasoning: it’s not the same as talking out of your posterior, but it is sometimes close to it.
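The point about a posteriori selection is easy to demonstrate with a toy simulation (nothing to do with the actual WMAP pipeline; all the numbers are made up for illustration). Scan 1000 independent properties of each simulated “sky” and ask how often at least one of them looks like a 1-in-1000 anomaly:

```python
import random

random.seed(1)

# Toy illustration of a posteriori selection: examine many independent
# "properties" of simulated data and flag the most anomalous one.
n_properties = 1000   # number of statistics examined per simulated sky
n_skies = 200         # number of simulated skies

count_extreme = 0
for _ in range(n_skies):
    # Under the true model, each property's p-value is uniform on [0, 1]
    p_values = [random.random() for _ in range(n_properties)]
    if min(p_values) < 1e-3:  # did we find a "1-in-a-thousand anomaly"?
        count_extreme += 1

# Analytically: P(at least one p < 0.001 in 1000 tries) = 1 - 0.999**1000,
# which is about 0.63 -- such "anomalies" turn up in most realisations.
print(f"fraction of simulated skies with a '1-in-1000' anomaly: "
      f"{count_extreme / n_skies:.2f}")
```

So roughly two-thirds of perfectly ordinary simulated skies contain something that, picked out after the fact, looks like a 1-in-1000 event.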

In order to decide how seriously to take an anomaly, you need to work out P(M|A), the probability of the model given the anomaly. This requires that you not only take into account all the other properties of the data that are explained by the model (i.e. those that aren’t anomalous), but also specify an alternative model that explains the anomaly better than the standard model does. If you can do this without introducing too many free parameters, then that may be taken as compelling evidence for the alternative. No such model exists – at least for the time being – so the message of the paper is rightly skeptical.
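This can be made concrete with Bayes’ theorem in odds form: the posterior odds of an alternative model M′ over the standard model M are the prior odds times the ratio of how well each model predicts the anomaly, with the extra tuned parameters of M′ penalised by an Occam factor. The numbers below are purely illustrative, not taken from any real analysis:

```python
# Hedged sketch of why small P(A|M) alone doesn't condemn model M.
# All probabilities here are invented for the purpose of illustration.

p_anomaly_given_M = 1e-3     # anomaly A is a 1-in-1000 event under standard model M
p_anomaly_given_Malt = 0.5   # alternative M' predicts A fairly naturally...
occam_factor = 1e-4          # ...but pays a penalty for its extra tuned parameters
prior_odds = 1.0             # no prior preference between M' and M

# Posterior odds for M' over M (Bayes' theorem in odds form):
posterior_odds = prior_odds * (p_anomaly_given_Malt * occam_factor) / p_anomaly_given_M
print(f"posterior odds M' : M = {posterior_odds:.3f}")
# With these numbers, the standard model is still favoured 20:1
# despite the 1-in-1000 anomaly.
```

The Occam factor is doing the work here: an alternative that can be tuned to fit anything gains little credit for fitting one anomaly.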

So, to summarize, I think what the WMAP team say is basically sensible, although I maintain that rummaging around in the trash is a good thing to do. Models are there to be tested, and surely the best way to test them is to focus on things that look odd rather than simply congratulating oneself about the things that fit? It is extremely impressive that such intense scrutiny over the last seven years has revealed so few oddities, but that just means that we should look even harder.

Before too long, data from Planck will provide an even sterner test of the standard framework. We really do need an independent experiment to see whether there is something out there that WMAP might have missed. But we’ll have to wait a few years for that.

So far it’s WMAP 7 Planck 0, but there’s plenty of time for an upset. Unless they close us all down.