Archive for the Science Politics Category

Should Open Access Include Open Software?

Posted in Open Access, Science Politics, The Universe and Stuff on February 4, 2013 by telescoper

Very busy today, so just time for a quick post (and associated poll) about Open Science.

As you all know I’ve been using this blog for a while to bang on about Open Access to scientific publications. I’m not going to repeat my position in detail here except to say that I’m in favour of Open Access but not at the immense cost envisaged by the Finch Report.

I thought, however, that it might be useful to float some opinions about wider issues related to open science. In particular, the question that often troubles me is whether open access to scientific results is actually enough, or whether we have to go a lot further.

I think an important aspect of the way science works is that when a given individual or group publishes a result, it should be possible for others to reproduce it (or not as the case may be). Traditional journal publications don’t always allow this. In my own field of astrophysics/cosmology, for example, results in scientific papers are often based on very complicated analyses of large data sets. This is increasingly the case in other fields too. A basic problem obviously arises when data are not made public. Fortunately in astrophysics these days researchers are pretty good at sharing their data, although this hasn’t always been the case.

However, even allowing open access to data doesn’t always solve the reproducibility problem. Often extensive numerical codes are needed to process the measurements and extract meaningful output. Without access to these pipeline codes it is impossible for a third party to check the path from input to output without writing their own version (assuming there is sufficient information to do that in the first place). That researchers should publish their software as well as their results is quite a controversial suggestion, but I think it’s best practice for science. There isn’t a uniform policy in astrophysics and cosmology, but I sense that quite a few people out there agree with me. Cosmological numerical simulations, for example, can be performed by anyone with a sufficiently big computer using GADGET, the source code of which is freely available. Likewise, for CMB analysis, there is the excellent CAMB code, which can be downloaded at will; this is in a long tradition of openly available numerical codes, including CMBFAST and HEALPix.

I suspect some researchers might be reluctant to share the codes they have written because they feel they won’t get sufficient credit for work done using them. I don’t think this is true, as researchers are generally very appreciative of such openness and publications describing the corresponding codes are generously cited. In any case I don’t think it’s appropriate to withhold such programs from the wider community, which prevents them being either scrutinized or extended as well as being used to further scientific research. In other words excessively proprietorial attitudes to data analysis software are detrimental to the spirit of open science.

Anyway, my views aren’t guaranteed to be representative of the community, so I’d like to ask for a quick show of hands via a poll…

…and you are of course welcome to comment via the usual box.

Critical Masses

Posted in Education, Science Politics on January 26, 2013 by telescoper

One of the interesting bits of news floating around academia at the moment is the announcement that my current employer (until the end of next week), Cardiff University, is to join forces with the Universities of Bath, Exeter and Bristol in an alliance intended to create a ‘critical mass of knowledge’ and help Cardiff ‘better compete for more research income’ (apparently by pretending to be in England rather than in Wales). How successful this will be – or even what form this alliance will take – remains to be seen.

There’s been a lot of gossip about what inspired this move, but it’s not the first attempt to create a collaborative bloc of this kind. Last year five universities from the Midlands announced plans to do something similar. The “M5” group of Birmingham, Leicester, Loughborough, Nottingham and Warwick got together primarily to share infrastructure in order to help them win grants, which is probably what also lies behind the Cardiff-Bath-Exeter-Bristol deal.

Of course there are also a myriad of alliances at the level of individual Schools and Departments. I’ll shortly be joining the University of Sussex, which is a major player in SEPnet – the South East Physics Network – which was set up with help from HEFCE. There are other such networks in England, as well as SUPA in Scotland, funded by the devolved Scottish Funding Council. Attempts to form a similar arrangement for physics in Wales were given short shrift by the Welsh funding agency, HEFCW. The inability or unwillingness of HEFCW to engage properly with research in Wales is no doubt behind Cardiff’s decision to seek alliances with English universities, but I wonder how it will translate into funding. Surely HEFCE wouldn’t be allowed to fund a Welsh university, so presumably this is aimed more at funding from the research councils or further afield, perhaps in Europe. Or perhaps the idea is that if GW4 can persuade HEFCE to fund Bath, Bristol and Exeter, HEFCW will be shamed into stumping up something for Cardiff? Sneaky.

Anyway, good luck to the new “GW4” alliance. Although I’m moving to pastures new I’ll certainly keep an eye on any developments, and hope that they’re positive. The only thing that really disturbs me is that the name “Great Western Four” is apparently inspired by the Great Western Railway, now run by an outfit called First Great Western. My recent experiences of travelling on that have left a lot to be desired and I’m sure the name will have negative connotations in the minds of many who are fed up of their unreliable, overcrowded, overpriced and poorly managed services. They say a rose by any other name would smell as sweet, but so far this is only a name – and one with a distinctly questionable odour.

REF moves the goalposts (again)

Posted in Bad Statistics, Education, Science Politics on January 18, 2013 by telescoper

The topic of the dreaded 2014 Research Excellence Framework came up quite a few times in quite a few different contexts over the last few days, which reminded me that I should comment on a news item that appeared a week or so ago.

As you may or may not be aware, the REF is meant to assess the excellence of university departments in various disciplines and distribute its “QR” research funding accordingly. Institutions complete submissions, which include details of relevant publications and so on, and then a panel sits in judgement. I’ve already blogged about all this: the panels clearly won’t have time to read every paper submitted in any detail at all, so the outcome is likely to be highly subjective. Moreover, HEFCE’s insane policy of awarding the bulk of its research funds only to the very highest grade (4* – “world-leading”) means that small variations in judged quality will turn into enormous discrepancies in the level of research funding. The whole thing is madness, but there seems no way to inject sanity into the process as the deadline for submissions remorselessly approaches.

Now another wrinkle has appeared on the already furrowed brows of those preparing REF submissions. The system allows departments to select staff to be entered; it’s not necessary for everyone to go in. Indeed, if only the very best researchers are entered then the typical score for the department will be high, so it will appear higher up in the league tables, and since the cash goes primarily to the top dogs this might produce almost as much money as including a few less highly rated researchers.

On the other hand, this is a slightly dangerous strategy because it presupposes that one can predict which researchers and what research will be awarded the highest grade. A department will come a cropper if all its high fliers are deemed by the REF panels to be turkeys.

In Wales there’s something that makes this whole system even more absurd, which is that it’s almost certain that there will be no QR funding at all. Welsh universities are spending millions preparing for the REF despite the fact that they’ll get no money even if they do stunningly well. The incentive in Wales is therefore even stronger than it is in England to submit only the high-fliers, as it’s only the position in the league tables that will count.

The problem with a department adopting a very selective strategy is that it could have a very negative effect on the career development of younger researchers if they are not included in their department’s REF submission. There is also the risk that people who manage to convince their Head of School that they are bound to get four stars in the REF may not have the same success with the various grey eminences who make the decision that really matters.

Previous incarnations of the REF (namely the Research Assessment Exercises of 2008 and 2001) did not publish explicit information about exactly how many eligible staff were omitted from the submissions, largely because departments were extremely creative in finding ways of hiding staff they didn’t want to include.

Now, however, it appears there are plans for the Higher Education Statistics Agency (HESA) to publish its own figures on how many staff it thinks are eligible for inclusion in each department. I’m not sure how accurate these figures will be, but they will change the game, in that they will allow compilers of league tables to draw up lists of the departments that prefer playing games to just allowing the REF panels to judge the quality of their research.

I wonder how many universities are hastily revising their submission plans in the light of this new twist?

Science Propaganda

Posted in Science Politics, The Universe and Stuff on January 2, 2013 by telescoper

I thought I’d do a quick rehash of an old post which is vaguely relevant to the still simmering controversy generated by the Cox-Ince editorial I blogged about before Christmas.

The legitimate interface between science and society has many levels to it. One aspect is the simple need to explain what science tells us about the world in order that people can play an informed part in our increasingly technological society. Another is that there needs to be encouragement for (especially young) people to study science seriously and to make it their career, in order to maintain the supply of scientists for the future. And then there is the issue of the wider cultural implications of science: its impact on other belief-systems (such as religions), on other forms of endeavour (such as art and literature), and even on government.

I think virtually all scientists would agree with the need for engagement in at least the first two of these. In fact, I’m sure most scientists would love to have the chance to explain their work to a lay audience, but not all subjects are as accessible or inspirational as, say, astronomy. Unfortunately also, not all scientists are very good at this sort of thing. Some might even be counter-productive if inflicted on the public in this way. So it seems relatively natural that some people have had more success at this activity than others, and have thus become identified as “science communicators”. Although some scientists are a bit snobby about those who write popular books and give popular talks, most of us agree that this kind of work is vital for both science and society.

Vital, yes, but there are dangers. The number of scientists involved in this sort of work is probably more limited than it should be owing to the laziness of the popular media, who generally can’t be bothered to look outside London and the South-East for friendly scientists. The broadsheet newspapers employ very few qualified specialists among their staff even on the science pages so it’s a battle to get meaningful scientific content into print in the mass media. Much that does appear is slavishly regurgitated from one of the press agencies who are kept well fed by the public relations experts employed by research laboratories and other science institutes.

These factors mean that what comes out in the media can be a distorted representation of the real scientific process. Heads of research groups and laboratories are engaged in the increasingly difficult business of securing enough money to continue their work in these uncertain financial times. Producing lots of glossy press releases seems to be one way of raising the profile and gaining the attention of funding bodies. Most scientists do this with care, but sometimes the results are ludicrously exaggerated or simply wrong. Some of the claims circulating around the time the Large Hadron Collider was switched on definitely fell into one or more of those categories. I realise that there’s a difficult balance to be struck between simplicity and accuracy, and that errors can result from over-enthusiasm rather than anything more sinister, but even so we should tread carefully if we want the public to engage with what science really is.

The Cox-Ince editorial is refreshingly clear about the limitations of science:

Science is a framework with only one absolute: all opinions, theories and “laws” are open to revision in the face of evidence. It should not be seen or presented, therefore, as a body of inviolate knowledge against which policy should be judged; the effect of this would be to replace one priesthood with another. Rather, science is a process, a series of structures that allow us, in as unbiased a way as possible, to test our assertions against Nature.

However, there is still far too much science reporting that portrays as facts  ideas and theories which have little or no evidence to support them. This isn’t science communication, it’s science propaganda and I think too many scientists go along with it. There’s a difficult balance to be struck, between engaging the public with inspirational but superficial TV programmes and explaining the intellectual struggles that science really involves.  Give the public the latter without any of the former and they’ll surely switch off!

Most worrying is the perceived need to demonstrate black-and-white certainty over issues which are considerably more complicated than that. This is another situation where science popularisation becomes science propaganda. I’m not sure whether the public actually wants its scientists to make pronouncements as if they were infallible oracles, but the media definitely do. Scientists sometimes become cast in the role of priests, which is dangerous, especially when a result is later shown to be false. Then the public don’t just lose faith with one particular scientist, but with the whole of science.

Science is not about certainty; it is a method for dealing rationally with uncertainty. It is a pragmatic system primarily intended for making testable inferences about the world using measurable, quantitative data. Scientists look their most arrogant and dogmatic when they try to push science beyond the (relatively limited) boundaries of its applicability and to ride roughshod over alternative ways of dealing with wider issues including, yes, religion.

I don’t have any religious beliefs that anyone other than me would recognize as such. I am also a scientist. But I don’t see any reason why being a scientist or not being a scientist should have any implications for my (lack of) religious faith. God (whatever that means) is, by construction, orthogonal to science. I’m not at all opposed to scientists talking about their religion or their atheism in the public domain. I don’t see why their opinions are of any more interest than anyone else’s in these matters, but I’m quite happy to hear them voiced.

This brings us to the question, often raised by hardline atheists, as to whether more scientists  should follow Richard Dawkins’ lead and be champions of atheism in the public domain. As a matter of fact, I agree with some of Dawkins’ agenda, such as his argument for the separation of church and state, although I don’t feel his heavy-handed use of the vitriol in The God Delusion achieved anything particularly positive (except for his bank balance, perhaps). But I don’t think it’s right to assume that all scientists should follow his example. Their beliefs are their business. I don’t think we will be much better off if we simply replace one set of priests with another. In this respect I wholeheartedly agree with Peter Higgs who has recently described Dawkins as “embarrassing”.

So there you have my plea for both public and scientists to accept that science will never have all the answers. There will always be “aspects of human experience that, even in an age of astonishing scientific advance, remain beyond the reach of scientific explanation”.

Can I have the Templeton Prize now please?

The Cox-Ince affair rumbles on..

Posted in Science Politics with tags , , on January 1, 2013 by telescoper

The Cox-Ince controversy rumbles on, apparently…

Open Parachute

Popular science presenters like Brian Cox are sometimes criticised by colleagues suffering from a bit of professional jealousy – although it’s a lot better than in the old days. I think most scientists today recognise the need for good science communication with the public – who, after all, are financing our science through the taxation system.

Brian Cox and his mate Robin Ince wrote a recent New Statesman editorial promoting a better understanding of the nature of science and its role in public decision-making (see Politicians must not elevate mere opinion over science). It made some good points – but upset some people. The jealousy this time seems to come from a few historians and sociologists – and not scientists themselves.

I think their criticism reveals an unfortunate attitude towards the scientific process, or indeed a misunderstanding of that process. Nevertheless, the debate does reveal some aspects of the…


Science and Politics

Posted in Politics, Science Politics on December 22, 2012 by telescoper

It’s a dark dreary December day with a downright deluge descending outside to add to the alliteration.  Fortunately, it being almost Christmas, this weekend is offering a glut of crosswords with which I’ve been occupying myself while waiting for a break in the rain.

Among the puzzles I’ve done was a moderately challenging one in the New Statesman. I have a subscription to the New Statesman, which means that I get it delivered in the post approximately two days after everyone else has had a chance to read it. After finishing the crossword, which contained a number of hidden (unclued) famous pseudonyms, I had a look at the rest of the magazine and discovered that this issue, the Christmas one, was edited by Brian Cox (who needs no introduction) and Robin Ince (who I believe is a comedian of some sort). It’s nice to see science featured so strongly in a political magazine, of course, but I did raise an eyebrow when I read this (about the LHC) in a piece written by Professor Cox:

The machine itself is 27 kilometres in circumference and is constructed from 9,300 superconducting electromagnets operating at -271.3°C. There is no known place in the universe that cold outside laboratories on earth…

Not so. The cryogenic systems on ESA’s Planck mission achieved a stable operating temperature at the 0.1 K level. That experiment has now reached the end of its lifetime and is warming up, but the Herschel Space Observatory, with a temperature of 1.4 K, is still cooler than the Large Hadron Collider. Moreover, there are natural phenomena involving very low temperatures. The Boomerang Nebula has a measured temperature of −272.15°C, also lower than the LHC. How does this system manage to cool itself below the temperature of the cosmic microwave background, I hear you ask? A detailed model is presented here; it’s “supercooled” because it is expanding so quickly compared to the rate at which it absorbs CMB photons.
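For anyone who wants to see the comparison at a glance, here is a tiny, purely illustrative sketch ranking the temperatures quoted above (converted to kelvin, with 0°C = 273.15 K); the labels and figures are just the ones mentioned in the text:

```python
# Temperatures mentioned above, converted to kelvin (T[K] = T[°C] + 273.15).
temps_K = {
    "LHC magnets (-271.3 °C)": -271.3 + 273.15,          # ~1.9 K
    "Planck instrument": 0.1,                             # stable at the 0.1 K level
    "Herschel instrument": 1.4,
    "Boomerang Nebula (-272.15 °C)": -272.15 + 273.15,    # ~1.0 K, a natural object
    "Cosmic microwave background": 2.725,
}

# Print coldest first: the LHC is impressively cold, but not a record-holder.
for name, T in sorted(temps_K.items(), key=lambda kv: kv[1]):
    print(f"{T:5.2f} K  {name}")
```

Both Planck and the Boomerang Nebula come out below the LHC’s operating temperature, which is the whole point of the quibble.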

Anyway, if this all seems a bit pedantic then I suppose it is, but if prominent science advocates can’t be bothered to check their facts on things they claim to be authorities about, one wonders why the public should pay them any attention in the broader sphere. Fame and influence bring with them difficult responsibilities.

That brings me to another piece in the same issue, this one co-authored by Cox and Ince, about science and society, entitled Politicians must not elevate mere opinion over science. I’d realised that there was a bit of a Twitter storm brewing about this item, but had to wait until the horse and cart arrived with my snail mail copy before I could try to figure out what it was about. I still haven’t quite managed to, because although it’s not a particularly focussed piece it doesn’t seem to say anything all that controversial. In fact it struck me as a bit self-contradictory, on the one hand arguing that politicians should understand science better and on the other calling for a separation of science and politics. There are two more detailed rejoinders here and here.

For my part I’ll just say that I think it is neither possible nor desirable to separate science from politics.  That’s because, whether we like it or not, we need them both. Science may help us understand the world around us, and (to a greater or lesser degree of reliability) predict its behaviour, but it does not make decisions for us. Cox and Ince argue that

Science is the framework within which we reach conclusions about the natural world. These conclusions are always preliminary, always open to revision, but they are the best we can do.

I’d put it differently, in terms of probabilities and evidence rather than “conclusions”, but I basically agree. The problem is that at some point we have to make decisions which may not depend solely on the interpretation of evidence but on a host of other factors that science can say nothing about. Definite choices have to be made, even when the evidence is ambiguous. In other words we have to bring closure, much as we do when a jury delivers a verdict in a court of law, which is something that science on its own can rarely do. Mere opinion certainly counts in that context, and so it should. The point is that science is done by people, not machines. People decide what questions to ask, and what assumptions to proceed from. Choices of starting point are political (in the widest sense of the word), and sometimes what you get out of a scientific investigation is little more than what you put in.

It’s always going to be a problem in a democratic society that scientific knowledge is confined to a relatively small number of experts. We can do our best to educate as many as possible about what we do, but we’re always going to struggle to explain ourselves adequately. There will always be conspiracy theories and crackpots of various kinds. The way to proceed is not to retreat into a bunker and say “Trust me, I’m a scientist” but to be more open about the doubts and uncertainties and to present a more realistic picture of the strengths and limitations of science. That means engaging with public debate, not by preaching the gospel of science as if it held all the answers, but by acknowledging that science is a people thing and that as such it belongs in politics as much as politics belongs in it.

Society Counts, and so do Astronomers!

Posted in Bad Statistics, Science Politics on December 6, 2012 by telescoper

The other day I received an email from the British Academy (for Humanities and Social Sciences) announcing a new position statement on what they call Quantitative Skills.  The complete text of this statement, which is entitled Society Counts and which is well worth reading,  is now  available on the British Academy website.

Here’s an excerpt from the letter accompanying the document:

The UK has a serious deficit in quantitative skills in the social sciences and humanities, according to a statement issued today (18 October 2012) by the British Academy. This deficit threatens the overall competitiveness of the UK’s economy, the effectiveness of public policy-making, and the UK’s status as a world leader in research and higher education.

The statement, Society Counts, raises particular concerns about the impact of this skills deficit on the employability of young people. It also points to serious consequences for society generally. Quantitative skills enable people to understand what is happening to poverty, crime, the global recession, or simply when making decisions about personal investment or pensions.

Citing a recent survey of MPs by the Royal Statistical Society’s getstats campaign – in which only 17% of Conservative and 30% of Labour MPs thought politicians use official statistics and figures accurately when talking about their policies – Professor Sir Adam Roberts, President of the British Academy, said: “Complex statistical and analytical work on large and complex data now underpins much of the UK’s research, political and business worlds. Without the right skills to analyse this data properly, government professionals, politicians, businesses and most of all the public are vulnerable to misinterpretation and wrong decision-making.”

The statement clearly identifies a major problem, not just in the Humanities and Social Sciences but throughout academia and wider society. I even think the British Academy might be a little harsh on its own constituency because, with a few notable exceptions, statistics and other quantitative data analysis methods are taught very poorly to science students too. Just the other day I was talking to an undergraduate student, who is thinking about doing a PhD in physics, about what that’s likely to entail. I told him that the one thing he could be pretty sure he’d have to cope with is analysing data statistically. Like most physics departments, however, we don’t run any modules on statistical techniques, and only the bare minimum is involved in the laboratory sessions. Why? I think it’s because there are too few staff who would be able to teach such material competently (because they don’t really understand it themselves).

Here’s a paragraph from the British Academy statement:

There is also a dearth of academic staff able to teach quantitative methods in ways that are relevant and exciting to students in the social sciences and humanities. As few as one in ten university social science lecturers have the skills necessary to teach a basic quantitative methods course, according to the report. Insufficient curriculum time is devoted to methodology in many degree programmes.

Change “social sciences and humanities” to “physics” and I think that statement would still be correct. In fact I think “one in ten” would be an overestimate.

The point is that although  physics is an example of a quantitative discipline, that doesn’t mean that the training in undergraduate programmes is adequate for the task. The upshot is that there is actually a great deal of dodgy statistical analysis going on across a huge number of disciplines.

So what is to be done? I think the British Academy identifies only part of the required solution. Of course better training in basic numeracy at school level is needed, but it shouldn’t stop there. I think there also needs to be a wider exchange of knowledge and ideas across disciplines, and a greater involvement of expert consultants. I think this is more likely to succeed than getting more social scientists to run standard statistical analysis packages. In my experience, most bogus statistical analyses do not result from using the method wrong, but from using the wrong method…
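To make the “wrong method” point concrete, here is a toy example of my own (not from the British Academy statement): Pearson’s correlation coefficient assumes a linear relationship, so running it on data that are perfectly monotonic but strongly nonlinear understates an association that a rank-based (Spearman) coefficient captures exactly. The package runs fine either way; only one answer reflects the structure in the data.

```python
import math

# A perfectly monotonic but nonlinear relationship: y = exp(x).
xs = [float(i) for i in range(1, 21)]
ys = [math.exp(x) for x in xs]

def pearson(a, b):
    """Pearson product-moment correlation: measures *linear* association."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = math.sqrt(sum((u - ma) ** 2 for u in a))
    sb = math.sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (sa * sb)

def spearman(a, b):
    """Spearman correlation: Pearson applied to ranks, so it measures
    *monotonic* association (values here are assumed distinct)."""
    def ranks(s):
        order = sorted(s)
        return [float(order.index(v) + 1) for v in s]
    return pearson(ranks(a), ranks(b))

print(f"Pearson:  {pearson(xs, ys):.3f}")   # well below 1: linearity assumed
print(f"Spearman: {spearman(xs, ys):.3f}")  # 1.000: perfect monotonic link
```

The linear coefficient suggests a rather weak relationship; the rank coefficient correctly reports a perfect one. Neither calculation is “done wrong”, but one of them is the wrong method for the question being asked.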

A great deal of astronomical research is based on inferences drawn from large and often complex data sets, so astronomy is a discipline with a fairly enlightened attitude to statistical data analysis. Indeed, many important contributions to the development of statistics were made by astronomers. In the future I think we’ll  see many more of the astronomers working on big data engage with the wider academic community by developing collaborations or acting as consultants in various ways.

We astronomers are always being challenged to find applications of our work outside the purely academic sphere, and this is one that could be developed much further than it has been so far. It disappoints me that we always seem to think of this exclusively in terms of technological spin-offs, while the importance of transferable expertise is often neglected. Whether you’re a social scientist or a physicist, if you’ve got problems analysing your data, why not ask an astronomer?

Urgent Announcement from the AGP

Posted in Science Politics on December 1, 2012 by telescoper

As the festive season approaches, the UK government has decided to make immediate changes to the procedures to be followed for the allocation and distribution of yuletide gifts. In previous years, such awards have been made directly by the agency involved, e.g. proposals within the STFC remit have been directly Sent To Father Christmas, often in hand-written format. However, to cut costs (sorry, to improve the quality of service), it has been decided to extend the operations of the Shared Services Centre to cover such applications, which will henceforth be administered by a Shared Santa Claus (SSC), after being uploaded to the JES system (in Word 95 format only). They will then be sent to relevant experts for peer review, i.e. the Advent Gift Panel (AGP).

In preparing submissions, Applicants should note the following  important revisions to AGP guidelines.

Proposals must include:

  1. The aims and scope of the presents requested and any interrelation between them, where appropriate.
  2. The areas in which the Applicants have a proven track-record in the general area of not being naughty, including (where appropriate) highlights of particularly good behaviour within the last three years.
  3. The support  already provided to the Applicants with particular emphasis on recent investments that are relevant to the gifts requested.
  4. How the Applicants will be advanced as a result of the proposed present.
  5. How the  requested present  fits within the international context, i.e. is it of comparable quality to the best gifts available overseas?
  6. The likely impact of the present (e.g. when thrown around the living room).
  7. How you expect the present to evolve over the next three years, e.g. is it likely to break or need repair?
  8. The level of resources needed to supply the present.
  9. How the gift will contribute to the UK economy over the next thirty years.

The following supplementary rules also apply:

  1. Consumables will be allocated using a formula based on the number of FTE awarded, to include (per FTE): one Bernard Matthews Turkey Twizzler, three sprouts, 2 potatoes (including one roast if the case justifies such extravagance), and one small carrot/parsnip. Gravy is expected to be provided from local resources.
  2. Christmas puddings and/or mince pies are covered by a different  programme (overseen by the Hefty Pudding Committee, HPC)  and will require a separate application; a Cheese Board may also be convened if there is sufficient demand.
  3. Requests for crackers are welcomed, as long as the proposal is not entirely crackers.
  4. Travel expenses will be limited to the cost of one sleigh ride (weather permitting).
  5. Batteries will not be included.
  6. Under no circumstances will funding be allocated for the purchase of paperweights.
  7. Each proposal must be accompanied by a Knowledge Exchange case, explaining the impact of the proposal outside the STFC remit.
  8. Each proposal must be accompanied by an Outreach case outlining any public activities, such as carol singing.

The deadline for applications is Friday 14th December 2012. In line with normal shambolically inefficient SSC practice, awards are expected to be made sometime in April (2014).

I hope this clarifies the situation.

Diamond Lights

Posted in Football, Music, Science Politics with tags , , , on November 27, 2012 by telescoper

Apparently there’s been a posh do this evening at the Royal Society to celebrate the 10th Anniversary of the Diamond Light Source. In fact the Diamond Light Source has its own anniversary blog that’s been posting celebratory things for a while; the actual anniversary being celebrated is that of the signing of the agreement to set up the Diamond Light Source, which happened on March 27th 2002. Actual operations didn’t commence until 2007, after construction costing a total of £260m. That was also the year STFC was created and told to pick up the tab for running the facility, which, together with a few other things, precipitated a financial crisis from which UK particle physics and astronomy are only just starting to recover.

I don’t want to be churlish about the good science the Diamond Light Source is undoubtedly doing, so I thought I’d mark the anniversary here. The blog I mentioned above has a video page, but it sadly doesn’t contain the video I most expected to see. This, Diamond Lights, was released – or did it escape? – in 1987 and it “stars” Glenn Hoddle and Chris Waddle who, as singers, were both excellent footballers. I’m surprised STFC Chief Executive John Womersley didn’t record a cover version of this as part of the anniversary celebrations…

The Tremors from L’Aquila

Posted in Bad Statistics, Open Access, Science Politics with tags , , , on October 23, 2012 by telescoper

I can’t resist a comment on news which broke yesterday that an Italian court has found six scientists and a former government official guilty of manslaughter in connection with the L’Aquila Earthquake of 2009. Scientific colleagues of mine are shocked by their conviction and by the severity of the sentences (six years’ imprisonment), the assumption being that they were convicted for having failed to predict the earthquake. However, as Nature News pointed out long before the trial when the scientists were indicted:

The view from L’Aquila, however, is quite different. Prosecutors and the families of victims alike say that the trial has nothing to do with the ability to predict earthquakes, and everything to do with the failure of government-appointed scientists serving on an advisory panel to adequately evaluate, and then communicate, the potential risk to the local population. The charges, detailed in a 224-page document filed by Picuti, allege that members of the National Commission for Forecasting and Predicting Great Risks, who held a special meeting in L’Aquila the week before the earthquake, provided “incomplete, imprecise, and contradictory information” to a public that had been unnerved by months of persistent, low-level tremors. Picuti says that the commission was more interested in pacifying the local population than in giving clear advice about earthquake preparedness.

“I’m not crazy,” Picuti says. “I know they can’t predict earthquakes. The basis of the charges is not that they didn’t predict the earthquake. As functionaries of the state, they had certain duties imposed by law: to evaluate and characterize the risks that were present in L’Aquila.” Part of that risk assessment, he says, should have included the density of the urban population and the known fragility of many ancient buildings in the city centre. “They were obligated to evaluate the degree of risk given all these factors,” he says, “and they did not.”

Many of my colleagues have interpreted the conviction of these scientists as an attack on science, but the above statement actually looks to me more like a demand that the scientists involved should have been more scientific. By that I mean not giving a simple “yes” or “no” answer (which in this case was “no”) but giving a proper scientific analysis of the probabilities involved. This comment goes straight to two issues that I feel very strongly about. One is the vital importance of probabilistic reasoning – in this case in connection with a risk assessment – and the other is the need for openness in science.
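By way of illustration, here is a toy sketch of how a probabilistic risk assessment differs from a bald yes/no answer. The numbers are entirely invented for the purposes of the example – they are not real seismological estimates – but the form of the calculation, a Bayesian update of the probability of a major earthquake in the light of an observed tremor swarm, is the kind of quantified statement I have in mind:

```python
# Toy illustration of probabilistic risk assessment.
# All numbers below are hypothetical, chosen purely for illustration.

# Prior probability of a major earthquake in the coming week,
# absent any unusual seismic activity:
p_quake = 0.001

# Likelihoods: how probable is an observed swarm of small tremors
# under each hypothesis? (Both values are assumed, not measured.)
p_swarm_given_quake = 0.5      # swarms often precede major quakes
p_swarm_given_no_quake = 0.02  # but swarms also occur harmlessly

# Bayes' theorem: update the probability given the observed swarm.
posterior = (p_swarm_given_quake * p_quake) / (
    p_swarm_given_quake * p_quake
    + p_swarm_given_no_quake * (1 - p_quake)
)

print(f"P(major quake | tremor swarm) = {posterior:.3f}")  # roughly 0.024
```

The invented numbers give a posterior of roughly 0.024: still small in absolute terms, but some twenty-odd times larger than the prior. That is a statement about risk that a civil authority can weigh and act upon; “no, there won’t be an earthquake” is not.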

I thought I’d take this opportunity to repeat the reasons I think statistics and statistical reasoning are so important. Of course they are important in science. In fact, I think they lie at the very core of the scientific method, although I am still surprised how few practising scientists are comfortable even with statistical language. A more important problem is the popular impression that science is about facts and absolute truths. It isn’t. It’s a process. In order to advance, it has to question itself.

Statistical reasoning also applies outside science to many facets of everyday life, including business, commerce, transport, the media, and politics. Science and technology are deeply embedded in every aspect of what we do each day. Science has given us greater levels of comfort, better health care, and a plethora of labour-saving devices. It has also given us unprecedented ability to destroy the environment and each other, whether through accident or design. Probability even plays a role in personal relationships, though mostly at a subconscious level.

Civilized societies face severe challenges in this century. We must confront the threat of climate change and forthcoming energy crises. We must find better ways of resolving conflicts peacefully lest nuclear or conventional weapons lead us to global catastrophe. We must stop large-scale pollution or systematic destruction of the biosphere that nurtures us. And we must do all of these things without abandoning the many positive things that science has brought us. Abandoning science and rationality by retreating into religious or political fundamentalism would be a catastrophe for humanity.

Unfortunately, recent decades have seen a wholesale breakdown of trust between scientists and the public at large; the conviction of the scientists in the L’Aquila case is just one example. This breakdown is due partly to the deliberate abuse of science for immoral purposes, and partly to the sheer carelessness with which various agencies have exploited scientific discoveries without proper evaluation of the risks involved. The abuse of statistical arguments has undoubtedly contributed to the suspicion with which many individuals view science.

There is an increasing alienation between scientists and the general public. Many fewer students enrol for courses in physics and chemistry than a few decades ago. Fewer graduates mean fewer qualified science teachers in schools. This is a vicious cycle that threatens our future. It must be broken.

The danger is that the decreasing level of understanding of science in society means that knowledge (as well as its consequent power) becomes concentrated in the minds of a few individuals. This could have dire consequences for the future of our democracy. Even as things stand now, very few Members of Parliament are scientifically literate. How can we expect to control the application of science when the necessary understanding rests with an unelected “priesthood” that is hardly understood by, or represented in, our democratic institutions?

Very few journalists or television producers know enough about science to report sensibly on the latest discoveries or controversies. As a result, important matters that the public needs to know about do not appear at all in the media, or if they do it is in such a garbled fashion that they do more harm than good.

Years ago I used to listen to radio interviews with scientists on the Today programme on BBC Radio 4. I even did such an interview once. It is a deeply frustrating experience. The scientist usually starts by explaining what the discovery is about in the way a scientist should, with careful statements of what is assumed, how the data is interpreted, what other interpretations might be possible, and what the likely sources of error are. The interviewer then loses patience and asks for a yes or no answer. The scientist tries to continue, but is badgered. Either the interview ends as a row, or the scientist ends up stating a grossly oversimplified version of the story.

Some scientists offer the oversimplified version at the outset, of course, and these are the ones that contribute to the image of scientists as priests. Such individuals often believe in their theories in exactly the same way that some people believe religiously. Not with the conditional and possibly temporary belief that characterizes the scientific method, but with the unquestioning fervour of an unthinking zealot. This approach may pay off for the individual in the short term, in popular esteem and media recognition – but when it goes wrong it is science as a whole that suffers. When a result that has been proclaimed certain is later shown to be false, the result is widespread disillusionment. And the more secretive the behaviour of the scientific community, the less reason the public has to trust its pronouncements.

I don’t have any easy answers to the question of how to cure this malaise, but I do have a few suggestions. It would be easy for a scientist such as myself to blame everything on the media and the education system, but in fact I think the responsibility lies mainly with ourselves. We are usually so obsessed with our own research, and with the need to publish specialist papers by the lorry-load in order to advance our own careers, that we spend very little time explaining what we do to the public or why we do it.

I think every working scientist in the country should be required to spend at least 10% of their time working in schools or with the general media on “outreach”, including writing blogs like this. People in my field – astronomers and cosmologists – do this quite a lot, but these are areas where the public has some empathy with what we do. If only biologists, chemists, nuclear physicists and the rest were viewed in such a friendly light. Doing this sort of thing is not easy, especially when it comes to saying something on the radio that the interviewer does not want to hear. Media training for scientists has been a welcome recent innovation for some branches of science, but most of my colleagues have never had any help at all in this direction.

The second thing that must be done is to improve the dire state of science education in schools. Over the last two decades the national curriculum for British schools has been dumbed down to the point of absurdity. Pupils that leave school at 18 having taken “Advanced Level” physics do so with no useful knowledge of physics at all, even if they have obtained the highest grade. I do not at all blame the students for this; they can only do what they are asked to do. It’s all the fault of the educationalists, who have for a long time done their best to convince our young people that science is too hard for them. Science can be difficult, of course, and not everyone will be able to make a career out of it. But that doesn’t mean that it should not be taught properly to those that can take it in. If some students find it is not for them, then so be it. I always wanted to be a musician, but never had the talent for it.

The third thing that has to be done is for scientists to be far more open. Publicly-funded scientists have a duty not only to publish their conclusions in such a way that the public can access them freely, but also to publish their data, their methodology and the intermediate steps. Most members of the public will struggle to make sense of the information, but at least they will be able to see that nothing is being deliberately concealed.

Everyone knows that earthquake prediction is practically impossible to do accurately. The danger of the judgement in the L’Aquila Earthquake trial (apart from discouraging scientists from ever becoming seismologists) is that the alarm will be sounded every time there is the smallest tremor. The potential for panic is enormous. But the science in this field, as in any other, does not actually tell one how to act on evidence of risk, merely how to assess it. It’s up to others to decide whether and when to act, i.e. when the threshold of danger has been crossed. There is no scientific answer to the question “how risky is too risky?”.

So instead of bland reassurances or needless panic-mongering, the scientific community should refrain from public statements about what will happen and what won’t and instead busy itself with the collection, analysis and interpretation of data and publish its studies as openly as possible. The public will find it very difficult to handle this information overload, but so they should. Difficult questions don’t have simple answers. Scientists aren’t priests.