Archive for statistics

Australia: Cyclones go up to Eleven!

Posted in Bad Statistics on October 14, 2013 by telescoper

I saw a story on the web this morning which points out that Australians can expect 11 cyclones this season.

It’s not a very good headline, because it’s a bit misleading about what the word “expected” means. In fact the number eleven is the average number of cyclones, which is not necessarily the number expected, despite the fact that statisticians use “expected value” or “expectation value” as a technical term for the average. If you don’t understand this criticism, ask yourself how many legs you’d expect a randomly-chosen person to have. You’d probably settle on the answer “two”, but that is the most probable number, i.e. the mode, which in this case exceeds the average. If one person in a thousand has only one leg then a group of a thousand people has 1999 legs between them, so the average (or arithmetic mean) is 1.999. Most people therefore have more than the average number of legs…

I’ve always found it quite annoying that physicists use the term “expectation value” to mean “average” because it implies that the average is the value you would expect. In the example given above you wouldn’t expect a person to have the average number of legs – if you assume that the actual number is an integer, it’s actually impossible to find a person with 1.999 legs! In other words, the probability of finding someone in that group with the average number of legs in the group is exactly zero.
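The legs example takes only a couple of lines to check (a toy sketch, with the one-in-a-thousand figure taken from the text):

```python
# Mean vs mode for the "number of legs" distribution described above:
# 1 person in 1000 has one leg, the rest have two.
probs = {1: 0.001, 2: 0.999}  # number of legs -> probability

mean = sum(k * p for k, p in probs.items())  # the "expectation value"
mode = max(probs, key=probs.get)             # the most probable value

print(mean)  # 1.999 -- the average, which nobody actually has
print(mode)  # 2     -- the number you'd actually "expect"
```

The mean isn’t even a possible outcome here, which is exactly the complaint about the word “expected”.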

The same confusion happens when newspapers talk about the “average wage”, which is considerably higher than the wage most people receive: a small number of very high earners pulls the mean well above the median.

In any case the point is that there is undoubtedly a considerable uncertainty in the prediction of eleven cyclones per season, and one would like to have some idea how large an error bar is associated with that value.

Anyway, statistical pedantry notwithstanding, it is indeed impressive that the number of cyclones in a season goes all the way up to eleven…

Physics and Statistics

Posted in Bad Statistics, Education on August 16, 2013 by telescoper

Predictably, yesterday’s newspapers and other media were full of feeble articles about the A-level results, and I don’t just mean the gratuitous pictures of pretty girls opening envelopes and/or jumping in the air. I’ve never met a journalist who understood the concept of statistical significance, which seems to account for the way they feel able to write whatever they like about any numbers that happen to be newsworthy without feeling constrained by mathematical common sense. Sometimes it’s the ridiculous over-interpretation of opinion polls (which usually have a sampling uncertainty of ±3%), sometimes it’s league tables. This time it’s the number of students getting the top grades at A-level.
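Incidentally, that ±3% figure for opinion polls follows from simple binomial sampling. A sketch, assuming a typical poll of about 1,000 respondents (the sample size is my assumption; the text doesn’t give one) and a worst-case reported share near 50%:

```python
import math

# 95% margin of error for an opinion poll, assuming simple random sampling.
N = 1000  # assumed (typical) poll size
p = 0.5   # worst case: uncertainty is largest for a 50% share

std_error = math.sqrt(p * (1 - p) / N)  # binomial standard error on a proportion
margin_95 = 1.96 * std_error            # two-sided 95% margin of error

print(round(100 * margin_95, 1))  # 3.1 (percentage points)
```

So a reported one- or two-point swing between polls of this size really is just noise.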

The BBC, for example, made a lot of fuss about the fall in the percentage of A and A* A-level grades, to 26.3% this year from 26.6% last year. Anyone with a modicum of statistical knowledge would know, however, that whether this drop means anything at all depends on how many results were involved: the sampling fluctuation in a count of size N scales approximately as √N. For a cohort of 300,000 this turns into a percentage uncertainty of about 0.57%, which is about twice as large as the reported fall. The result is therefore “in the noise” – in the sense that there’s no evidence that it was actually harder to get a high grade this year compared with last year – but that didn’t prove a barrier to those editors intent on filling their newspapers and websites with meaningless guff.

Almost hidden among the bilge was an interesting snippet about Physics. It seems that the number of students taking Physics A-level exceeded 35,000 in 2013. That figure was set as a government target for 2014, so it has been reached a year early. The difference between the number that took Physics this year (35,569) and those who took it in 2006 (27,368) is certainly significant. Whether this is the so-called Brian Cox effect or something else, it’s very good news for the future health of the subject.
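Treating the two cohort sizes as Poisson counts (a rough sketch, and only one of several ways to assess significance), the rise dwarfs the counting fluctuations:

```python
import math

# Physics A-level entries (figures from the text).
n_2013 = 35569
n_2006 = 27368

diff = n_2013 - n_2006               # 8201 more entries
# Poisson error on each count is ~sqrt(N); the errors add in quadrature.
sigma = math.sqrt(n_2013 + n_2006)

print(diff, round(sigma, 1), round(diff / sigma, 1))  # 8201 250.9 32.7
```

A difference of more than thirty standard deviations is about as far from “in the noise” as you can get.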

On the other hand, the proportion of female Physics students remains around 20%. Over the last three years the proportion has been 20.8%, 21.3% and 20.6% so numerically this year is down on last year, but the real message in these figures is that despite strenuous efforts to increase this fraction, there is no significant change.

As I write I’m formally still on Clearing business, sitting beside the telephone in case anyone needs to talk to me. However, at close of play yesterday the School of Mathematical and Physical Sciences had exceeded its recruitment target by quite a healthy margin.  We’re still open for Clearing, though, as our recent expansion means we can take a few more suitably qualified students. Physics and Astronomy did particularly well, and we’re set to welcome our biggest-ever intake into the first year in September 2013. I’m really looking forward to meeting them all.

While I’m on about statistics, here’s another thing. When I was poring over this year’s NSS results, I noticed that only 39 Physics departments appeared in the survey. When I last counted them there were 115 universities in the UK. This number doesn’t include about 50 colleges and other forms of higher education institutions which are also sometimes included in lists of universities. Anyway, my point is that at most about a third of British universities have a physics department.

Now that is a shocking statistic…

(Lack of) Diversity in STEM Subjects

Posted in Science Politics on May 10, 2013 by telescoper

Among the things I learnt over the last few days was some interesting information about the diversity (or, rather, lack of diversity) of students taking undergraduate degrees in STEM subjects in UK universities. For those of you not up on the lingo, “STEM” is short for Science, Technology, Engineering and Mathematics. Last year the Institute of Physics produced a report that contains a wealth of statistical information about the demographics of the undergraduate population, from which the following numbers are only a small component.

                 Physics   Maths   Chemistry   Engineering
Female             21%      41%       44%          12%
BME                11%      24%       20%          30%
Socio-Economic     37%      42%       43%          51%
Non-EU              5%      12%        7%          32%

For completeness I should point out that these numbers refer to first-year undergraduates in 2010-11; I have no particular reason to suppose there has been a qualitative change since then. “BME” stands for “Black and Minority Ethnic”, and “Socio-Economic” refers to students whose parents are not employed in managerial or professional positions.

Overall, the figures here at the University of Sussex are roughly in line with, but slightly better than, these national statistics; the proportion of female students in our Physics intake for 2010/11, for example, was 27%.

There are some interesting (and rather disappointing) things to remark. First is that the proportion of Physics students who are female remains low; Physics scores very badly on ethnic diversity too. Mathematics on the other hand seems a much more attractive subject for female students.  Notice also how Physics and Chemistry attract a very small proportion of overseas students compared to Engineering.

In summary, therefore, we can see that Physics is a subject largely studied by white middle-class European males. What are we doing wrong?

Despite considerable efforts to promote Physics to a more diverse constituency, the proportion of, e.g., female physics students seems to have been bumping along at around 20% for ages. Interestingly, all the anecdotal evidence suggests that those women who do Physics at university do disproportionately well, in the sense that female students constitute a much larger fraction of First-class graduates than 20%. This strongly suggests that the problem lies at school level; some additional IOP information and discussion on this can be found here.

I’m just passing these figures on for information, as I’m quite often asked about them during, e.g., admissions-related activities. I don’t have any really compelling suggestions, but I would like to invite the blogosphere to comment and/or make suggestions as to how to promote diversity in STEM disciplines.

Never mind the table, look at the sample size!

Posted in Bad Statistics on April 29, 2013 by telescoper

This morning I was just thinking that it’s been a while since I’ve filed anything in the category marked bad statistics when I glanced at today’s copy of the Times Higher and found something that’s given me an excuse to rectify my lapse. Last week saw the publication of said organ’s new Student Experience Survey, which ranks British universities in order of the responses given by students to questions about various aspects of the teaching, social life and so on. I had a go at this table a few years ago, but they still keep trotting it out. Here are the main results, sorted in decreasing order:

Rank University Score Responses
1 University of East Anglia 84.8 119
2 University of Oxford 84.2 259
3 University of Sheffield 83.9 192
3 University of Cambridge 83.9 245
5 Loughborough University 82.8 102
6 University of Bath 82.7 159
7 University of Leeds 82.5 219
8 University of Dundee 82.4 103
9 York St John University 81.2 88
10 Lancaster University 81.1 100
11 University of Southampton 80.9 191
11 University of Birmingham 80.9 198
11 University of Nottingham 80.9 270
14 Cardiff University 80.8 113
14 Newcastle University 80.8 125
16 Durham University 80.3 188
17 University of Warwick 80.2 205
18 University of St Andrews 79.8 109
18 University of Glasgow 79.8 131
20 Queen’s University Belfast 79.2 101
21 University of Hull 79.1 106
22 University of Winchester 79 106
23 Northumbria University 78.9 100
23 University of Lincoln 78.9 103
23 University of Strathclyde 78.9 107
26 University of Surrey 78.8 102
26 University of Leicester 78.8 105
26 University of Exeter 78.8 130
29 University of Chester 78.7 102
30 Heriot-Watt University 78.6 101
31 Keele University 78.5 102
32 University of Kent 78.4 110
33 University of Reading 78.1 101
33 Bangor University 78.1 101
35 University of Huddersfield 78 104
36 University of Central Lancashire 77.9 121
37 Queen Mary, University of London 77.8 103
37 University of York 77.8 106
39 University of Edinburgh 77.7 170
40 University of Manchester 77.4 252
41 Imperial College London 77.3 148
42 Swansea University 77.1 103
43 Sheffield Hallam University 77 102
43 Teesside University 77 103
45 Brunel University 76.6 110
46 University of Portsmouth 76.4 107
47 University of Gloucestershire 76.3 53
47 Robert Gordon University 76.3 103
47 Aberystwyth University 76.3 104
50 University of Essex 76 103
50 University of Glamorgan 76 108
50 Plymouth University 76 112
53 University of Sunderland 75.9 100
54 Canterbury Christ Church University 75.8 102
55 De Montfort University 75.7 103
56 University of Bradford 75.5 52
56 University of Sussex 75.5 102
58 Nottingham Trent University 75.4 103
59 University of Roehampton 75.1 102
60 University of Ulster 75 101
60 Staffordshire University 75 102
62 Royal Veterinary College 74.8 50
62 Liverpool John Moores University 74.8 102
64 University of Bristol 74.7 137
65 University of Worcester 74.4 101
66 University of Derby 74.2 101
67 University College London 74.1 102
68 University of Aberdeen 73.9 105
69 University of the West of England 73.8 101
69 Coventry University 73.8 102
71 University of Hertfordshire 73.7 105
72 London School of Economics 73.5 51
73 Royal Holloway, University of London 73.4 104
74 University of Stirling 73.3 54
75 King’s College London 73.2 105
76 Bournemouth University 73.1 103
77 Southampton Solent University 72.7 102
78 Goldsmiths, University of London 72.5 52
78 Leeds Metropolitan University 72.5 106
80 Manchester Metropolitan University 72.2 104
81 University of Liverpool 72 104
82 Birmingham City University 71.8 101
83 Anglia Ruskin University 71.7 102
84 Glasgow Caledonian University 71.1 100
84 Kingston University 71.1 102
86 Aston University 71 52
86 University of Brighton 71 106
88 University of Wolverhampton 70.9 103
89 Oxford Brookes University 70.5 106
90 University of Salford 70.2 102
91 University of Cumbria 69.2 51
92 Napier University 68.8 101
93 University of Greenwich 68.5 102
94 University of Westminster 68.1 101
95 University of Bedfordshire 67.9 100
96 University of the Arts London 66 54
97 City University London 65.4 102
97 London Metropolitan University 65.4 103
97 The University of the West of Scotland 65.4 103
100 Middlesex University 65.1 104
101 University of East London 61.7 51
102 London South Bank University 61.2 50
Average scores 75.5 11459
YouthSight is the source of the data that have been used to compile the table of results for the Times Higher Education Student Experience Survey, and it retains the ownership of those data. Each higher education institution’s score has been indexed to give a percentage of the maximum score attainable. For each of the 21 attributes, students were given a seven-point scale and asked how strongly they agreed or disagreed with a number of statements based on their university experience.

My current employer, the University of Sussex, comes out right on the average (75.5) and is consequently in the middle of this league table. However, let’s look at this in a bit more detail. The number of students whose responses produced the score of 75.5 was just 102. That’s by no means the smallest sample in the survey, either. The University of Sussex has over 13,000 students. The score in this table is therefore obtained from less than 1% of the relevant student population. How representative can the results be, given that the sample is so incredibly small?

What is conspicuous by its absence from this table is any measure of the “margin-of-error” of the estimated score. What I mean by this is how much the sample score would change for Sussex if a different set of 102 students were involved. Unless every Sussex student scores exactly 75.5 then the score will vary from sample to sample. The smaller the sample, the larger the resulting uncertainty.

Given a survey of this type it should be quite straightforward to calculate the spread of scores from student to student within a sample from a given university in terms of the standard deviation, σ, as well as the mean score. Unfortunately, this survey does not include this information. However, let’s suppose for the sake of argument that the standard deviation for Sussex is quite small, say 10% of the mean value, i.e. 7.55. I imagine that it’s much larger than that, in fact, but this is just meant to be by way of an illustration.

If you have a sample size of N then the standard error of the mean is going to be roughly σ/√N which, for Sussex, is about 0.75. Assuming everything has a normal distribution, this would mean that the “true” score for the full population of Sussex students has a 95% chance of being within two standard errors of the mean, i.e. between 74 and 77. This means Sussex could really be as high as 43rd place or as low as 67th, and that’s making very conservative assumptions about how much one student differs from another within each institution.
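That back-of-envelope calculation looks like this in Python (using the assumed σ = 7.55 and the Sussex sample of 102 from the text):

```python
import math

mean_score = 75.5  # Sussex's score in the table
sigma = 7.55       # assumed student-to-student standard deviation (10% of mean)
N = 102            # number of respondents

sem = sigma / math.sqrt(N)  # standard error of the mean
low = mean_score - 2 * sem  # approximate 95% confidence interval
high = mean_score + 2 * sem

print(round(sem, 2), round(low, 1), round(high, 1))  # 0.75 74.0 77.0
```

A ±1.5-point interval on a table where dozens of institutions are packed within a couple of points of each other makes most of the rankings indistinguishable.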

That example is just for illustration, and the figures may well be wrong, but my main gripe is that I don’t understand how these guys can get away with publishing results like this without listing the margin of error at all. Perhaps it’s because that would make it obvious how unreliable the rankings are? Whatever the reason, we’d never get away with publishing results without errors in a serious scientific journal.

This sampling uncertainty almost certainly accounts for the big changes from year to year in these tables. For instance, the University of Lincoln is 23rd in this year’s table, but last year was way down in 66th place. Has something dramatic happened there to account for this meteoric rise? I doubt it. It’s more likely to be just a sampling fluctuation.

In fact I seriously doubt whether any of the scores in this table is significantly different from the mean score; the range from top to bottom is only 61 to 85, showing considerable uniformity across all 102 institutions listed. What a statistically literate person should take from this table is that (a) it’s a complete waste of time and (b) wherever you go to university you’ll probably have a good experience!

A Small Problemette related to Cosmological non-Gaussianity

Posted in Cute Problems, The Universe and Stuff on April 8, 2013 by telescoper

Writing yesterday’s post I remembered doing a calculation a while ago which I filed away and never used again. Now that it has come back to my mind I thought I’d try it out on my readers (Sid and Doris Bonkers). I think the answer might be quite well known, as it is in a closed form, but it might be worth a shot if you’re bored.

The variable x has a normal distribution with zero mean and variance \sigma^{2}. Consider the variable

y = x + \alpha \left( x^2 - \sigma^2 \right),

where \alpha is a constant. What is the probability density of y?

Answers on a postcard through the comments box please…
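Not the closed-form answer, of course, but a quick Monte Carlo sketch (with σ = 1 and α = 0.5 chosen arbitrarily for illustration) lets you eyeball the density of y and check, for instance, that its mean is zero while the quadratic term skews the distribution:

```python
import random
import statistics

random.seed(42)
sigma = 1.0  # standard deviation of x (arbitrary illustrative choice)
alpha = 0.5  # the constant in y = x + alpha*(x**2 - sigma**2)

xs = [random.gauss(0.0, sigma) for _ in range(200_000)]
ys = [x + alpha * (x * x - sigma * sigma) for x in xs]

# E[y] = E[x] + alpha*(E[x^2] - sigma^2) = 0, so the sample mean should sit near zero,
# but y is bounded below (at -1 for these parameters) and has a long right tail,
# so the median falls below the mean.
print(round(statistics.mean(ys), 2))              # close to zero
print(statistics.median(ys) < statistics.mean(ys))  # True (right-skewed)
```

Histogramming `ys` shows the characteristic hard lower cut-off and skewed tail that the closed-form density should reproduce.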

Society Counts, and so do Astronomers!

Posted in Bad Statistics, Science Politics on December 6, 2012 by telescoper

The other day I received an email from the British Academy (for Humanities and Social Sciences) announcing a new position statement on what they call Quantitative Skills. The complete text of this statement, which is entitled Society Counts and which is well worth reading, is now available on the British Academy website.

Here’s an excerpt from the letter accompanying the document:

The UK has a serious deficit in quantitative skills in the social sciences and humanities, according to a statement issued today (18 October 2012) by the British Academy. This deficit threatens the overall competitiveness of the UK’s economy, the effectiveness of public policy-making, and the UK’s status as a world leader in research and higher education.

The statement, Society Counts, raises particular concerns about the impact of this skills deficit on the employability of young people. It also points to serious consequences for society generally. Quantitative skills enable people to understand what is happening to poverty, crime, the global recession, or simply when making decisions about personal investment or pensions.

Citing a recent survey of MPs by the Royal Statistical Society’s getstats campaign – in which only 17% of Conservative and 30% of Labour MPs thought politicians use official statistics and figures accurately when talking about their policies – Professor Sir Adam Roberts, President of the British Academy, said: “Complex statistical and analytical work on large and complex data now underpins much of the UK’s research, political and business worlds. Without the right skills to analyse this data properly, government professionals, politicians, businesses and most of all the public are vulnerable to misinterpretation and wrong decision-making.”

The statement clearly identifies a major problem, not just in the Humanities and Social Sciences but throughout academia and wider society. I even think the British Academy might be a little harsh on its own constituency because, with a few notable exceptions, statistics and other quantitative data analysis methods are taught very poorly to science students too. Just the other day I was talking to an undergraduate student, who is thinking about doing a PhD in physics, about what that’s likely to entail. I told him that the one thing he could be pretty sure he’d have to cope with is analysing data statistically. Like most physics departments, however, we don’t run any modules on statistical techniques and only the bare minimum is involved in the laboratory sessions. Why? I think it’s because there are too few staff who would be able to teach such material competently (because they don’t really understand it themselves).

Here’s a paragraph from the British Academy statement:

There is also a dearth of academic staff able to teach quantitative methods in ways that are relevant and exciting to students in the social sciences and humanities. As few as one in ten university social science lecturers have the skills necessary to teach a basic quantitative methods course, according to the report. Insufficient curriculum time is devoted to methodology in many degree programmes.

Change “social sciences and humanities” to “physics” and I think that statement would still be correct. In fact I think “one in ten” would be an overestimate.

The point is that although physics is an example of a quantitative discipline, that doesn’t mean that the training in undergraduate programmes is adequate for the task. The upshot is that there is actually a great deal of dodgy statistical analysis going on across a huge number of disciplines.

So what is to be done? I think the British Academy identifies only part of the required solution. Of course better training in basic numeracy at school level is needed, but it shouldn’t stop there. I think there also needs to be a wider exchange of knowledge and ideas across disciplines, and a greater involvement of expert consultants. I think this is more likely to succeed than getting more social scientists to run standard statistical analysis packages. In my experience, most bogus statistical analyses do not result from using the method wrong, but from using the wrong method…

A great deal of astronomical research is based on inferences drawn from large and often complex data sets, so astronomy is a discipline with a fairly enlightened attitude to statistical data analysis. Indeed, many important contributions to the development of statistics were made by astronomers. In the future I think we’ll see many more of the astronomers working on big data engage with the wider academic community by developing collaborations or acting as consultants in various ways.

We astronomers are always being challenged to find applications of our work outside the purely academic sphere, and this is one that could be developed much further than it has been so far. It disappoints me that we always seem to think of this exclusively in terms of technological spin-offs, while the importance of transferable expertise is often neglected. Whether you’re a social scientist or a physicist, if you’ve got problems analysing your data, why not ask an astronomer?

The Tremors from L’Aquila

Posted in Bad Statistics, Open Access, Science Politics on October 23, 2012 by telescoper

I can’t resist a comment on news which broke yesterday that an Italian court has found six scientists and a former government official guilty of manslaughter in connection with the L’Aquila Earthquake of 2009. Scientific colleagues of mine are shocked by their conviction and by the severity of the sentences (six years’ imprisonment), the assumption being that they were convicted for having failed to predict the earthquake. However, as Nature News pointed out long before the trial when the scientists were indicted:

The view from L’Aquila, however, is quite different. Prosecutors and the families of victims alike say that the trial has nothing to do with the ability to predict earthquakes, and everything to do with the failure of government-appointed scientists serving on an advisory panel to adequately evaluate, and then communicate, the potential risk to the local population. The charges, detailed in a 224-page document filed by Picuti, allege that members of the National Commission for Forecasting and Predicting Great Risks, who held a special meeting in L’Aquila the week before the earthquake, provided “incomplete, imprecise, and contradictory information” to a public that had been unnerved by months of persistent, low-level tremors. Picuti says that the commission was more interested in pacifying the local population than in giving clear advice about earthquake preparedness.

“I’m not crazy,” Picuti says. “I know they can’t predict earthquakes. The basis of the charges is not that they didn’t predict the earthquake. As functionaries of the state, they had certain duties imposed by law: to evaluate and characterize the risks that were present in L’Aquila.” Part of that risk assessment, he says, should have included the density of the urban population and the known fragility of many ancient buildings in the city centre. “They were obligated to evaluate the degree of risk given all these factors,” he says, “and they did not.”

Many of my colleagues have interpreted the conviction of these scientists as an attack on science, but the above statement actually looks to me more like a demand that the scientists involved should have been more scientific. By that I mean not giving a simple “yes” or “no” answer (which in this case was “no”) but giving a proper scientific analysis of the probabilities involved. This comment goes straight to two issues that I feel very strongly about. One is the vital importance of probabilistic reasoning – in this case in connection with a risk assessment – and the other is the need for openness in science.

I thought I’d take this opportunity to repeat the reasons I think statistics and statistical reasoning are so important. Of course they are important in science. In fact, I think they lie at the very core of the scientific method, although I am still surprised how few practising scientists are comfortable even with statistical language. A more important problem is the popular impression that science is about facts and absolute truths. It isn’t. It’s a process. In order to advance, it has to question itself.

Statistical reasoning also applies outside science to many facets of everyday life, including business, commerce, transport, the media, and politics. It is a feature of everyday life that science and technology are deeply embedded in every aspect of what we do each day. Science has given us greater levels of comfort, better health care, and a plethora of labour-saving devices. It has also given us unprecedented ability to destroy the environment and each other, whether through accident or design. Probability even plays a role in personal relationships, though mostly at a subconscious level.

Civilized societies face severe challenges in this century. We must confront the threat of climate change and forthcoming energy crises. We must find better ways of resolving conflicts peacefully lest nuclear or conventional weapons lead us to global catastrophe. We must stop large-scale pollution or systematic destruction of the biosphere that nurtures us. And we must do all of these things without abandoning the many positive things that science has brought us. Abandoning science and rationality by retreating into religious or political fundamentalism would be a catastrophe for humanity.

Unfortunately, recent decades have seen a wholesale breakdown of trust between scientists and the public at large; the conviction of the scientists in the L’Aquila case is just one example. This breakdown is due partly to the deliberate abuse of science for immoral purposes, and partly to the sheer carelessness with which various agencies have exploited scientific discoveries without proper evaluation of the risks involved. The abuse of statistical arguments has undoubtedly contributed to the suspicion with which many individuals view science.

There is an increasing alienation between scientists and the general public. Many fewer students enrol for courses in physics and chemistry than a few decades ago. Fewer graduates mean fewer qualified science teachers in schools. This is a vicious cycle that threatens our future. It must be broken.

The danger is that the decreasing level of understanding of science in society means that knowledge (as well as its consequent power) becomes concentrated in the minds of a few individuals. This could have dire consequences for the future of our democracy. Even as things stand now, very few Members of Parliament are scientifically literate. How can we expect to control the application of science when the necessary understanding rests with an unelected “priesthood” that is hardly understood by, or represented in, our democratic institutions?

Very few journalists or television producers know enough about science to report sensibly on the latest discoveries or controversies. As a result, important matters that the public needs to know about do not appear at all in the media, or if they do it is in such a garbled fashion that they do more harm than good.

Years ago I used to listen to radio interviews with scientists on the Today programme on BBC Radio 4. I even did such an interview once. It is a deeply frustrating experience. The scientist usually starts by explaining what the discovery is about in the way a scientist should, with careful statements of what is assumed, how the data is interpreted, and what other possible interpretations might be and the likely sources of error. The interviewer then loses patience and asks for a yes or no answer. The scientist tries to continue, but is badgered. Either the interview ends as a row, or the scientist ends up stating a grossly oversimplified version of the story.

Some scientists offer the oversimplified version at the outset, of course, and these are the ones that contribute to the image of scientists as priests. Such individuals often believe in their theories in exactly the same way that some people believe religiously. Not with the conditional and possibly temporary belief that characterizes the scientific method, but with the unquestioning fervour of an unthinking zealot. This approach may pay off for the individual in the short term, in popular esteem and media recognition – but when it goes wrong it is science as a whole that suffers. When a result that has been proclaimed certain is later shown to be false, the result is widespread disillusionment. And the more secretive the behaviour of the scientific community, the less reason the public has to trust its pronouncements.

I don’t have any easy answers to the question of how to cure this malaise, but I do have a few suggestions. It would be easy for a scientist such as myself to blame everything on the media and the education system, but in fact I think the responsibility lies mainly with ourselves. We are so obsessed with our own research, and with the need to publish specialist papers by the lorry-load in order to advance our careers, that we usually spend very little time explaining what we do to the public, or why we do it.

I think every working scientist in the country should be required to spend at least 10% of their time working in schools or with the general media on “outreach”, including writing blogs like this. People in my field – astronomers and cosmologists – do this quite a lot, but these are areas where the public has some empathy with what we do. If only biologists, chemists, nuclear physicists and the rest were viewed in such a friendly light. Doing this sort of thing is not easy, especially when it comes to saying something on the radio that the interviewer does not want to hear. Media training for scientists has been a welcome recent innovation for some branches of science, but most of my colleagues have never had any help at all in this direction.

The second thing that must be done is to improve the dire state of science education in schools. Over the last two decades the national curriculum for British schools has been dumbed down to the point of absurdity. Pupils that leave school at 18 having taken “Advanced Level” physics do so with no useful knowledge of physics at all, even if they have obtained the highest grade. I do not at all blame the students for this; they can only do what they are asked to do. It’s all the fault of the educationalists, who have done the best they can for a long time to convince our young people that science is too hard for them. Science can be difficult, of course, and not everyone will be able to make a career out of it. But that doesn’t mean that it should not be taught properly to those that can take it in. If some students find it is not for them, then so be it. I always wanted to be a musician, but never had the talent for it.

The third thing that has to be done is for scientists to be far more open. Publicly-funded scientists have a duty not only to publish their conclusions in such a way that the public can access them freely, but also to publish their data, their methodology and the intermediate steps. Most members of the public will struggle to make sense of the information, but at least they will be able to see that nothing is being deliberately concealed.

Everyone knows that earthquake prediction is practically impossible to do accurately. The danger of the judgement in the L’Aquila Earthquake trial (apart from discouraging scientists from ever becoming seismologists) is that the alarm will be sounded every time there is the smallest tremor. The potential for panic is enormous. But the science in this field, as in any other, does not actually tell one how to act on evidence of risk, merely how to assess it. It’s up to others to decide whether and when to act, i.e. when the threshold of danger has been crossed. There is no scientific answer to the question “how risky is too risky?”.

So instead of bland reassurances or needless panic-mongering, the scientific community should refrain from public statements about what will happen and what won’t and instead busy itself with the collection, analysis and interpretation of data and publish its studies as openly as possible. The public will find it very difficult to handle this information overload, but so they should. Difficult questions don’t have simple answers. Scientists aren’t priests.

More Maths, or Better Maths?

Posted in Education with tags , , on July 25, 2012 by telescoper

Interesting view from a Biosciences perspective about the recent recommendations to increase the number of students taking Mathematics at A-level.

I’ve always had a problem with the way Statistics is taught at A-level, which treats it largely as a collection of recipes without much understanding of the underlying principles; would more emphasis on probability theory be a better way to go?

Biomaths Education Network

The introduction of post-16 maths is in the news again with a report from the House of Lords committee on Higher Education in STEM and many of the headlines from the Guardian, Independent and Times Higher have picked up on the recommendations regarding maths study post-16.

I have written a few thoughts here on my first impressions but would very much welcome comments.

Though I was pleased to see that some of my work showing that only GCSE maths is required for undergraduate biosciences was cited, the conclusion from this was that more students should take maths A level and this is a little worrying.

The lack, or low level, of maths requirements for admission to HEIs, particularly for programmes in STEM subjects, acts as a disincentive for students to take maths and high level maths at A level. We urge HEIs to introduce more demanding maths  requirements…


The Long Weekend

Posted in Books, Talks and Reviews, The Universe and Stuff with tags , , , , , on April 5, 2012 by telescoper

It’s getting even warmer in Cape Town as we approach the Easter vacation. The few clouds to be found in the sky over the last couple of days have now disappeared, and even the mountain behind the campus has lost its white fluffy hat.

It’s going to be a busy few days in these parts over the forthcoming weekend. As in the UK, tomorrow (Good Friday) is a national holiday, and there will be a 5K fun run around the campus; the temporary stands and marquees are there for that. On Saturday an even bigger event – the Two Oceans Marathon – will finish on the University of Cape Town campus too. At the moment it’s 30 degrees, but the forecast is for it to cool down a bit over the holiday weekend. Good news for the runners, but not, I suspect, for everyone who’s disappearing off for a weekend at the beach!

Anyway, I did my talk this morning, which seemed to go down reasonably well. It was followed by a nice talk by Roberto Trotta from Imperial College, in a morning that turned out to be devoted to statistical cosmology. I didn’t get the chance to coordinate with Roberto, but suspected he would focus on the ins and outs of Bayesian methods (which turned out to be right), so I paved the way with a general talk about the enormous statistical challenges cosmology will face in the era after Planck. The main point I wanted to make – to an audience which mainly comprised theoretical folk – was that we’ve really been lucky so far, in that the nature of the concordance cosmology has enabled us to get away with using relatively simple statistical tools, i.e. the power spectrum. This is because the primordial fluctuations from which galaxies and large-scale structure grew are assumed to have the simplest possible statistical form, i.e. Gaussian. Searching for physics beyond the standard model, e.g. searching for the non-Gaussianities which might be key to understanding the physics of the very early stages of the evolution of the Universe, will be enormously more difficult and will require much more sophisticated tools than we’ve needed so far.
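To illustrate the point with a toy example (mine, not from the talk): for a Gaussian field the power spectrum captures all the statistical information there is, so two fields with identical power spectra can only be told apart by higher-order statistics. Here is a minimal sketch in Python, using a 1D random field and a quadratic transformation of the “local” type often assumed in the non-Gaussianity literature; the field, the value of the coefficient and the use of the skewness as the diagnostic are all my choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# A Gaussian random field (1D toy model). For such a field the power
# spectrum is a complete statistical description.
n = 4096
phi = rng.standard_normal(n)

# Simple periodogram estimate of the power spectrum.
power = np.abs(np.fft.rfft(phi)) ** 2 / n

# A weakly non-Gaussian field of the "local" type:
# phi + fnl * (phi^2 - <phi^2>), with an illustrative coefficient.
fnl = 0.5
phi_ng = phi + fnl * (phi**2 - phi.var())

def skewness(x):
    """Third standardised moment; zero (on average) for a Gaussian field."""
    d = x - x.mean()
    return (d**3).mean() / (d**2).mean() ** 1.5

# The skewness, a higher-order statistic beyond the power spectrum,
# separates the two fields cleanly.
print(f"skewness (Gaussian):     {skewness(phi):+.3f}")
print(f"skewness (non-Gaussian): {skewness(phi_ng):+.3f}")
```

The Gaussian field’s sample skewness hovers near zero (scatter of order 1/√n), while the transformed field’s is large; in a realistic analysis the analogous role is played by the bispectrum and its relatives, which are far more expensive to estimate than the power spectrum.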

Anyway, that’s for the future. Cosmological results from Planck won’t be freely available until next year at the earliest, so I think I can still afford to take the long weekend off  without endangering the “Post-Planck Era” too much!

Late Arrivals at the Statistician’s Ball

Posted in Uncategorized with tags , , on October 16, 2011 by telescoper

I’m in a frivolous mood this Sunday morning so I thought I’d have a go at stirring up a bit of audience participation. Taking my cue from I’m Sorry I Haven’t a Clue, please let me announce some of the late arrivals at the Statistician’s Ball. Your contributions are also welcomed…

Ladies and Gentlemen may I introduce:

Mr and Mrs Ear-Regression and their daughter Lynne Ear-Regression

Mr and Mrs Thmetick-Mean and their son, Harry Thmetick-Mean

Mr and Mrs D’arderra and their son, Stan.

Mr and Mrs Layshun and their daughter, Cora

Here’s Mark Offchain and his friend Monty Carlo

Incidentally, the food this evening will be served at your table free of charge; there’s a “Buy no meal” distribution…

Mr and Mrs Rating-Function and their daughter, Jenna.

Mr and Mrs Mentz and their daughter, Mo.

Mr and Mrs Al-Distribution and their son Norm.

Mr and Mrs Variate and their daughter Una; she’s still single, by the way…

Mr and Mrs Otis and their son, Curt

Mr and Mrs Pling-Bias and their son, Sam

Mr and Mrs Inal-Probability and their daughter, Marge.

Mr and Mrs Over and their daughter, Anne Over.

Mr and Mrs Mogorov and their son, Carl. I’m sure he’ll want to try out the vodka. Hey Carl Mogorov! Smirnov test?

Mr and Mrs Fordslaw and their son, Ben.

Mr and Mrs Knife and their son, Jack.

Mr and Mrs Motion and their son Ian (who’s just back from a holiday during which he got a very deep tan), yes it’s Brown Ian Motion.

Mr and Mrs Rage and their daughter, Ava.

Mr and Mrs Sprier and their son, Jeffrey Sprier.

And now we’re joined by royalty. From the distinguished house of Ippal-Components, here’s Prince Ippal-Components.

Mr and Mrs D’alscoefficient and their son, Ken.

Here’s the Hood family with their particularly amiable son, Lee. I’m sure you will like Lee Hood!

Mr and Mrs Gale and their son, Martin.

Mr and Mrs Imum-Entropy and their son, Max.

Mr and Mrs Spectra and their daughter, Polly.

That’s all I’ve got time for at the moment, but please feel free to offer your own suggestions through the box below…