
In time-honoured fashion, A level results were published on the second Thursday in August and the academic fates of a quarter of a million young people were revealed. Universities were open for business, and schools opened doors or phone lines to help their leavers process the news and work out what to do next. In the dog days of August, the nation’s journalists home in on A levels, looking for stories, sometimes finding them and sometimes making them up. However, with rising levels of infection on the continent, a breakthrough in diplomatic relations in the Middle East and confirmation of our precipitous slide into recession, the annual publication of results (which, but for periodic intervention by reformist-minded Ministers, are generally and indeed strategically quite consistent year in, year out) was going to have to be pretty shattering to make even the inside pages.

Ofqual and the Secretary of State for Education came up trumps: having published A level results once, three working days later they published them again. Ofqual announced that a school’s own assessments, so-called centre-assessment grades, submitted in June but initially superseded by an algorithm, would count as a student’s results unless the earlier, algorithm-generated result was better. Two sets of A level results in four days. Not bad going.

The original results may, at least to Ofqual’s statisticians, have looked quite similar, but they didn’t tell the whole story. Of course, no one took any exams during lockdown, so exam boards had to come up with a way of awarding results without exam scripts. In pretty short order they produced an algorithm, a set of rules to follow in making calculations, intended to produce results that were fair to the individual and in line with what the nation’s examinees have done in the past and will do in the future. They had to do this without time to trial the algorithm, and the fallout shows the imperfections of a sticking-plaster system (though what else was possible to award grades to students who had sat no exams?). It didn’t work. Many schools had headline figures broadly similar to what they were used to, and because we (commentators, politicians, parents, even school leaders) always focus on the proportion of A*, A or B grades a school gets, believing it a useful proxy for school quality, the damage was easy to miss. Yet there were pupils in every school whose results made no sense and for whom there was no workable appeals process. It was not so much the scale of the unfairness as its acuteness for those it affected: if you were predicted a B but dealt an E by a computer program, there’s little solace in knowing that the nation did 2.7% better than in 2019.

We rarely think about the number of individuals whose three or four grades show that they have been awarded what they deserved because, when thinking about the individual, it’s whether they get into a chosen or a good university that matters most. We don’t think enough about those whose results won’t get them anywhere. Whom would we blame?

What lessons do we learn? Well, we must be thankful that the beleaguered officials and flustered Ministers responded to address the principal injustices where they had the greatest potential to throw people off course for good. They must now think about the implications for the Class of 2022, whose chances of getting to university mustn’t be adversely affected by the necessarily inflated numbers of students eligible to go this year. And it would be good if they were to acknowledge the de facto delegation of the appeals process to schools, for a student who doesn’t recognise their centre-assessment grade, and to support schools appropriately.

But, now that the immediate, unique injustices for this generation have been resolved, I’d like to hope that we won’t waste the averted crisis.

I don’t blame politicians for politicising education, but it rarely helps the most vulnerable pupils when they make education policy a political football. This Covid-19-inspired debacle, it appears, could have hit disadvantaged children worst; we know that lockdown affected them disproportionately through digital poverty, lack of space to work and less effective online teaching resources. What worries everyone now, and should have been worrying us before, is the certainty that the children of more affluent families and of families who rate and believe in schooling (not always the same) do better in any school setting: disadvantaged children do less well, tend to have access to less good schools and apply in smaller numbers to university. It’s little surprise that London, the wealthiest area of the UK, dominates applications to the best universities from both state and private schools.

I’ve always maintained that aggregated exam results, whether for a nation or a school, won’t give the answers we look for in them: they are rarely a good proxy for school quality because they can’t tell you a pupil’s starting point, the journey they have made to the exams, or the values and qualities they will have developed along the way. But this crisis, which only a last-minute volte-face avoided, must make us ask what exam results can actually tell us.

A successful, hard-working pupil is a child who enjoys good mental and physical health, who eats well, sleeps safely, has a place to work and a place to play, who is supported beyond the school gates and has stability and consistent care – love – in their life. And who goes to a school where most pupils come to class happy, healthy and wanting to learn. We don’t live in a paradise, so where these ingredients can’t be provided we need to prime and organise health care and accommodation, invest in pre-emptive social care, provide active and varied after-school and holiday care, and subsidise and promote healthy eating for those whose incomes and lifestyles don’t otherwise make this possible. That means spending more, and joining up that spending, on health, education, social care and after-school care, with policies to guarantee threshold standards of secure, safe accommodation and access to affordable, healthy food wherever the postcodes tell us there’s deprivation. We can use postcodes, of course, but we could just as easily use a school’s aggregated exam results: high numbers of lower grades equal social and economic disadvantage. If we want to see better exam results, if we are genuinely interested in closing the attainment gap, we need to look beyond the statistics.