Birmingham’s ‘literacy crisis’ – though nobody’s literacy was measured in making the report
It’s hard to tell if last week was an exceptional one for crises, because there are ‘CRISIS!’ headlines nowadays for almost anything that temporarily falls short of our media-stoked expectations, writes Chris Game.
We had, of course, the ongoing NHS, A&E, and social care crises, the housing crisis, the child refugee crisis, and Labour’s existential crisis, joined by the more recent iceberg lettuce and Leicester City crises, so it’s understandable if you happened to miss the “exposure of England’s deep-rooted literacy crisis”.
I think, incidentally, that there’s a category error involved in many of these supposed crises. A ‘crisis’ isn’t just something that’s really bloody awful; the term should be reserved for something RBA that’s also at a decisive or turning point.
It derives from the Greek word for decision, and the problem with most of our ‘crises’ – lettuces and Leicester Foxes possibly excepted – is the absence of any direction-changing decision anywhere in sight.
England’s literacy – and numeracy – deficiencies certainly aren’t an exception; hence their ‘deep-rootedness’. They’ve in fact been repeatedly ‘exposed’ for years, by international bodies like UNESCO and OECD, as well as our own national government – of which more shortly.
Last week’s exposure, then, certainly wasn’t new; at most, different – the main novelty being, not its shock factor, but its not having actually measured anyone’s literacy.
Its authors were the charity, the National Literacy Trust, in unexpected collaboration with Experian, the global (but conveniently Dublin-headquartered) ‘information services’ group, previously known to me mainly for producing credit rating scores – which, it turns out, is far from entirely irrelevant.
NLT and E’s ‘take’ on our nation’s literacy crisis is the creation of a “new metric – a literacy vulnerability score, which explored the social factors most closely linked with low literacy”.
And what got them news coverage was the use of their new metric toy to measure their depersonalised definition of literacy vulnerability – though not at the level of the local authorities who still have some responsibility for schools and their teaching, but of the parliamentary constituencies that will shortly disappear in the boundary review.
There were, though, at least two big reasons for this apparently odd decision. First was the calculation that focusing on MPs rather than councils, launching their report through the Commons All-Party Parliamentary Literacy Group, and giving all MPs “a tailored report to help them understand the specific challenges in their constituencies” is pretty slick PR. Good luck to them.
But the second reason has to do with the new metric and the kind of work you’d expect a company with the specialist skills of Experian to be undertaking. As with credit ratings, they crunch numbers.
Big sets of numbers, yes – in this case 2011 Census numbers on unemployment, income, qualifications, social mix of population, etc., plus data of their own – but that’s their evidence of the literacy crisis: crunched numbers.
No testing of understanding, evaluation and usage of written texts – which is probably what most of us associate with the measurement of literacy, and, as we’ll see, what most previous studies have done.
No school visits, test results, teacher interviews, observation; nothing, it seems, apart from the numbers. Indeed, they emphasise that “the educational attainment data for children was not included”.
Which largely explains, of course, how they’re able to produce, and get away with, the extraordinary precision of their Literacy Vulnerability scores, and their extraordinary range – from Middlesbrough’s 54 up to NE Hampshire’s, nearly 36 times higher at 1,922.
Or even, as in the right-hand side of my illustrative table covering only the seven West Midlands metropolitan boroughs, the constituency range within Birmingham alone, from Erdington’s 95 to Sutton Coldfield’s, nearly 16 times higher at 1,495.
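For anyone who wants to see where those multiples come from, the arithmetic behind the two contrasts quoted above is simply one score divided by the other:

```python
# Ratios behind the Literacy Vulnerability contrasts quoted above
# (scores as reported: Middlesbrough 54, NE Hampshire 1,922;
#  Erdington 95, Sutton Coldfield 1,495).
national_ratio = 1922 / 54
birmingham_ratio = 1495 / 95

print(f"NE Hampshire vs Middlesbrough: {national_ratio:.1f}x")   # ~35.6, 'nearly 36 times'
print(f"Sutton Coldfield vs Erdington: {birmingham_ratio:.1f}x") # ~15.7, 'nearly 16 times'
```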
Imagine the headlines if these stats derived from standard literacy tests: “Sutton residents 16 times more literate than Erdington next-door neighbours”. There’d be riots – I’d hope.
Fortunately, we already have a more meaningful measure from the Government’s own most recent nationwide Skills for Life (SfL) survey in 2010/11 – administered by the Department for Business, Innovation and Skills, and comprising over 7,000 interviews with real adults aged between 16 and 65, covering not just literacy, but also numeracy and basic ICT skills.
The survey’s results, moreover, are analysed and published in an exceptionally wide range of formats – including by local authority, ward, parliamentary constituency, and even LEP area. So it’s easy to compare the NLT and E’s extreme ranges and contrasts with those based on actual personal measures of literacy and numeracy, and those for the West Midlands boroughs are reproduced on the left-hand side of the table.
Literacy scores (or illiteracy scores, if you prefer) for Birmingham constituencies range from Ladywood’s 8.0 and Hodge Hill’s 7.2 to Sutton’s 3.7. Erdington’s 6.5 might suggest that it’s coping better than some with its massive ‘literacy vulnerability’ recorded by Experian.
Both sets of statistics purport to be measuring essentially the same phenomenon, but I’d suggest the SfL’s are more meaningful and therefore more practically and politically useful.
Both sets on their own, though, are just figures, based on particular points in time. For them to have real meaning, they need to be compared over time and with those of other broadly similar countries – and, if you want to start shouting ‘Crisis!’, this is where you’ll find your evidence, illustrated here in three of many disturbing findings.
First is the 2011 SfL survey’s finding that, while more 16-65 year olds than in 2003 were above the Level 2 ‘poor performance’ literacy threshold (57% compared to 44%), those recording ‘poorest’ performance (Entry Levels 1 and 2) rose from 5.4% to 7.1% – described as a slight increase (p.2), but in fact one of nearly one-third.
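That “nearly one-third” claim is a simple relative-change calculation on the two percentages the survey reports:

```python
# Back-of-envelope check of the SfL 'poorest performance' figures quoted above.
poorest_2003 = 5.4   # % at Entry Levels 1 and 2 in 2003
poorest_2011 = 7.1   # % at Entry Levels 1 and 2 in 2010/11

relative_rise = (poorest_2011 - poorest_2003) / poorest_2003
print(f"Relative rise: {relative_rise:.1%}")  # ~31.5% – close to a third, hardly 'slight'
```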
My other ‘headline’ findings are from the OECD’s massive International Survey of Adult Skills/Competencies, based on samples of 5,000 adults aged 16-65 in 40 countries. The English results were assessed in a special OECD volume published last year, and they make uncomfortable reading – as a whole but particularly in relation to the youngest age cohorts, as shown in the following tables.
Overall we’re a bit below the OECD average for literacy “but well below average for numeracy” (p.22). Taken together, it means that “an estimated 9 million working-age adults in England struggle with basic quantitative reasoning or have difficulty with simple written information.”
Among 16-19 year olds, however, it gets seriously worse. My second headline finding is that England is not just at the foot of both literacy and numeracy tables, but has three times more low-skilled young people than the best-performing countries (p.12), with “around one in ten of all university students having numeracy or literacy levels below Level 2.”
For an erstwhile academic, those last figures are upsetting enough, but probably the single finding I found most depressing was that shown in the lower table: that, in contrast to pretty well every other OECD country, the proportion of 16-24 year olds with low basic skills is no lower than among my own retirement-age generation.
Returning, then, to the NLT and E study that prompted this blog: to me it’s findings like these – based on the questioning and testing of representative samples of real people – that have a force and meaning that Experian’s number-crunched results, extreme as they are, lack.
Indeed, I’m not really persuaded that ‘literacy vulnerability’ scores and rankings amount to very much more than the social deprivation rankings I discussed in another recent blog in relation to the Grant Thornton ‘Vibrant Economy Index’. My strong impression is that LV is pretty much just the inverse of VE.
Moreover, it didn’t need ‘literacy vulnerability’ to demonstrate that the country has a huge basic skills problem – or crisis, if you must. But one good thing about the timing of the NLT & E study is its appearance in the run-up to the elections following which employment and skills will formally become one of the most important devolved responsibilities of our mayoral West Midlands Combined Authority.
It’s tempting to say that it doesn’t have to do much to improve on central government’s recent record, but I suspect it may be one of the policy fields on which its performance will be most rigorously judged.