The difference between research and marketing

A recent TES article says a new UK report reveals a “Silent army of 40,000 ‘lost girls’ struggling with reading”.

Great, attention-grabbing headline. Shades of Boko Haram. But we already know that many girls can’t decode, or can’t comprehend language very well, or have both problems.

There are fewer struggling girls than struggling boys, and girls are more likely to shrink into themselves than attract attention by behaving badly when they’re struggling. But struggling girls exist in schools everywhere, which should already be identifying and assisting them.

An over-reliance on phonics?

What made me sit up and pay attention in the TES article was its statement that the report it discusses “suggests that an ‘over-reliance on phonics’ is obscuring deeper problems with reading in primary schools – where children can read words but may not understand them”.

A senior publisher and a charity director (but no researchers) say things supporting this statement, and suggest a greater focus on teaching comprehension. An independent education consultant elaborates on how boys (still!) get more teacher attention than girls, as though this is just a fact of life, beyond teachers’ control.

The report itself looks a lot more like a test marketing campaign than scientific research. But teachers who don’t read scientific research might not realise this.

Is this even a research figleaf?

The “Lost Girls” report is by GL Assessment, a company which describes itself as “the leading provider of formative assessments to UK schools, as well as providing assessments … in over 100 countries worldwide”.

The report’s introduction is by GL Assessment’s Chief Executive, who spent his early career in marketing and IT in the oil industry. He hopes “this report plays its part in shining a light on our overlooked and neglected ‘lost girls’”. Marketing talk, not research talk.

The report contains no review of relevant literature and no detailed description of research methods. A “survey” was done, but we’re not told who or where the respondents were, or where to find the methodological detail that would allow the study to be replicated, replication being a cornerstone of reputable research.

The report contains statements like “…our findings suggest that approximately more than 40,000 girls in each year group have severe trouble with reading.” It reports how many “months behind” in reading various groups of girls were, rather than the usual means and standard deviations, which would allow us to compare the children studied with the average range.
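To show why means and standard deviations matter, here’s a minimal sketch with made-up numbers (the norms and scores below are illustrative only, not figures from the report): a standard score tells you where a child sits relative to the average range, which “months behind” can’t do.

```python
# Minimal sketch, made-up numbers: with a test's mean and standard deviation
# you can convert a raw score into a standard score and see whether a child
# falls inside the broad average range (85-115 on a mean-100, SD-15 scale).

def standard_score(raw_score: float, mean: float, sd: float) -> float:
    """Convert a raw score to a standard score on the mean-100, SD-15 scale."""
    z = (raw_score - mean) / sd
    return 100 + 15 * z

# Hypothetical norms: raw-score mean of 32 and SD of 6 for this age group.
score = standard_score(raw_score=23, mean=32, sd=6)
print(round(score))           # 78: well below the broad average range
print(85 <= score <= 115)     # False: more than 1 SD below the mean
```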

The assessment tasks reported on in the “Lost Girls” report aren’t the kinds of tests which distinguish well between learners who can’t decode and learners with poor spoken language comprehension, who have different intervention needs.

The tasks for 10 and 12 year olds were 1) written sentence completion and 2) written passage comprehension. These form part of the New Group Reading Test (NGRT), available from GL Assessment. I can’t find any information about research into the NGRT’s reliability or validity on their website.

Both assessment tasks require both decoding skills (phonemic awareness and phonics) and oral language skills (vocabulary and grammar, and for the passage comprehension task also some higher-order skills like inferencing). So they can’t help much with sorting the poor decoders from the poor comprehenders.
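For what it’s worth, here’s a minimal sketch of the sorting a useful assessment should support, following the Simple View of Reading (reading comprehension as the product of decoding and language comprehension). The labels and recommendations in the code are my own shorthand, not anything taken from the NGRT or the report.

```python
# Illustrative sketch of the four Simple View of Reading quadrants.
# A task that mixes decoding and language comprehension can flag a low
# scorer, but it can't tell you which quadrant she falls into.

def reading_profile(decoding_ok: bool, language_ok: bool) -> str:
    """Sort a reader into the four Simple View quadrants."""
    if decoding_ok and language_ok:
        return "typical reader"
    if not decoding_ok and language_ok:
        return "poor decoder: needs explicit, systematic phonics"
    if decoding_ok and not language_ok:
        return "poor comprehender: needs oral language intervention"
    return "mixed difficulty: needs both phonics and oral language work"

print(reading_profile(decoding_ok=False, language_ok=True))
```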

The remainder of the report reads like a school staffroom magazine or catalogue: glossy photos, attractive layout, and headings such as “How to spot undetected problem readers, and how to help them” (Recommendation 1: “Discover each child’s reading age”. Perhaps with the NGRT??), “Why girls with reading problems can be hard to spot” and “Top 10 tips: supporting children with reading”. The tips include such groundbreaking suggestions as “Make time for reading every day”, “Read aloud to your child” and “Keep a large range of reading material readily available”.

The report’s last page has the heading “Identifying the barriers to learning”.  GL Assessment’s Sales and Marketing Director (an Engineering academic before working in many senior marketing roles, who has a particular passion for “driving high growth in competitive markets through better understanding of the customer and the purchase process”), explains “how the company’s New Group Reading Test (NGRT) fits in with their whole-pupil approach to education.”

At this point we are into pure marketing talk, finishing up at the point of sale: “For further information please visit gl-assessment.co.uk/ngrt. To contact your local area consultant to organise a school visit or a free quote please visit gl-assessment.co.uk/consultants or to discuss your specific requirements, call 0330 123 5375.”

How anyone at TES or elsewhere can call this marketing document “a new report on literacy” or a “study” is baffling. But I guess till the sector gets its head around the evidence pyramid and how to recognise and read scientific research, this sort of thing might keep happening.

The UK marketing context

Early years teachers in the UK have for some years been required to include explicit, systematic synthetic phonics in their literacy teaching, along with work on vocabulary, fluency and comprehension (nobody is advocating “phonics-only instruction”, despite what anti-phonics campaigners might tell you). The reason is simple: proper research shows it’s highly effective.

Since 2012 UK teachers have also had to administer the UK Phonics Screening Check we’ve been arguing about in Australia this week (see here, here, here, here, here, and here), to ensure children really are learning how to sound words out, rather than looking at pictures and guessing words, as young children are often (still!) taught to do here in Australia.

Change is hard, and many UK teachers disliked being told to change how they taught. They disliked having the success of their teaching formally evaluated. We live in an era when anti-science thinking seems to be on the rise, so I guess we shouldn’t be surprised that some teachers are among those who have more faith in their own judgement than the findings of scientific research.

Some teachers and others in UK education saw the Phonics Screening Check as an affront to teacher professionalism, even though the changes have had a clear positive impact on student learning. We’ve recently had a teacher union here call it “anti-teacher”.

Confirmation bias is the human tendency to search for, interpret, favour, and recall information in a way that confirms what one already believes.

Teachers steeped in the belief that reading and spelling development is natural, complex and mysterious understandably struggle with the simplicity of the research-based Simple View of Reading.

Teachers taught to consider phonics boring, old-fashioned, potentially harmful, and an approach of last resort can be expected to respond positively to marketing materials which suggest cutting back on phonics. Savvy marketers probably know this.

Most teachers are women, and many senior women now in charge of school purse strings might remember (as I do) being ignored by their teachers, who focussed instead on noisy, demanding boys. Perhaps pressing teachers’ feminist and anti-phonics buttons is a good way to market a new test.

Yes, the NGRT can probably identify poor readers, both female and male. But most teachers of older kids already know who the poor readers are. They’ve heard them read. What teachers most need to do is work out the underlying cause of each poor reader’s difficulties, in order to address it.

Some of the “Lost Girls” actually need more phonics

Many school-aged girls, like boys, can’t read well because they aren’t good at decoding. Sometimes they haven’t been taught well enough, and sometimes they have dyslexia. They need more explicit, systematic synthetic phonics instruction, just like boys who can’t decode.

Many other kids who can’t read well have poor comprehension of oral language, or language disorder, sometimes called Specific Language Impairment or Severe Language Disorder. It affects about one kid in every classroom. Yup. You can learn about it from the RALLI campaign. The need to identify and assist these kids better in Australia is discussed in this 2013 Senate report.

Language disorder is diagnosed by Speech Pathologists using a test of spoken language such as the Clinical Evaluation of Language Fundamentals (CELF), once other possible causes of severe language problems have been ruled out, e.g. intellectual disability, hearing impairment.

Kids with language disorder need (but often don’t get) intervention targeting their listening skills, and usually also their speaking skills. Most also find it harder than average to learn to decode text, so also need extra work on phonics.

I know a few kids who are good at decoding but poor at comprehension (hyperlexic), all of them on the autism spectrum. A language disorder combined with a great decoding teacher might also produce this profile, but “barking at print” seems to me mostly something that happens in anti-phonics campaigners’ imaginations. For good strategies to boost reading comprehension, click here.

Poor comprehenders can’t be identified using written language tasks, unless it’s previously been established that they can definitely decode/recognise all the words in the texts to be used. The decodability of assessment texts used is not discussed in the “Lost Girls” report.
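Put another way, here’s a minimal sketch of the inference problem (the function and its labels are purely illustrative): a low score on a written comprehension task only points to a comprehension problem once decoding of that particular text has been checked separately.

```python
# Purely illustrative: what a low score on a written comprehension task
# can (and can't) tell you about the underlying cause.

def interpret_low_written_score(decoding_of_text_verified: bool) -> str:
    if decoding_of_text_verified:
        return "likely a comprehension problem: assess oral language"
    return "uninterpretable: could be decoding, comprehension, or both"

print(interpret_low_written_score(decoding_of_text_verified=False))
```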

Fortunately, the TES article that got me cross enough to write this blog gives the last word to a sensible-sounding person from the UK Department for Education, who points out that the UK’s increased focus on phonics has improved children’s reading skills, and that these changes are also helping kids do stuff that presses everyone’s happy buttons (and in which more kids can now successfully participate), like joining libraries and book clubs.
