By guest blogger David Robson
We may have multiple sensitive periods for different kinds of skills, depending on the brain’s development at that time.
If you want to maximise a person’s intellectual potential, the long-standing consensus has been that you need to start young. According to this traditional view, early childhood offers a precious “window of opportunity” or “sensitive period” for learning, which closes slowly as we reach adolescence. It’s the reason that toddlers find it easier to master the accent of a foreign language, for instance.
This view has even shaped educational policy. Some psychologists have argued, for instance, that interventions to help people from disadvantaged backgrounds would do better to target primary schools, with diminishing returns for interventions later in life, as if badly performing teenagers were something of a lost cause.
Sarah-Jayne Blakemore at University College London has spent the last decade overturning some of these assumptions, showing that the adolescent brain is still remarkably flexible as it undergoes profound anatomical changes. “The idea that the brain is somehow fixed in early childhood, which was an idea that was very strongly believed up until fairly recently, is completely wrong,” she told Edge in 2012. The transformation is particularly marked in the prefrontal lobes (located behind the forehead) and the parietal lobes (underneath and just behind the top of your head): two regions that are involved in abstract thought.
The upshot is that teenagers may go through a second sensitive period, in which they are particularly responsive to certain kinds of intellectual stimulation. A new paper from Blakemore’s lab, published in Psychological Science, builds on this idea, showing that our ability to learn certain kinds of analytical skills doesn’t diminish after childhood, but actually increases through adolescence and into early adulthood.
The team – led by Lisa Knoll – recruited more than 600 participants aged 11 to 33 and randomly assigned them to three groups, each trained in a different skill. They taught the first group “numerosity discrimination”, which involved rapidly estimating the number of coloured dots appearing on a screen. They trained the second group in “relational reasoning”: the ability to detect abstract rules and relationships using the kind of non-verbal puzzles (known as Raven’s Matrices) that are common in some IQ tests (see image left). The third group, meanwhile, honed their facial perception: they judged repeatedly whether two photos, shown in rapid succession, represented the same person or not.
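To make the first of those tasks concrete, here is a minimal sketch of a single numerosity-discrimination trial. The dot counts, the simulated participant’s noisy guess and the scoring rule are all my own illustrative assumptions, not the procedure Knoll and colleagues actually used.

```python
import random

def numerosity_trial(min_dots=5, max_dots=30):
    """Simulate one numerosity-discrimination trial.

    A true dot count is drawn at random (standing in for a briefly
    flashed display of coloured dots), a simulated participant makes
    a rapid guess, and the trial is scored by how far the guess was
    from the truth.
    """
    true_count = random.randint(min_dots, max_dots)
    # Real participants estimate rather than count, so we perturb
    # the true count by a small random error to mimic a quick guess.
    guess = true_count + random.randint(-3, 3)
    error = abs(guess - true_count)
    return true_count, guess, error

true_count, guess, error = numerosity_trial()
print(f"shown {true_count} dots, guessed {guess} (error {error})")
```

In the real study, improvement would show up as this error shrinking (or discrimination becoming more reliable) over the twenty days of training.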
It’s worth noting that these skills don’t require any advanced factual knowledge. Instead, they represent a broader capacity for abstract thought and pattern recognition that might be useful for many kinds of academic work.
The online training sessions were short but frequent, lasting a maximum of 12 minutes a day for 20 days. The participants were then tested – with the same kinds of puzzles used in training – shortly after they had finished their last session, and again a few months later, to see whether the skills had stuck.
Sadly, the participants in the third group would have been disappointed if they ever had hopes of being a “super-recogniser” – their facial perception showed next to no improvement.
But the hard work paid off for many of the other participants: those trained in relational reasoning and in numerosity discrimination both boosted their scores. Based on these successes, Blakemore, Knoll and colleagues next examined whether those gains depended on the participants’ age.
According to the accepted wisdom, they should have seen the greatest dividends among the youngest subjects, aged 11-13, who were closest to that childhood window of opportunity. But as the psychologists had expected, the older participants turned out to gain the most from the training. The late adolescents (aged 16-18) improved their relational reasoning by around 10 per cent, for instance: nearly twice the gains of younger participants. Even the adults in the group tended to perform better than the youngsters, suggesting that our 20s and early 30s may still be a fertile time for self-improvement.
At least in these kinds of analytical skills, the window of opportunity was still wide open, perhaps reflecting the greater neuroplasticity – the ability to forge new neural circuits – of the prefrontal cortex that Blakemore had previously observed in the brain scans. If so, it supports her idea that we may have multiple sensitive periods for different kinds of skills depending on the brain’s development at that time. As one window of opportunity closes, another may open.
There are other potential explanations, though. The researchers tried to control for the participants’ motivation, showing that the results still held when you took into account the number of completed training sessions. It seems the older participants weren’t just trying harder. But it’s also possible that the teenagers and adults had developed better cognitive strategies for learning: deliberate mental procedures that don’t necessarily reflect greater anatomical plasticity.
Given the recent controversies surrounding brain training, it’s also worth noting that we don’t know whether these improvements led to meaningful changes in other areas of the participants’ lives. Blakemore and Knoll couldn’t find any corresponding boost in working memory, for instance, and we don’t know if these abstract analytical skills would correspond to a greater aptitude for mathematics or science at school or university.
Even so, I’m intrigued by the potential implications and the promise this holds for future research. The study adds further evidence to the growing body of work showing that adolescence is a fascinating and potentially fertile period of intellectual growth. Contrary to our sometimes dim view of teenagers, there’s much more to those years than the acne and the angst.
Raven’s Matrices image credit: Wikimedia Commons.