Today’s post is a bit more informal and personal than usual.
When I first started attending Learning and the Brain conferences in 2008, I looked at the presenters as Speakers of Truth from a Platform of Verity.
They KNEW THINGS. They had DONE RESEARCH.
I wasn’t there to ask questions. I was there to write down what they told me.
Evolving Perspective
Over the last ten years, I’ve learned to think differently about the relationship between research and understanding. Research always begins as a cheerfully contentious conversation among competing theories.
Can people pay attention for more than 10 minutes? Researchers argue about that.
Does retrieval practice have limits? Researchers argue about that.
I’m also gradually learning to think differently about researchers themselves.
In the past, they struck me as distant, awe-inspiring figures. They were busy, out questing for truth.
I would no sooner have interrupted a researcher to ask a question than I would have interrupted a surgeon mid-slice. They’ve got better things to do.
And yet, I’m learning how eager many researchers are to connect with teachers.
Today’s Uplifting News
In the last two weeks, I have sent emails to six different researchers, asking them questions about the classroom implications of their work.
To be clear: I’ve never met any of these researchers. I’ve certainly never had the chance to do anything for them. I was, in other words, a total stranger asking a question out of the blue.
You know what? Five of those six have responded; three of them responded in about 2 hours. (I’m still hoping to hear from #6.)
In fact, they all responded substantively and enthusiastically. They liked my questions, had specific suggestions, and pointed me to other articles to check out.
They didn’t see my question as an intrusion, but as an invitation to a teaching conversation they were glad to join.
I’m not naming these researchers here because I don’t want them to be swamped with email. But I do hope you feel as encouraged as I do. If you’ve got a question about the study you just read — for example, how best to make it work in your classroom — you just might reach out to the study’s author.
You might very well start a fascinating conversation.
Regular readers of this blog know that I’m very skeptical about training working memory. Despite all the promises, most studies show that WM training just doesn’t do very much.
Better said: working memory training helps people do better on other, similar working memory tests. But it doesn’t help students learn to read or calculate or analyze any better.
But here’s a tantalizing possibility: what if we could find an even better shortcut to cognitive success?
Training Working Memory: News from Finland
Researchers at Åbo Akademi University in Turku wondered why WM training works in psychology labs, but not in classrooms.
(One of the champions of WM training — Dr. Susanne Jaeggi — has spoken at Learning and the Brain conferences. If you’ve seen her, you know she’s an incredibly impressive researcher. You too might reasonably wonder why that research isn’t panning out.)
These Finnish researchers wondered if the WM training simply gave students the chance to figure out a particular WM strategy.
That is: they didn’t have more working memory. But, they were using the WM they already had more strategically.
This strategy applies to the specific working memory task (which is why their WM scores seemed to get better), but doesn’t transfer to other cognitive work (like math and reading).
If that hypothesis is true, then we could simply tell our students that strategy. We would then see the same pattern of WM development that comes from the training — only much faster.
Specifically, we would expect to see improvement in similar WM tasks — where students could apply the same strategy — but not on unrelated tasks — where that strategy doesn’t help.
If their hypothesis is correct, then the results that take 6 WEEKS of training might be available in 30 MINUTES. Rather than have students figure out the strategy on their own (the slow, 6 week version), we can simply tell them the strategy and let them practice (the 30 minute version).
To test this hypothesis, the researchers divided students into three groups.
Control group #1 did a WM test on Monday and a WM test on Friday. They got no practice; they got no training.
Control group #2 also did WM tests on Monday and Friday. In between, they got to practice a WM task for 30 minutes. This is a mini-version of the WM training model. (If they had gotten the full six weeks, they might have figured out the strategy on their own.)
The study group — lucky devils — were TOLD a strategy to use during their practice session. (More on this strategy below.)
What did the researchers find?
First: As they predicted, the group that was told the strategy made rapid progress, but the other two groups didn’t.
Control group #1 didn’t make progress because they didn’t even get to practice. Control group #2 did practice…but they didn’t have enough time to figure out the strategy.
Only the study group made progress because only they knew the strategy.
Second: As the researchers predicted, the group that learned the strategy didn’t get better at WM tasks unrelated to the strategy they learned.
In other words: the group given a strategy behaved just like earlier groups who had discovered that strategy for themselves during 6 weeks of practice. They did better at related WM tasks, but not at unrelated tasks.
We don’t need 6 weeks to get those results. We can get them in 30 minutes.
What, exactly, is this magical strategy?
The precise strategy depends on the working memory exercise being tested.
In general, the answer is: visualize the data in patterns. If you’ve visualized the pattern correctly, you can more easily perform the assigned WM task.
You can check out page 10 of this PDF; you’ll see right away what the strategy is, and why it helps solve some WM problems. You’ll also see why it doesn’t particularly help with other WM tasks — like, for example, understanding similes or multiplying exponents.
Training Working Memory: Classroom Implications
This research suggests that we shouldn’t train students’ general WM capacity, because we can’t. Instead, we should find specific WM strategies that most resemble the cognitive activity we want our students to do.
Those strategies allow students to use the WM they have more effectively. With the same WM capacity, they can accomplish more WM work.
The key question is: what WM strategies are most like school tasks?
We don’t yet know the answer to that question. (I’ve reached out to the lead author to see if she has thoughts on the matter.)
I do have a suspicion, and here it is: perhaps the practice that we’re already doing is the best kind. That is: maybe the working memory exercise that’s most like subtraction is subtraction. The working memory exercise most like reading is reading.
If I’m right, then we don’t need to devise fancy new WM exercises. The great news just might be: the very best WM exercise already exists, and it’s called “school.”
Here’s a counter-intuitive suggestion: perhaps we might reduce stress by writing about failure.
Truthfully, that seems like an odd idea.
After all, it seems logical to think we could reduce stress by writing about puppies, or our favorite grandparent, or a happy holiday memory.
But: writing about failure? Wouldn’t that just add to the stress?
Take 1: Writing Reduces Stress
Earlier research has shown that we can reduce stress by writing.
For example, Ramirez and Beilock placed students in a high-pressure academic situation. Each student had to take a difficult math test. Even more stressful, another student’s reward depended on their score.
That is: if I perform badly, I don’t get a reward AND someone else doesn’t get a reward.
(Talk about pressure.)
Half of these students had ten minutes to sit quietly. The other half used their ten minutes “to write as openly as possible about their thoughts and feelings regarding the math problems.”
You might think that this writing exercise would ramp up the students’ anxiety levels. However, it had the opposite effect. Students who had the opportunity to write about their anxiety then felt less anxious.
In fact, when Ramirez and Beilock tested this method with 9th graders taking a biology exam, they found it improved their final scores. (This effect held for the more anxious students, but not the less anxious ones.)
Take 2: Writing about Failure Reduces Stress
DiMenichi’s team asked some students to write for ten minutes about a “difficult time in which they did not succeed.” (Students in the control group spent ten minutes summarizing the plot of a recent movie they had seen.)
They then asked these students to talk extemporaneously in a mock interview for their dream job; they were told they’d be evaluated by a “speech expert” while they spoke. To add to this devilish stress test, they then had to solve math problems in their head. (When they made mistakes, they had to start over at the beginning of the sequence.)
Sure enough, as they predicted, DiMenichi & Co. found that students who wrote about a prior failure were less stressed by this ordeal than the students who had summarized a movie.
That’s right: writing about a prior failure reduced stress.
Did that reduced stress benefit these students?
Well, researchers then asked all the students to try an attention test. They saw letters flash on a computer screen, and had to press the space bar when they saw a consonant. However, when they saw a vowel, they did NOT press the space bar.
As you can imagine, this test requires both attention and inhibition. Once I’ve gotten used to pressing that space bar, I’ve got to restrain myself when I see a vowel.
The students’ stress levels made a big difference.
Students who had written about failure (and who therefore felt less stress) averaged roughly 7.75 mistakes on this test.
Students who summarized the movie (and who therefore felt more stress) averaged 13.5 mistakes.
That’s almost twice as many! (For stats lovers, the d value is 1.17.)
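For stats lovers who’d like to see where a number like that comes from: Cohen’s d is the difference between two group means divided by their pooled standard deviation. Here’s a minimal sketch in Python — only the two means (7.75 and 13.5 mistakes) come from the study; the standard deviations and group sizes below are invented purely for illustration:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: standardized difference between two group means,
    using the pooled standard deviation."""
    pooled_sd = math.sqrt(
        ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    )
    return (mean2 - mean1) / pooled_sd

# Hypothetical numbers: 30 students per group, SD of ~5 mistakes each.
# Only the means come from the study.
d = cohens_d(7.75, 5.0, 30, 13.5, 5.0, 30)
print(round(d, 2))  # → 1.15
```

With these made-up spreads, the sketch lands near the study’s reported d of 1.17 — and anything above 0.8 is conventionally considered a “large” effect.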
Classroom Implications
We all know students who need some stress reduction in their lives. And, we’ve all heard different ways to get that job done.
These studies, and others like them, suggest that this counter-intuitive strategy might well be helpful to the anxious students in our classrooms. If students can off-load their stress onto paper, they’ll feel less anxious, and be more successful in their schoolwork.
The best way to make the strategy work will depend on the specifics of your situation: the age of your students, the school where you teach, the personality you bring to the classroom.
I myself would be sure to explain why I wanted my high-school students to do this assignment before I asked them to give it a try.
If you attempt to use this approach, send me an email and let me know how it goes: [email protected].
(By the way: if you’re interested in the science of good stress, click here.)
Anyone who works with teenagers — teachers and parents — wonders about the mystery of adolescent self-control.
At times, they prove capable of magnificent cognitive accomplishment.
(A high-school junior I taught once composed a new soliloquy for Hamlet. Speaking of Claudius — the uncle who murdered Hamlet’s own father — Hamlet says: “My unfather unfathered me.” I think the Bard himself envies that line.)
And, at other times, they baffle us with their extraordinary foolishness.
(At the next Learning and the Brain conference, ask me about the teens who kidnapped a teacher’s dog as a gesture of respect and affection.)
Catherine Insel, working as part of Leah Somerville’s lab, wondered if teens recognize the difference between high stakes and low stakes. Better said: she wanted to know if they behaved differently in those distinct settings.
She had students aged 13-20 perform a “go/no-go task.” When they saw a blue circle or a yellow circle or a purpley circle, they pressed a button. When they saw a stripey circle, they did NOT press the button. That is, they had to inhibit the instinct to press the button.
That’s a kind of self-control.
Some of the time, they faced small rewards and penalties: plus twenty cents for getting it right, minus ten cents for getting it wrong.
Some of the time, they faced larger rewards and penalties: plus one dollar for getting it right, minus fifty cents for getting it wrong.
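To make the task concrete, here’s a minimal sketch of how scoring might work in the low-stakes condition. The payoffs mirror the description above; the stimulus labels, trial list, and everything else are invented for illustration:

```python
# Go/no-go scoring sketch: press for plain circles ("go"),
# withhold for striped circles ("no-go"). Payoffs mirror the
# low-stakes condition described above (+$0.20 right, -$0.10 wrong);
# the stimulus labels and trials are hypothetical.

GO_STIMULI = {"blue", "yellow", "purple"}   # press the button
NOGO_STIMULI = {"striped"}                  # inhibit the press

def score_trial(stimulus, pressed, reward=0.20, penalty=0.10):
    """Payoff for one trial: correct means pressing on a 'go'
    stimulus or withholding on a 'no-go' stimulus."""
    correct = (stimulus in GO_STIMULI) == pressed
    return reward if correct else -penalty

# A short block of trials: (stimulus, did_the_participant_press)
trials = [("blue", True), ("striped", True),
          ("yellow", True), ("striped", False)]
total = sum(score_trial(s, p) for s, p in trials)
print(f"${total:.2f}")  # → $0.50
```

Note the second trial: the participant pressed on a striped circle — a failure of inhibition — and lost a dime. That lapse is exactly what the task measures.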
You might predict that adolescents would be more careful when the stakes were higher. That is, their score would be better when a WHOLE DOLLAR was on the line.
But: nope. That’s not what happened.
Students aged 13-18 did equally well on low- and high-stakes tasks. Only the 19- and 20-year-olds were measurably better at high-stakes than low-stakes.
Put simply: adolescents simply didn’t respond to the difference between high-stakes and low-stakes tests.
Adolescent Self-control: The Brain Part
So far, Insel and colleagues were looking at behavior; that’s the study of psychology. They also looked at brain differences; that’s the study of neuroscience.
In particular, they focused on two brain areas.
The pre-frontal cortex — the part of the brain just behind the forehead — helps manage “higher” brain processes, such as inhibition.
The striatum — deep in the center of the brain — is a key part of the “reward network,” and influences motivation and decision-making.
(By the way, almost ALL brain regions — including the pre-frontal cortex and the striatum — participate in MANY different brain functions.)
They found that the connection between these regions matures over time.
That is, the self-control functions of the pre-frontal cortex are increasingly able to manage the reward networks of the striatum.
No wonder, then, that adolescents get better at controlling their impulses. Only gradually does the “control” part of the brain take firm control over the “impulse” part of the brain.
Teaching Implications
Insel’s research shows not only THAT teens don’t effectively distinguish between high- and low-stakes; it helps explain WHY they don’t: the appropriate brain networks haven’t fully matured.
This research suggests that high-stakes testing just might not be developmentally appropriate for this age group.
After all: adults recognize the importance of high-stakes work. We know to prepare for job interviews differently than we do for daily meetings. We know to be on our best behavior when we meet potential future in-laws; perhaps we relax a bit once they’re actual in-laws.
Teens, however, just don’t recognize that distinction as well.
In other words: if you needed another reason to downplay high-stakes testing, Insel and Somerville’s research provides just that.
More to Know
If you’re particularly interested in this topic, we’ve posted about it frequently on this blog.
Here’s a link to Somerville’s work, in which she explores the boundaries between adolescence and adulthood.
Here’s a Ted-talk by Sarah-Jayne Blakemore exploring the mysteries of adolescence.
Richard Cash is running an LatB Workshop specifically on self-regulation. You can check it out here. And, I’m running a Learning and the Brain workshop on teaching adolescents in April. Click here if you’re interested in learning more.
Several days ago, I posted some thoughts about the benefits of Direct Instruction. That post specifically contrasted the benefits of DI with the perils of Inquiry Learning. Specifically, Hattie finds Inquiry Learning to be largely ineffective.
The Learning Scientists have also published some skeptical thoughts about Inquiry Learning. In their most recent weekly digest, to promote balance, they offer links to some pro-Inquiry-Learning counter-arguments. If you’re an IL skeptic, you might want to check them out.
Assessing Inquiry Learning: What’s a Teacher to Do?
When we face conflicting evidence about a particular pedagogy, we can always focus instead on specific cognitive capacities.
For example: working memory.
If an Inquiry Learning lesson plan ramps working memory demands up too steeply, then students probably won’t learn very much.
Of course: if a Direct Instruction lesson plan ramps up WM demands, then those students won’t learn very much either.
The key variable — in this analysis — is not the specifics of the pedagogical approach. Instead, teachers can focus on the match between our teaching and the cognitive apparatus that allows learning.
In other words: overwhelming working memory is ALWAYS bad — it doesn’t matter if your lesson plan is DI or IL.
The same point can be made for other cognitive capacities.
Lesson plans that disorient students — that is, ones that interfere with attention — will hamper learning. So too motivation. So too stress.
When assessing Inquiry Learning, don’t ask yourself “does my lesson plan fit this pedagogical theory perfectly?” Ask yourself: “does my lesson plan realistically align with my students’ cognitive systems?”
The answer to that question will give you the wisest guidance.
In the debates between “progressive” and “traditional” educational theories, few arguments rage hotter than the battle between project-based learning and direct instruction.
PBL’s proponents take a constructivist perspective. They argue that people learn by building their own meaning from discrete units of information.
In this view, teachers can’t simply download conclusions into students’ brains. We can’t, that is, just tell students the right answer.
Instead, we should let them wrestle with complexities and come to their own enduring understanding of the material they’re learning.
An Alternative Perspective: The Benefits of Direct Instruction
In a recent meta-analysis, Jean Stockard’s team argues that direct instruction clearly works.
Looking at 300+ studies from over 50 years, they conclude that DI benefits students in every grade, in a variety of racial and ethnic groups, with a variety of learning differences, from every socio-economic background.
Of course, this research conclusion challenges some often-repeated assurances that direct instruction simply can’t help students learn.
(The recent meta-analysis is, unfortunately, behind a paywall. You can, however, see some impressive graphs in an earlier white paper by Stockard.)
Another Alternative Perspective: Reinterpreting “Constructivism”
Interestingly, Stockard doesn’t disagree with a constructivist understanding of learning. Instead, she sees direct instruction as a kind of constructivism.
“DI shares with constructivism the important basic understanding that students interpret and make sense of information with which they are presented. The difference lies in the nature of the information given to students, with DI theorists stressing the importance of very carefully choosing and structuring examples so they are as clear and unambiguous as possible.”
(This quotation comes from a brief pre-publication excerpt of the meta-analysis, which you can find here.)
In other words: in Stockard’s view, the difference between PBL and DI isn’t that one is constructivist and the other isn’t.
Instead, these theories disagree about the kind of information that allows students to learn most effectively.
Simply put: PBL theorists think that relatively more, relatively unstructured information helps students in their mental building projects. DI theorists think that relatively less, relatively tightly structured information benefits students.
Stockard makes her own views quite plain:
“It is clear that students make sense of and interpret the information that they are given–but that their learning is enhanced only when the information presented is explicit, logically organized, and clearly sequenced. To do anything less shirks the responsibility of effective instruction.”
You might mentally add a “mic drop” at the end of that passage.
Other Sources
Of course, lots of people write on this topic.
John Hattie’s meta-meta-analyses have shown DI to be quite effective. This Hattie website, for example, shows an effect size of 0.60. (For Problem based learning, it’s 0.12; for Inquiry based teaching, it’s 0.35.)
If you like a feisty blogger on this topic, Greg Ashman consistently champions direct instruction.
And, I’ve written about the difficulties of measuring PBL’s success here.
If you’re like me, your students often ask to hold class outside. Especially on a beautiful spring day…
But do your students have a point? Might there be good reasons to move class outside every now and then?
Outdoor Class Advantage: What We Know
We’ve already got research suggesting that your students might be on to something.
Some researchers suggest that classes outside help restore student attention.
Other studies (here and here) indicate that outdoor classes might enhance student motivation as well.
We’ve even got reason to think that exposure to green landscape helps students learn. For example: this study in Michigan suggests that natural views improve graduation rate and standardized test scores.
None of the evidence is completely persuasive, but each additional piece makes the argument even stronger.
Outdoor Class Advantage: Today’s News
If I’m a skeptic about outdoor class, I might make the following argument. Outdoor classes might be good for that particular class. However, they might be bad for subsequent classes.
That is: students might be so amped up by their time outside that they can’t focus when they get back indoors.
To explore this concern, Ming Kuo and colleagues put together an impressive study.
Over ten weeks, two teachers taught several pairs of lessons. Half of the time, the first lesson was taught outside. For the other half, the first lesson was taught inside.
Researchers then measured students’ attentiveness during the second lesson in these pairs.
The results?
The Results!
Students were more attentive — A LOT more attentive — after outdoor classes than indoor classes.
In almost 50% of the lessons, attention was a full standard deviation higher after outdoor classes. In 20% of the lessons, it was two standard deviations higher.
Technically speaking, that difference is HUGE.
(By the way: the researchers came up with several different ways to measure attention. Outdoor classes led to improved attention in four of the five measures.)
The Implications
This research suggests that teachers needn’t worry about outdoor classes leading to distraction in subsequent classes.
That finding doesn’t necessarily mean that outdoor classes benefit learning, but it does mean we have fewer potential causes for concern.
Occasionally I try to persuade people that neuroscience is fantastically complicated. In other words: we shouldn’t beat ourselves up if we don’t master it all.
Today I spotted a headline that makes my point for me:
Hippocampus-driven feed-forward inhibition of the prefrontal cortex mediates relapse of extinguished fear
Got that?
What’s the Bigger Point?
Neuroscience is simply fascinating. As teachers, we really want to know how neurons work. And synapses. And brain regions — like the hippocampus and the prefrontal cortex.
However, specific teaching advice almost always comes from psychology. How do teachers help students connect neurons to create memories? Psychology. What classroom strategies support executive function in the prefrontal cortex? Psychology.
At a LatB Conference, you’ll enjoy the neuroscience talks because they show you what’s going on underneath the hood. At the psychology talks, you’ll get specific classroom suggestions.
The best conference experience, in my opinion, combines both.
Given all the benefits that come from retrieval practice, we should surely encourage our students to use this technique as much as possible. How can we best motivate them to do so?
Three researchers in Europe offer this answer: subtly.
More specifically, their research finds that offering students extrinsic rewards for their retrieval practice reduced its effectiveness.
Students offered rewards made more mistakes when they first tried to recall information, and (even taking those initial errors into account) remembered less than their fellow students who had received no enticement to practice.
In this study, the extrinsic rewards were cash payments: students received a euro for every correct answer. In schools, we rarely pay students money to get correct answers. However, we quite often pay them with grades.
This study suggests that retrieval practice should, as much as possible, come in the form of very-low-stakes or no-stakes retrieval.
As schools focus more on STEM disciplines, we teachers strive to help our students master complex STEM concepts.
After all, it’s hard enough to say “magnetic anisotropy,” much less understand what it is.
Researchers Dane DeSutter and Mike Stieff have several suggestions for teachers. Specifically, they argue that spatial thinking–essential to many STEM concepts–can be enhanced by appropriate gestures.