May 2023 – Education & Teacher Conferences
Should Teachers Explain or Demonstrate?
Andrew Watson

If I were a chess teacher, I would want my newbies to understand …

… how a bishop moves,

… how castling works,

… what checkmate means.

To help them understand, I could…

show them (“see how this piece moves; now see how that piece moves”)

tell them (“checkmate is defined as…”).

Both strategies sound plausible. Both probably help, at least a little bit.

Is one better than the other?

Today’s Research

I recently came across a fascinating study that explores this question.


In this research, two strangers met over an online puzzle — sort of a maze with prizes at the end of various paths.

Sometimes, one stranger could EXPLAIN to the other the best strategy to get the most points. (“Get the pink triangles, then the hollow squares, then the green circles.”)

Other times, one stranger could SHOW the other the winning path. (“Watch me go this way, now this way, now this way.”)

Which method worked better, show or tell?

PLOT TWIST.

In this case, the answer depended on the complexity of the puzzle.

For simple puzzles, both methods worked equally well.

For complex puzzles, telling helped more than showing.

I would have been surprised if there were a straightforward answer to the question; I am, therefore, more inclined to believe this “it depends” answer.

Take Two

This result — explaining complexity > showing complexity — prompted the researchers to test a second hypothesis.

In this case, the research details get very tricky, so I won’t go into them. But the basic idea was:

Perhaps both words and actions can explain concrete things, but

Perhaps words do better than actions at explaining abstract things.

Sure enough, the second experiment supported that hypothesis.

As the researchers say in their first paragraph:

Our findings suggest that language communicates complex concepts by directly transmitting abstract rules. In contrast, demonstrations transmit examples, requiring the learner to infer the rules.

In brief, the more abstract and complex the concept, the more important the words.

Teaching Implications?

Before we rush to reform our teaching, we should notice several key points about this study:

It involved adults working with other adults, and strangers working with strangers.

The participants were not — as far as I know — teachers. That is: they had neither expertise nor training in helping others understand.

The task involved (sort of) solving mazes. I’m an English teacher; my teaching — and perhaps your teaching — doesn’t focus on the kind of mental activity that maze-solving requires.

In other words, because this research differs A LOT from typical classroom work, its findings might not apply precisely to classroom work.

Teaching Implications!!

That said, this study reminds me of an important lesson:

Practice. My. Words.

That is: when I’m explaining a concept to my students for the first time, I should script and rehearse my explanation carefully.

Now, because I’ve been teaching for a few centuries, I’m occasionally tempted to wing it.

Yes, “indirect object” is a tricky concept … but I understand it well, and I’ve explained it frequently over the years, and I’m sure I’ll do just fine…

No, wait, stop it. This research reminds me: words really matter for helping students understand abstractions.

I need to get those words just right, and doing so will take time, thought, and concentration. (Ollie Lovell emphasizes a similar idea when he writes about the importance of “bullet-proof definitions”; for instance, in this book.)

A second point jumps out at me as well.

This study contrasts showing and telling. Of course, most of the time we combine showing and telling.

As I’ve written before, Oliver Caviglioli’s Dual Coding offers a comprehensive, research-informed exploration of this complex blend.

When I think about dual coding, I typically focus on the “showing/drawing” half of the “dual.” This study, however, reminds me that the “telling” part is equally important — and, in the case of highly abstract concepts, might even be more important.

 

In brief, in my chess classroom:

I can simply show my students how bishops move: that’s easy.

But “checkmate” is complex. I should both show and tell — and get the telling just right.


Sumers, T. R., Ho, M. K., Hawkins, R. D., & Griffiths, T. L. (2023). Show or tell? Exploring when (and why) teaching with language outperforms demonstration. Cognition, 232, 105326.

Book Review: Teaching Secondary Science, by Adam Boxer
Andrew Watson

Let’s start by making this simple:

First: You should absolutely buy Adam Boxer’s Teaching Secondary Science: A Complete Guide. Sooner is better than later.

Second: You will probably not READ Boxer’s book so much as you will STUDY it. Have a pen handy; some sticky notes; your favorite memory app. Whatever system you use to keep track of big ideas and vital details — have it ready to work.

Now that I’ve been bossy, let me explain why.

Two Big Surprises

Surprise #1:

I myself don’t teach high-school science. (I taught 10th and 12th grade English, and worked at a summer camp for 8-14 year olds.)

So, the title (Teaching Secondary Science) might suggest that the book isn’t for me.

Well, Boxer’s book (and the precision of his thinking) will absolutely make me a better English teacher; I suspect his approach will benefit almost any teacher.

Here’s why…

Surprise #2:

Longtime readers know my mantra: “don’t just do this thing; instead, think this way.”

That is: cognitive science research cannot provide us with a script (“do this thing”). Instead, that research CAN give us ways to think about memory and attention and motivation and stress. When we “think this way” about those topics, we’ll have better ideas about our teaching.

Well, Boxer’s book comes as close as any to effectively defying this mantra.

His book includes a GREAT MANY “do this thing” kind of instructions.

Phrase your question this way, not that way.

Present topics in this order, not that order.

Calculate cognitive load with this formula, not that formula.

You might think, given my mantra, that I’d resist the specificity of his advice.

And yet, over and over, I found myself agreeing with his logic, and believing that I’ll do better classroom work if I understand and follow several of his scripts.

To my astonishment, I’m highly tempted to “do things Boxer’s way.” Why? Because he’s already done so much thinking for me.

Case in Point

I recently discussed Boxer’s book with a group of friends. All of us had highlighted this specific advice:

When introducing a concept, start with examples, not definitions.

Why?

Because definitions are necessarily abstract, and abstraction increases working memory load.

Examples, in contrast, live comfortably in the familiar, concrete world. This very familiarity and concreteness reduce WM load, and thereby make learning easier.

When my friends and I tried to apply this advice to our own teaching world, we immediately saw its usefulness.

The Spanish teacher said: don’t start with the abstract definition of the subjunctive; start with familiar examples in English.

The PD provider said: don’t start with abstract definitions of “declarative” and “procedural” memory; start with concrete classroom examples.

And so forth.

Two points merit notice here.

First: although Boxer writes about science instruction, his guidance applies widely across disciplines and age groups.

Second: although Boxer’s advice stems from (abstract) cognitive psychology, he frames it in (concrete) teaching suggestions.

That is: over and over, Boxer’s book practices what it preaches. His book does what he tells us teachers should do.

You perhaps have heard a conference speaker give passionate teaching advice (“never talk for more than ten minutes!”), only to defy this advice in his hour-long talk. Boxer carefully avoids such hypocrisy.

The Big One

A few of my opinions in this interdisciplinary field approach heresy. Here’s one:

In my view, cognitive load theory helps experts talk with other experts about working memory load in the classroom.

Paradoxically, however, cognitive load theory almost certainly overwhelms the working memory of non-experts. It is, after all, complicated and jargony. (Quick: define “element interactivity” and “germane load.”)

For that reason, cognitive load theory probably isn’t useful as a framework for discussing working memory load with teachers. (Several people whom I admire are howling as they read these paragraphs.)

Boxer does not articulate this heretical claim directly. However, he enacts its conclusion quite directly.

That is: he translates the abstractions of cognitive load theory into a concrete formula — a proportionality formula using words anyone can understand.

Rather than reproduce the mathematical version of the formula here, I’ll summarize it this way:

Task complexity and abstraction increase working memory load.

The student’s background knowledge and the teacher’s support reduce working memory load.

Therefore, to optimize working memory load, we should look out for those four variables and manage them appropriately. (He’s got CHAPTERS on each of those topics.)
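If you like seeing that kind of relationship spelled out, here is one rough way to render the summary above as a proportionality. To be clear: this is my own sketch of the idea, not Boxer’s actual formula from the book.

$$
\text{working memory load} \;\propto\; \frac{\text{task complexity} \times \text{abstraction}}{\text{background knowledge} \times \text{teacher support}}
$$

However you write it, the practical point is the same: two of those variables push the load up, and two pull it down.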

If you speak cognitive load theory, you see exactly how Boxer has translated its abstractions into this concrete formulation.

But — crucially — you don’t need to speak cognitive load theory to get its benefits.

Boxer, again, has taken his own advice. He has started with concrete examples rather than abstract definitions; he has thereby made it MUCH easier to learn from this book.

Always with the Limitations

Having raved for several hundred words, let me add a few quick notes of caution.

First: I don’t agree with absolutely everything Boxer writes. (I don’t agree with absolutely everything I write.) For instance: he emphatically champions mini whiteboards; I don’t think they’ll work in my context.

Second: Boxer’s examples draw on science teaching, in high school, in England. All three of those particulars require some degree of translation as you apply his ideas to your own work.

The English education system thrives on mysterious acronyms; you’ll just have to figure them out. When the SLT talks with the NQT about Supply, well, I can’t help you there.

Third: Full disclosure, I should point out that Boxer’s publisher is also my publisher — so I might have a conflict of interest in writing such an enthusiastic review. I certainly don’t think this connection has skewed my perspective, but you should have that information to make your own decisions.

These few points aside, I return to my initial hearty recommendation.

When you read and study Boxer’s Teaching Secondary Science, you’ll get specific and wise guidance for applying the abstractions of cognitive science to your classroom.

You’ll enjoy it, and your students will learn more.

Outsmart Your Brain by Daniel Willingham
Erik Jahner, PhD

Daniel Willingham’s influential insights have greatly contributed to the field of neuroscience and education, and they align closely with my own work in this area. His critique of learning styles and his debunking of common learning myths and neuromyths have been pivotal. His critique of the premature application of neuroscience to the classroom advocates for translational relevance to teachers and for ecological validity. Many of us in the field have been driven by his many talks, books, and insights.

Who of us has not invested extensive hours exploring new study methods and learning techniques, experimenting with trendy apps, organization routines, and innovative solutions to overcome our learning challenges? Often, we seek a quick fix or one of the supposed magic pills that line the self-help sections of bookstores. However, it’s easy to lose sight of the essence of learning amid the pursuit of strategies, most of which rest on unsupported tradition. While this book doesn’t offer instant solutions, it effectively helps distinguish valuable approaches from ineffective ones. Its well-organized collection of learning hacks and tips assists in separating the wheat from the chaff. Daniel stresses that learning requires time and effort—inevitable aspects that can’t be bypassed. Nevertheless, Outsmart Your Brain: Why Learning is Hard and How You Can Make It Easy empowers readers to optimize their time and maximize their learning outcomes. As an educator and lifelong learner, I found this book brimming with invaluable insights to evaluate and enhance my own performance across educational domains.

The book’s origins reveal its essence. Daniel, a college instructor, watched his students struggle with learning difficulties. Over time, he honed his ability to identify their problems but lacked a clear solution. This prompted the cognitive scientist and ambitious educator to embark on a journey evaluating and verifying the real-life applicability of various memory and learning strategies derived from cognition and memory research, investigating what actually made sense. The result of this ongoing exploration is the book at hand, which undoubtedly lives up to its promise.

This book serves as an indispensable and practical user manual for becoming a better student and teacher. Its value extends particularly to learning skills specialists, college students seeking lecture support, parents aiming to guide their children’s learning, and the college instructor who wants to do more than act as a sage on the stage, simply reading off notes. The book’s well-structured chapters can be used as weekly lessons or applied individually.

Thematically organized, the book covers a wide array of essential topics to enhance learning and academic performance. It offers guidance on understanding lectures, effective note-taking strategies, making the most of labs, activities, and demonstrations, and techniques to enhance note organization for improved comprehension. Additionally, it provides valuable insights on tackling challenging readings, studying effectively for exams, assessing exam readiness, and approaching different test formats. Turning to social-emotional skills, Daniel also covers efficient work planning, overcoming procrastination through highly effective tips, maintaining focus, building self-confidence, and coping with anxiety.

The strategies presented in the book strike a balance between empirical research, realism, and accessibility. It acknowledges “education in the wild,” delving into what students truly encounter and guiding them in the right direction. Dr. Willingham expertly supports these concepts with personal anecdotes from his own experiences as an educator, effectively illustrating how to make learning enjoyable and, consequently, more successful. Furthermore, the book is enriched with citations that bolster the suggested strategies and challenge traditional approaches that have been proven to be ineffective – ones we still prolifically teach and practice. I hope you notice, as I did, that each chapter can be a friendly audit of our practices and beliefs about learning, urging us to evaluate how students and teachers foster successful learning and prompting tangible improvements. As I reflect on my own study years, I genuinely regret not having had access to this book. However, with my students, this book gives us some tools to be better learners together.

The Potential Benefits of Doodling and Chatting
Andrew Watson

This post will cover two topics simultaneously.

First, I’m going to describe recent research into the benefits (?) of doodling.

Second, I’m going to use a cool new artificial intelligence thingy to explore that research.

I found both fascinating; perhaps you will too.

The Story Begins

I don’t doodle. However, I know that some folks like to doodle; they say it helps them relax and focus. So, over the years, I’ve gotten increasingly zen about Team Doodle.

If doodlers don’t distract non-doodlers, I say: let the sketching begin!


Recently I stumbled across actual research about doodling. I’m always happy to update my prior (mis)conceptions, so I reviewed this study enthusiastically.

At the same time, I thought it would be a fun opportunity to try out a new artificial intelligence platform: chatPDF.

In theory, this online tool:

… reads the PDFs you drop into it, and

… answers your questions about them.

Now, my earlier experience with chatGPT was — to put it mildly — frustrating. As I wrote then, it “hallucinated” citations for its claims.

Honestly, our field has enough difficulties without HAL just making up sources.

However, I thought, maybe chatPDF’s narrower mission will make it more effective. Why not give it a try?

So: here’s what I learned about doodling in the classroom, and about chatPDF…

First Steps

Using chatPDF couldn’t be simpler. Open the platform, drop in a PDF, start asking questions.

ChatPDF starts with a quick summary of the study:

Welcome to this informative PDF file about note-taking and doodling! In this article, you will learn about the surprising truth behind the effectiveness of doodling in reducing boredom and enhancing attention and retention of lecture material.

It also suggests a few questions you might want to ask.

What is the traditional view of doodling and fidgeting in educational contexts?
How many experiments were in this study and what was the total number of participants?
What were the findings of the study regarding the effectiveness of doodling in improving performance?

So, let’s start with the basics. I asked: “according to this study, what is the surprising truth about the effectiveness of doodling?” ChatPDF answered:

It does not appear to improve attention, retention of information, or reduce boredom or mind-wandering compared to other conditions such as note-taking or simply listening.

Wow!

In fact, in Experiment 1, participants who doodled did nominally worse on the memory assessment than those who solely listened. These findings contradict previous studies that have suggested that doodling can be an effective tool for enhancing attention and reducing boredom.

When I asked it about those contradictory studies, it responded:

Previous studies have suggested that doodling can be an effective tool for enhancing attention and reducing boredom. For example, a study by Andrade (2010) found that participants who doodled while listening to a phone message had better recall of the message content than those who did not doodle.

I followed up with some more specific questions: who were the participants? What exactly did they do in each experiment? How did the researchers measure “boredom” and “retention of information”? And so forth…

In every case, chatPDF provided prompt and clear answers. Helpfully, it also provided the page number on which it found the information — very useful for double checking!

Digging Deeper

So far, I have to say, I’m really impressed! When I asked the sorts of questions important to evaluating research, chatPDF found the answers and explained them clearly.

I started pushing a bit, asking the kinds of skeptical questions important in my work.

When I asked, “does the study mention boundary conditions?”, it responded:

Yes…The authors suggest that the lack of effect of doodling on attention and retention of information may be specific to certain types of tasks or situations.

For example, [the authors] note that previous studies have found that doodling can be effective in reducing anxiety during a stressful task (e.g., taking a test), and that it may be more effective for individuals who are more prone to mind-wandering or have difficulty focusing their attention.

A question about limitations got a list of 4 answers: everything from sample size to the fact that researchers measured retention right away — and so can’t make claims about long-term effects.

I tried to get chatPDF to speculate, but (happily) I didn’t succeed. For instance, when I asked it about the benefits of doodling for students with ADHD, it responded:

The study does not specifically address whether doodling would be helpful for students with ADHD….

Given that ADHD is characterized by difficulties with attention and focus, it is possible that some students with ADHD may benefit from doodling as a tool for improving their attention and retention of information.

However, it is important to note that the current study found no evidence that doodling improves attention or retention of information.

In brief, chatPDF offered a nuanced balance: given previous research, it’s possible that doodling would benefit students with ADHD, but this research doesn’t answer that question.

In fact, it doesn’t even ASK that question.

Changing Gears

Up to this point, I’ve been so focused on chatPDF that I haven’t responded strongly to the study itself.

If you had asked me before I read this study, I would have predicted that:

Doodling probably distracts most neurotypical students in most cases (and is therefore bad for learning), but

It just might improve focus for many students with ADHD or other learning differences (and is therefore good for learning in some cases).

Having spent a while scanning available research, I’ve updated my thinking.

As far as I can tell, there’s no obvious consensus on any point – especially because the question includes so many variables.

For instance, a few studies look at the difference between doodling-as-drawing and doodling-as-shading-in-drawings (!).

The best-known study — the Andrade study mentioned by chatPDF — focuses on doodling while listening to brief phone messages. While that’s an interesting place to start, it hardly sounds like a substantial analog for classroom experience.

So, this most recent study adds to a complex field…but the field remains (frustratingly) complex!

TL;DR

Based on this research adventure, I tentatively conclude that…

… chatPDF could be REALLY USEFUL for teachers who want to explore the specifics of education and psychology research, and

… despite strong claims to the contrary, we don’t have a good enough research basis to draw conclusions about doodling.

For the time being, we — teachers and students — should rely on our best judgment.


Spencer Mueller, E. K. (2019). Note-Taking for the Win: Doodling Does not Reduce Boredom or Improve Retention of Lecture Material (Doctoral dissertation, University of Guelph).

Andrade, J. (2010). What does doodling do? Applied Cognitive Psychology, 24(1), 100-106.

Have I Been Spectacularly Wrong for Years? Part 1
Andrew Watson

Over the years, I’ve used this blog to make several persistent arguments.

One of those arguments features in almost every post I write: context always matters.

That is: research might suggest that a particular classroom strategy works well. However, teachers should always consider the fit between that research-informed suggestion and their own teaching world.

Recently I read an article suggesting that my advice — whether or not it’s true — is almost certainly useless.

In brief: this article argues that too much focus on context and nuance increases complexity, and thereby makes the advice unpersuasive and easy to ignore. Wow.

Naturally, I followed up with the article’s author: Dr. Morgan Polikoff. This transcript of our conversation has been edited for brevity and clarity.


Andrew Watson:

Thank you, Dr. Polikoff, for taking the time to talk with me this afternoon.

I’ve been following your work on Twitter for a while because I like the topics you write about, and I’m always interested in your point of view.

So I was surprised recently to read an article you wrote that I disagreed with quite strongly.

I do want to get to the disagreement part, but before we get there, I wonder if you could tell me what it is that you do and why you do it.

Morgan Polikoff:

I’m an associate professor of K-12 policy at the USC Rossier School of Education. And my research primarily focuses on standards, assessment, and accountability policy, and curriculum policy. That’s been my bread and butter over the last 13 years at USC.


I think of myself as a very public facing academic. In addition to traditional publishing in peer-reviewed journals and getting grants, I’m really interested in bringing my research, and other people’s research, to bear on pressing policy topics — so that it is actually having an impact on policy and practice.

That was the spirit in which I wrote the piece for Larry Ferlazzo’s blog.

Andrew Watson:

Your Education Week article includes many ideas that I agree with.

For instance, you start by saying:

“When communicating with the public, researchers should have one message and stick to it.”

That idea makes lots of sense to me.

However, the next two points that you make post launch our disagreement.

The first is: when communicating with the public, researchers should be more prescriptive in the advice that they give:

“Academics in general are loath to offer specific recommendations about what policymakers and practitioners should do… This is unfortunately a recipe for your work not to matter, because if you don’t offer specific, actionable guidance, someone else (probably someone less informed than you) will do it.”

And second, while researchers should provide some context and nuance for that research, there’s always the danger of providing too much context and nuance:

“Researchers tend to want to nuance and contextualize their work to an excessive degree. And, of course, it’s true that every child is different and every school is different, so no one intervention is going to work in every context. But the reality is that research is never going to be sufficiently contextualized for every individual setting and what works ‘on average’ is, in most cases, going to work in individual settings, too.”

Have I summarized that correctly? Do you want to say more about those two points?

Morgan Polikoff:

No, I think that’s right.

Andrew Watson:

Okay. I’ll tell you why those two points troubled me and then maybe we can figure out if we still disagree.

In the work that I do – explaining psychology and neuroscience research to teachers – I find that most people boil research down into highly prescriptive, almost script-like messages.

These scripts oversimplify the research and make it “one size fits all.”

So I am regularly encouraging people to be less prescriptive and to make allowance for context as much as possible.

Teachers shouldn’t just follow the script that the speaker provides; instead, they should think about teaching the way researchers understand cognition, and then adapt that message to their own work.

Morgan Polikoff:

I think part of our disagreement is an audience difference.

You’re thinking primarily about teachers, I’m thinking primarily about policy makers at various levels from state leaders down to district folks.

And the second difference: this article was about impact.

How should researchers think about maximizing their impact on things that actually go on?

Now that said, I think that a lot of what I wrote about “being more prescriptive” does apply to working with teachers.

There are lots of instances in which I think we avoid being prescriptive because we think we’re not supposed to, or because we’re fetishizing local control, or because we think that the domain of the classroom is the teacher’s, or because we think that every child is utterly unique.

And some of those things I think are true, but I also think that if you really want to move practices, you have to move systems. And if you want to move systems, you have to tell people what to do a little more than I think we do now.

When I say “tell people what to do,” I don’t mean command them. I mean give them very specific advice about what you think is the best thing to do.

Andrew Watson:

Can you give me an example where prescription is beneficial, at least at the policy level?

Morgan Polikoff:

I gave the example on Twitter of high-quality instructional materials. I believe – and I will say this to any State Department of Ed person or any school district leader:

… that every teacher should have a high-quality core instructional material in every subject that they teach,

… having those materials should be a minimum expectation,

… for the most part, teachers should use those materials as the core of their instruction.

So that’s me being quite prescriptive.

But I think if I were to instead say something less prescriptive, like:

“Well, we think that curriculum materials on average are effective, but we really don’t have evidence on this, or evidence is stronger in math than it is in other subjects,”

That’s just a recipe for irrelevance. The purpose of the piece is about relevance.

Andrew Watson:

Just to play that out a little:

If it’s true that the evidence for high-quality instructional materials is better in math than in English (or foreign language teaching, or soccer coaching), I would hesitate to give prescriptive advice to anyone who isn’t teaching math because I don’t have evidence to support it.

Do you share that hesitation, or you’re all in on giving broadly prescriptive advice?

Morgan Polikoff:

I get what you’re saying.

I also think that the reality of evidence in education is so weak that, if people listened to the advice that you just gave, no one could really give advice about anything.

We just don’t have high quality causal studies with convincing evidence and lots of replication on virtually any topic. That’s certainly the case on curriculum, even in math, which is the place where we have the most evidence.

The best I can say, based on the highest quality evidence, is that I think that there’s convincing evidence that some curriculum materials are more effective than other curriculum materials.

So yeah, you’ve got to make a lot of leaps and you can appropriately caveat.

I think that you shouldn’t lie about what the strength of the evidence is. Nonetheless, people have to make decisions and they have to make decisions now. There are children in schools right now who don’t have curriculum materials, and one of the reasons for that is squishy people don’t want to tell school districts that they must provide children with quality curriculum materials.

And that is harming those children, I am certain of it – even though I don’t think that the convincing causal evidence is there.

Andrew Watson:

So let’s consider an example that you and I discussed in our email exchange.

In cognitive science world, people have insisted on something called the 10-minute rule: “people can’t pay attention to something they’re not intrinsically interested in for more than 10 minutes.”

So, if you’re a classroom teacher, you should design your lesson plan in 10-minute chunks.

Sadly, when we look at the research behind the “10-minute rule,” it’s comically weak.

Would I be applying your way of thinking correctly were I just to say,

“Well, the 10-minute rule doesn’t have great evidence, but it might be the only attention rule we’ve got. So I’m going to proclaim that rule loud and strong. My claim isn’t a hundred percent true, but it’s better for teachers to plan in 10-minute chunks than 30-minute chunks.”

Morgan Polikoff:

Well, you’re the expert on that topic and I’m not, so I can’t evaluate the evidence. If you think that the evidence doesn’t support the 10-minute rule, then I think that you should say loud and clear that the evidence doesn’t support that: the teacher should not do that.

On the other hand, some people, I guess, think that the evidence is good, and I think they should feel free to … they’re trying to have impact by saying that.

The reality is: people who are wrong can use these strategies, just like people who are right can use these strategies, and there’s not that much I can do about that.

And I think lots of times, people who are wrong or people who are trying to sell you something actually use a strategy like this because they recognize that it’s effective.

On the other hand, people who might understand the literature and the evidence more are reluctant to use this kind of direct language.

Therefore, in some ways, this reluctance facilitates the people who are wrong and their negative impact on what goes on in classrooms.

Andrew Watson:

To contradict a false claim, should I say, “they’re wrong!” or should I provide my evidence for what’s right?

Morgan Polikoff:

I’m not an expert on how to refute wrongness.

My understanding of a refutation approach is: you say what the misconception is, you provide direct evidence that it’s wrong, and then you tell people what they should do instead.

I think being direct is the way to do that; not saying, “Well, it’s right under some circumstances, but it’s not right under others.”

No, you need to be equally as direct as the people who are giving teachers bad advice — advice that you say is bad.

Andrew Watson:

Which is complicated if the correct advice is, in fact, highly nuanced and context dependent.

Morgan Polikoff:

If the advice you think is correct really is very complicated and you need to give different advice to every individual teacher or in every individual setting, then yes, I think that’s going to be a hard message to get across. That’s just the reality of the situation.

If the message gets so complicated that you can’t explain it in a few sentences, or a one page concrete piece of guidance that you can give to a teacher and say, “Here’s what I think you should do” – then I think it’s going to be hard to get them to do what you want them to do, what you think is good.

Andrew Watson:

That’s dismaying but intriguing, isn’t it?

Morgan Polikoff:

Listen, it’s harder to get people to do complicated things than it is to get them to do simple things, and it’s harder to get people to understand complicated things than it’s to get them to understand simple things. Those seem like obvious statements.

Andrew Watson:

To some degree, those obvious statements create a challenge in this sort of work that I do, because the truth is:

… second graders aren’t 12th graders, and

… math teaching isn’t history teaching, and isn’t soccer coaching and

… the cultural context in Massachusetts isn’t the cultural context in Reykjavík, isn’t the cultural context in Korea.

So it’s just true that the specifics of teaching advice will be highly context dependent for all of those things, and many others.

Morgan Polikoff:

I don’t know if I agree with you.

I definitely agree with you, for instance, that kids develop, and so second graders are different from high schoolers.

But as for cultural differences, I don’t talk about Iceland or Korea; I talk only about the United States. The more relevant question is: are kids in Massachusetts that different from kids in Vermont or that different from kids in Nebraska?

There are certainly cultural differences. Obviously there are racial and ethnic differences, there might be language differences.

But I strongly suspect – and I think the literature is pretty clear on this — that much more of the variation in those kinds of things is within schools than between schools.

And so the advice that you give about how to teach, I don’t think it’s going to be fundamentally different. Math or reading in Massachusetts is not fundamentally different from math or reading in Nebraska or Texas — at least in terms of the cultural dimensions you’re talking about.

Andrew Watson:

I think within the States, the cultural differences would be more like those among — for instance — teachers at a Montessori school, teachers at a Catholic school, and teachers at a military academy.

Although the surrounding culture itself is substantially the same, these school cultures invite different ways of thinking about teaching and learning, and about the relationships between teachers and learners. Those differences resemble, if they don’t equal, cross-national differences like the one between Massachusetts and Iceland.

Morgan Polikoff:

I’m not sure how important that point is, but that makes sense.

Andrew Watson:

Okay. I guess to wrap things up: is there anything that you haven’t had a chance to say that you’d like to say?

Morgan Polikoff:

Sure.

Both in this conversation and also in the writing, I’m doing what I advise people to do: oversimplifying things.

And also, I’m pushing back at you to be provocative. That’s intentional. That’s a strategy that I’m using.

I think that provocation is useful to people. Oftentimes people are uncomfortable or unwilling to say things that are even as provocative as this discussion…and this is frankly pretty banal.

I’m playing up disagreement because I think that doing so is provocative and engaging to people.

Andrew Watson:

Well, it’s certainly been engaging to me – and I suspect it will be to our readers as well. Thank you for taking the time to chat today.


This post’s title asks if I’ve been wrong for years. I don’t (yet) have an answer to that question. But, here are my quick take-aways from this conversation:

A: Dr. Polikoff and I definitely disagree!

B: I don’t think I changed his mind.

C: I’ll let you know in a later post whether or not he has changed mine…