It’s Not about Cheating

Recent conversation with a contemporary (a man who worked in sales all his life, and whose grandchildren attend a Christian school):

Him: So what do you think about AI? How will your public schools deal with the fact that AI is going to control all jobs in the future?

Me: AI will certainly have an impact on the job market, but I don’t think the future of work is written in stone. As with all technologies, experience will tell us whether AI is actually useful in enhancing learning in any way. Lots of things that sound good in education turn out to be oversold or hype. Or even counterproductive.

Him: But isn’t AI going to make it impossible to tell who’s cheating? That’s what I’d be worried about if I was a teacher.

Me: What do you mean by cheating?

Him: Well, kids will get AI to write their papers and do their assignments. And teachers won’t know who wrote the paper and will be forced to give it a good grade. And if everyone gets good grades, there will be grade inflation, so it will be hard to pick out the really smart students for the top colleges.

Me: It’s not about cheating. It’s about actual learning. Students learn by doing the work, including making mistakes—whether that work is putting two blocks with three blocks to make five blocks, testing pond water samples, producing an original haiku in class, or writing a research paper. When people talk about AI and cheating, they’re usually thinking about writing assignments—but there are many more paths to learning, K-12, than writing a paper or answering questions on a worksheet. Besides, teachers who know their students well, and have seen their skills in action, will understand how an AI-constructed response would compare to an actual response.

Him: (dubious) I suppose sharp teachers can catch them that way. Besides, you’ll have more time to ferret out cheaters when AI starts grading student work and writing your lesson plans.

Me: Only someone who knows the students and knows the usual flow of content and skills at that level can write useful lesson plans. And assessing student work is how teachers observe what their students have learned, and what they need next. I personally don’t see AI as being particularly useful in developing instructional materials, either. It certainly can’t develop relationships with kids or inspire them.

Him: Of course, this would all be different for you, as a band director—AI will change everything for regular teachers but maybe not for you. If band even exists as a class anymore.

———

Sigh. This conversation actually happened. And the man I was talking to was not an idiot. He had some magazine-article background knowledge about AI, saw its impact as inevitable, and regarded teachers as unfortunately unionized Luddites, unwilling to adapt to a rapidly changing world.

He was also right about musical performing groups—as a K-12 music specialist, I have been having these conversations about electronic alternatives to learning to play an instrument or sing for three decades. Who would want to go to the trouble, a well-meaning friend who teaches English asked me, to learn to play the bassoon? Or even worry about singing in tune, now that auto-tune is available to fix hot musical stars’ vocal uncertainties? Why not grab a bunch of keyboards and software? Isn’t that all the instruction musicians need to, you know, put out musical content?

The great danger of using the range of AI products in the classroom has nothing to do with cheating, per se. Fact is, students have been cheating—in the ways we usually perceive as academic cheating—forever.

From writing dates on a shirt cuff to paying someone to take your SATs, cheating is deeply embedded in academic practice. If there is a potentially positive outcome here, it might be rethinking old ideas about plagiarism and cheating. Instead, we might be teaching our students to assess the information they are presented with, comparing it to different analyses, perhaps rooting out alternative facts that aren’t really factual.

Fact is: plagiarism is ill-defined, in an era when students have access to the Library of Congress in their raggedy jeans pockets: “Anybody who embarks on a study of plagiarism hoping for bright lines is in for a foggy shock. One of the pleasing facets of plagiarism is that it doesn’t exist—not in the eyes of the law, that is, and especially not if those eyes are American. There is intellectual-property law, and a law that prohibits the trafficking of counterfeit goods. There are laws against copyright infringement. If plagiarists are sent to prison, however, it will not be because they have filched a slice of poetry, or half a juicy ballad, and passed it off as their own. Plagiarism is not a crime. It is a sin.”

Here’s another fact: the large language models behind the kinds of AI that K-12 teachers and students are being urged to adopt are built from content that was, if you will, plagiarized. Speaking of cheating.

But it’s the original point that matters most here: AI in its various platforms robs students of doing the actual work of learning: absorbing, comprehending, analyzing, synthesizing and so on. I would like to think that this is the reason that states and school districts are banning the use of cellphones in the classroom—to prevent students from believing that graded products represent actual learning.

I would also assert that learn-by-doing classes that require groups of learners (like band and choir, debate, drama and so many others) reward students for all the right habits: working together, interdependence, ongoing skill building toward a clear goal, aesthetic pleasure. Creativity, the antithesis of AI use.

Philosophy professor Kate Manne wrote a terrific piece about preventing her university students from using AI, and how it all worked out:  “I feel strongly, as I explained, that their AI use will prevent me from doing my job in helping them to grow as thinkers and writers.” Spoiler alert: students produced such superior work and thinking that she cancelled the final exam. Read the piece. It’s solid evidence.

Pushback against AI is not and never has been about cheating. It’s about genuine learning.

How about a Pause on the Race to Embed AI in Schools?

I haven’t written much about AI and education, for several reasons.

First, there are already many people writing compellingly and with considerable expertise about the uses and misuses of AI in the classroom. Some of those people will show up in this blog. Follow them. Read what they write.

Also, some years ago I developed a reputation for being a cranky Luddite. I wrote pieces about the downside of the ubiquitous online gradebook, accessible to parents 24/7, and other uses of computer programs that added to teachers’ workloads and didn’t fit with the important content and skills I was teaching students (lots of students) the old-fashioned way. The real costs of “free” programs and apps, no matter how glittery and hip, seemed obvious to me. Why didn’t other educators see this?

This came to a head when I was invited to be part of an online panel on ed technologies. Presenters sent me the language they planned to use to introduce me—did I approve? I confirmed, and then they messaged back: the bio had been created by ChatGPT. Ha-ha.

Finally, I haven’t written much about AI because I just find it hard to conceptualize how it could be useful in the classroom. In other fields, perhaps—with a lot of caveats, oversight and suspicion—but it runs contrary to the essential purpose of teaching and learning. Doesn’t it?

It’s never seemed right to let machines do the ‘thinking’ or ‘creating’ that is better done, or at least attempted, daily, by children. In short, I don’t get it. Maybe that’s because I haven’t been enlightened? So—shut up already?

I think many, if not most, practicing educators are in the same boat: Unclear about what AI actually is, and what use could be made of AI tools in their vital mission to make children independent thinkers, evaluators and creators.

For starters, who’s cool with Big Data collecting info on our public school kiddos’ engagement with their products? NEPC Report on digital platforms:  

While educators may see platforms as neutral tools, they are in fact shaped by competing interests and hidden imperatives. Teachers, students, and administrators are only one market. The other market involves data on performance, usage patterns and engagement—data flowing to advertisers, data brokers and investors, often without users’ knowledge or consent.

A pretty good synopsis of what AI is, from Josh Marshall, Talking Points Memo:

“AI is being built, even more than most of us realize, by consuming everyone else’s creative work with no compensation. It’s less ‘thought’ than more and more refined statistical associations between different words and word patterns.” He goes on to make the salient point that the AI “products” being produced will be “privately owned and sold to us.”

Doesn’t sound like something that schools need to quickly embrace, what with all our other problems, like teaching kids to read, rising absence rates, and budgets stripped of the funds to feed children a nutritious breakfast and lunch.

Add the environmental concerns and rampant intellectual property theft to teachers’ uncertainty about dumping more new, unvetted toys into an already-crammed curriculum. So I was thoroughly surprised to see the AFT get on the “AI in the classroom!” bandwagon.

Why not take a pause—let’s call it a shutdown—on the race to embed AI in our schools? Why not sort through those competing interests and hidden imperatives? We’ve been bamboozled by climbing on attractive but ultimately damaging educational bandwagons before. Just who wants us on this one?

Well, scammers. And the folks who turned DEI into something to be avoided. Clueless TikToking middle schoolers could up their game with AI. And right-wing edu-site The 74 says educators can save six hours a week by using AI to make worksheets, tests and exit tickets. Really? That’s an awful lot of worksheets.

Wouldn’t it make more sense to approach this transformative technology with great caution, holding fast to the evergreen principle of teaching and learning being a social endeavor? To look at the available research before being bedazzled by something new?

‘Participants, mostly undergraduate and graduate students, who constructed essays with the assistance of ChatGPT exhibited less brain activity during the task than those participants who were asked to write on their own. The AI-users were much less likely to be able to recall what they had written and felt less ownership over their work. Independent evaluators who reviewed the essays found the AI-supported ones to be lacking in individuality and creativity.’

If you want to read better pieces on AI, many are hyperlinked in this blog. But here are a few folks whose words and thoughts come from places of deep knowledge and experience:

Audrey Watters, the best Ed-Tech thinker on the planet, for my money.

Pete Buttigieg, who thinks ahead of trends. Stop worrying about when he’s going to run for President and start absorbing his ideas on politics and relevant policy. Including AI.

Lucian Truscott, who writes about many things and made me understand why AI may ultimately fail: The men who run the big AI companies would do well to think through what they are doing with all those big buildings and all that electricity they consume. The “answer,” such as it is, to what they are seeking to accomplish may not exist, or it may be simpler than they think.

Educator Alfie Kohn, who points out that those most receptive to this technology are the people who know the least about it. This piece made my skin crawl.

My friend Peter Greene does a better job of debunking AI crapola than anyone I know. I credit this to his decades of classroom experience, during which he Paid Attention to Things—things more important than launching new products and making the big bucks.

So why should anyone pay attention to what a tech skeptic writes about AI in schools?

Because we’ll all be lured into making photos come to life, or relying on a questionable AI answer to an important question, or laughing at Russ Vought as Grim Reaper. Sticky and fun, but ultimately shallow, inconsequential. Not what school-based learning should be.

Earlier this year, on a day when I made a (delicious) strawberry pie, I clicked on a song-writing app. Give us some lyrics, and a musical style, and we’ll write a song for you.

Here is my song: Strawberry Pie. Sticky and fun, but not much effort on my part.