Dan Cohen's always-thoughtful Humane Ingenuity recently tackled "The Unresolved Tension Between AI and Learning," providing an accessible entry point for those wrestling with how or when or even why to introduce these tools into their instruction models. Learning, Cohen explains, requires slow and often difficult work, an accretion of skills or knowledge over time that builds a foundation upon which new learning can take place. Citing some scholarship along the way, he suggests the key question for educators is "Are we using AI to enhance learning, or to replace some learning steps that turn out to be essential?"
I respect Cohen's wariness here; in fact, I share it. But I think we -- especially those of us who primarily teach or support introductory general education courses -- ought to be very careful about what we count as evidence of student learning. I work with a lot of composition instructors, and for those who are personally resistant to AI but feel an obligation to engage the tools their students are already using or will likely use soon, the suggestion I'm hearing is: They can use GenAI on the front end to brainstorm ideas or develop an essay plan, but they cannot use it to write their paper.
At first glance, that makes sense. It is a writing course, after all. But I continue to nudge, to suggest, to even perhaps write overly-earnest internet essays on the subject: The ideas are the learning that matters.
Certainly, there are students who need to hone their academic writing skills because the work they hope to pursue (which may or may not be a traditional "job," in my opinion) will demand they be able to write -- the actual creation of the sentences and paragraphs and pages that amount to written work. But for most students enrolled in Composition courses, the class is a General Education requirement, a thing we've decided all college-educated people need and a thing they need to get out of the way to advance to the work they actually want to do.
This is important. I attended a presentation this week on the still-substantial gap between the earning potential of a student who graduates from one of our programs and a student who does not. Our data show that a huge number of our graduates are able to earn a living wage within 5 years of completing their program. That's not wealth, of course, but it's significantly better than the data on those who start but do not complete a program here. For this reason (among others), we talk a lot about Student Success. We can see the difference it makes just for students to graduate. It can mean so much.
The faculty I work with are so dedicated -- both to their disciplines and to their students. Community colleges seem to attract those who value education in both the abstract (student learning) and the practical (student success). They resist any efforts to "dumb-down" their courses and make every effort to help students achieve. It is a beautiful and admirable combination.
In a previous essay, I argued that when it comes to navigating the pressures around AI, focusing on student voice is critical. While some might read that as support for allowing AI on the front end and prohibiting it on the back, I am doubling down:
Writing is thinking, and without practice, students will struggle to identify, articulate, and trust their own ideas -- they won't recognize their own voice. We will still have emails and reports and slide presentations, but students won't see themselves in what they create.
If we outsource the thinking to AI, the papers may see some improvement, but the students will not be fully present in them. What if, instead, we flip that equation?
What if we work with them on developing ideas, having opinions, creating structures, formulating arguments, and then ask AI to write the paper? What would we lose? Anecdotal evidence of the overall improvement instructors are seeing in student written work is everywhere, undoubtedly the result of such tools. These results, then, are not unlike what researchers found (cited in the Cohen piece) in a study looking at AI use in the lab of professional researchers. The study showed that researchers performed their work faster and often better using the AI tool (though with markedly less satisfaction in their work), but there was a significant gap between early professionals and those who had been in the field some time. Cohen connects the dots:
The combination of the AI autocompletion of scientific processes and lab automation holds the potential to greatly shorten the distance between a scientific hypothesis and experimental confirmation. In this wonderful world of accelerated science, however, the middle steps formerly tackled by early stage scientists — tomorrow’s future conjurers of breakthroughs — are erased.
In the same way, early-stage students (those in their first few years of college) should not skip those middle steps of reading, thinking, and generating and organizing ideas. Those are the skills we need them to have.
So, back to the seed of this essay, found in the title. What's more important: Student Success or Student Learning? I, like Cohen, believe deeply in the value of learning. I agree with his assertion that "Process over time leads to expertise." But I find myself questioning which processes our students need more time with, which skills or habits I most hope they will continue developing once they leave our campus. For me, thinking is the process we cannot skip.
I think I'd rather one of our graduates have well-examined and exciting ideas than be able to write a grammatically and mechanically correct 5-page paper. I think I'd rather they graduate trusting their voice than not graduate because they weren't allowed to use AI to write an essay. If in 15 years we all feel silly for having questioned the use of AI writing tools, I'd rather have young professionals who can think than those with no ideas but an ability to write. I think.
This conclusion isn't one, not really. The ideas I'm playing with here are not proclamations; they are merely ruminations, something I will continue to chew on. Thoughts? Ideas? I welcome them.