“Greg Toppo is a Senior Writer at The 74 and a journalist with more than 25 years of experience, most of it covering education. He spent 15 years as the national education reporter for USA Today and was most recently a senior editor for Inside Higher Ed.” This appeared in The 74, September 22, 2025.
William Liang was sitting in chemistry class one day last spring, listening to a teacher deliver a lecture on “responsible AI use,” when he suddenly realized what his teachers are up against.
The talk was about a big, take-home essay, and Liang, then a sophomore at a Bay Area high school, recalled that it covered the basics — the rubric for grading, along with guidance on using generative AI honestly: Students should treat it as a “thinking partner” and brainstorming tool.
As he listened, Liang glanced around the classroom and saw that several classmates, laptops open, had already leaped ahead several steps, generating entire drafts of their essays.
Liang said his generation doesn’t engage in moral hand-wringing about AI. “For us, it’s simply a tool that enables us not to have to think for ourselves.”

William Liang
But with AI’s awesome power comes a side effect that many would rather not consider: It’s killing the trust between teachers and students.
When students can cheaply and easily outsource their work, he said, why value a teacher’s feedback? And when teachers, relying on sometimes unreliable AI-detection software, believe their students are taking such major shortcuts, the relationship erodes further.
It’s an issue that researchers are just beginning to study, with results that suggest an imminent shakeup in student-teacher relationships: AI, they say, is forcing teachers to rethink their assumptions about students, assessments and, to a larger extent, learning itself.
If you ask Liang, now a junior and an experienced op-ed writer — he has penned pieces for The Hill, The San Diego Union-Tribune, and the conservative Daily Wire — AI has already made school more transactional, stripping many students of their desire to learn in favor of simply completing assignments.
“The incentive system for students is to just get points,” he said in an interview.
While much of the attention of the past few years has focused on how teachers can detect AI-generated work and put a stop to it, a few researchers are beginning to look at how AI affects student-teacher relationships.
Researcher Jiahui Luo of the Education University of Hong Kong in 2024 found that college students in many cases resent the lack of “two-way transparency” around AI. While they’re required to declare their AI use and even submit chat records in a few cases, Luo wrote, the same level of transparency “is often not observed from the teachers.” That produces a “low-trust environment,” where students feel unsafe to freely explore AI.
In 2024, after being asked by colleagues at Drexel University to help resolve an AI cheating case, researcher Tim Gorichanaz, who teaches in the university’s College of Computing and Informatics, analyzed college students’ Reddit threads spanning December 2022 to June 2023, shortly after OpenAI unleashed ChatGPT onto the world. He found that many students were beginning to feel the technology was testing the trust they felt from instructors, in many cases eroding it — even if they didn’t rely on AI.

Tim Gorichanaz, Drexel University
While many students said instructors trusted them and would offer them the benefit of the doubt in suspected cases of AI cheating, others were surprised when they were accused nonetheless. That damaged the trust relationship.
For many, it meant they’d have to work on future assignments “defensively,” Gorichanaz wrote, anticipating cheating accusations. One student even suggested, “Screen recording is a good idea, since the teacher probably won’t have as much trust from now on.” Another complained that their instructor now implicitly trusted AI plagiarism detectors “more than she trusts us.”
In an interview, Gorichanaz said instructors’ trust in AI detectors is a big problem. “That’s the tool that we’re being told is effective, and yet it’s creating this situation of mutual distrust and suspicion, and it makes nobody like each other. It’s like, ‘This is not a good environment.’”
For Gorichanaz, the biggest problem is that AI detectors simply aren’t that reliable — for one thing, they are more likely to flag the papers of English language learners as being written by AI, he said. In one Stanford University study from 2023, they “consistently” misclassified non-native English writing samples as AI-generated, while accurately identifying the provenance of writing samples by native English speakers.
“We know that there are these kinds of biases in the AI detectors,” Gorichanaz said. That potentially puts “a seed of doubt” in the instructor’s mind, when they should simply be using other ways to guide students’ writing. “So I think it’s worse than just not using them at all.”
‘It is an enormous wedge in the relationship’
Liz Shulman, an English teacher at Evanston Township High School near Chicago, recently had an experience similar to Liang’s: One of her students covertly relied on AI to help write an essay on Romeo and Juliet, but forgot to delete part of the prompt he’d used. Next to the essay’s title were the words, “Make it sound like an average ninth-grader.”

Liz Shulman, Evanston Township High School
Asked about it, the student simply shrugged, Shulman recalled in a recent op-ed she co-authored with Liang.
In an interview, Shulman said that just three weeks into the new school year, in late August, she had already had to sit down with another student who used AI for an assignment. “I pretty much have to assume that students are going to use it,” she said. “It is an enormous wedge in the relationship, which is so important to build, especially this time of the year.”
Her take: School has transformed since the lengthy COVID lockdowns that began in 2020, with students recalibrating their expectations. It’s less relational, she said, and “much more transactional.”
During lockdowns, she said, Google “infiltrated every classroom in America — it was how we pushed out documents to students.” Five years later, if students miss a class because of illness, their “instinct” now is simply to check Google Classroom, the widely used management tool, rather than “coming to me and saying, ‘Hey, I was sick. What did we do?’”
That’s a bitter pill for an English teacher who aspires to shift students’ worldviews and beliefs — and who relies heavily on in-class discussions.
“That’s not something you can push out on a Google doc,” Shulman said. “That takes place in the classroom.”
In a sense, she said, AI is contracting where learning can reliably take place: If students can simply turn off their thinking at home and rely on AI tools to complete assignments, that leaves the classroom as the sole place where learning occurs.
“Because of AI, are we only going to ‘do school’ while we’re in school?” she asked.
‘We forget all the stuff we learned before’
Accounts of teachers resigned to students cheating with AI are “concerning” and stand in contrast to what a solid body of research says about the importance of teacher agency, said Brooke Stafford-Brizard, senior vice president for Innovation and Impact at the Carnegie Foundation.
Teachers, she said, “are not just in a classroom delivering instruction — they’re part of a community. Really wonderful school and system leaders recognize that, and they involve them. They’re engaged in decision making. They have that agency.”
One of the main principles of Carnegie’s R&D Agenda for High School Transformation, a blueprint for improving secondary education, includes a “culture of trust,” suggesting that schools nurture supportive learning and “positive relationships” for students and educators.
“Education is a deeply social process,” Stafford-Brizard said. “Teaching and learning are social, and schools are social, and so everyone contributing to those can rely on that science of relational trust, the science of relationships. We can pull from that as intentionally as we pull from the science of reading.”
Gorichanaz, the Drexel scholar, said that for all of its newness, generative AI presents educators with what’s really an old challenge: How to understand and prevent cheating.
“We have this tendency to think AI changed the entire world, and everything’s different and revolutionized and so on,” he said. “But it’s just another step. We forget all the stuff we learned before.”
Specifically, research going back more than a decade identifies four key reasons why students cheat: They don’t understand the relevance of an assignment to their lives, they’re under time pressure, they’re intimidated by an assignment’s high stakes, or they don’t feel equipped to succeed.
Even in the age of AI, said Gorichanaz, teachers can lessen the allure of taking shortcuts by solving for these conditions — figuring out, for instance, how to intrinsically motivate students to study by helping them connect with the material for its own sake. They can also help students see how an assignment will help them succeed in a future career. And they can design courses that prioritize deeper learning and competence.
To alleviate testing pressure, teachers can make assignments more low-stakes and break them up into smaller pieces. They can also give students more opportunities in the classroom to practice the skills and review the knowledge being tested.
And teachers should talk openly about academic honesty and the ethics of cheating.
“I’ve found in my own teaching that if you approach your assignments in that way, then you don’t always have to be the police,” he said. Students are “more incentivized, just by the system, to not cheat.”
With writing, teachers can ask students to submit smaller “checkpoint” assignments, such as outlines and handwritten notes and drafts that classmates can review and comment on. They can also rely more on oral exams and handwritten blue book assignments.
Shulman, the Chicago-area English teacher, said she and her colleagues are not only moving back to blue books, but to doing “a lot more on paper than we ever used to.” They’re asking students to close their laptops in class and assigning less work to be completed outside of class.
As for Liang, the high school junior, he said his new English teacher expects all assignments to come in handwritten. But he also noted that a few teachers have fallen under the spell of ChatGPT themselves, using it for class presentations. As one teacher last spring clicked through a slide show, he said, “It was glaringly obvious, because all kids are AI experts, and they can just instantly sniff it out.”
He added, “There was a palpable feeling of distrust in the room.”