It’s nearly impossible to go through an average day without hearing about generative AI tools like ChatGPT or Gemini. AI has shifted the learning landscape in recent years — in both positive and negative ways. For the first time, accessible and free technology can help students solve complex prompts in seconds. However, AI also allows students to bypass the learning process in schools. The widespread usage of AI by students is a persistent issue that schools are grappling with, and LAHS is no different. As ChatGPT, one of the most widely adopted generative AI tools, reaches its two-year anniversary, The Talon decided to investigate the effect of AI on teachers and students.
AI in the classroom
The rise of generative AI has transformed how information is accessed and created. With its ability to perform complex tasks at remarkable speed, the capabilities of AI raise questions about originality, ethics, and the future of education.
Most students and teachers are familiar with large language models (LLMs) through tools such as ChatGPT, Gemini, Llama, Bing Chat, and GitHub Copilot. LLMs first interpret the text that the user inputs, then generate text in response; many of the tools built on them can also produce images, audio, video, or combinations of these mediums.
LLMs are especially user-friendly and practical, as they generate original responses in seconds. What makes them more accessible is that many of these tools are available for free. As a result, they’ve been widely utilized by students and teachers.
Although the increased usage of generative AI has been a topic of discussion in all subjects, there’s one department that was undoubtedly hit the hardest — English.
When ChatGPT first came out in 2022, English teacher Michael Kanda immediately understood the implications it would have for students’ writing, from short-answer homework assignments to major essays. This became especially evident when Kanda caught multiple seniors in his English Literature class using AI to generate their research project write-ups.
“I could see that it was going to be a big issue,” Kanda said. “It’s human nature to take the easy way out. No matter what we say, someone’s going to think that it’s worth it to use AI and they’re going to use it.”
English teacher Caitlin Hannon believes that AI usage is most prevalent in out-of-class essays. Although students start with original ideas and evidence, she finds that many use AI to do the writing and synthesizing.
“I’m able to see my students’ ideas because we do a lot of writing together in class,” Hannon said. “I also have a good sense of my students’ voices and what their writing sounds like. When they turn in work that’s in a different voice from what I’ve seen in class, it becomes clear to me that they’ve used a resource outside of their own brain.”
Robert Barker teaches AP Literature and Film Analysis and advises New Media Literacy. As a self-proclaimed early adopter of technology, he has spent a considerable amount of time learning about AI tools since they first appeared.
When he first experimented with generative AI, Barker tried asking it to outline and write a summary for a novel he taught in AP Literature. What should have been a straightforward task instead became an example of AI’s limitations: the tool invented a character who doesn’t exist in the novel.
“I’ve read the book 10 times, so I know that that character is not in the book,” Barker said. “But if you’re a student, you don’t have the background knowledge or experience to weed out good and bad information.”
While generative AI can be a valuable assistant in synthesizing ideas and summarizing information, it lacks the depth and contextual understanding of a human reader. By lulling students into a false sense of security, it may leave them less inclined to engage in deeper critical thinking.
In an English department meeting, teachers experimented with AI tools to generate an English essay together. They agreed that while it was by no means great, it also wasn’t terrible.
“It would have maybe gotten a B, maybe a C+,” English teacher Michael Smith said. “But for some kids that’s good enough.”
Why are we seeing students use AI?
Smith believes that students mainly use generative AI on assignments out of a temptation to save time. While AI provides instant answers and gratification, it can also lead to a cycle of dependency, where students prioritize speed over learning.
“It’s not like you’re waiting 15 minutes,” Smith said. “You’re waiting 15 seconds. You think, ‘I could sit here and pound this essay out in 45 minutes, or I could put it in AI.’ The addiction is with the amount of time that we think it’s saving us. And if a student succeeds, they’re going to use it again.”
But Smith does not think students are the only ones at fault. He also believes heavy workloads in classes make using AI a last resort for students — AI becomes even more tempting if students aren’t equipped with the right resources or time to navigate difficult assignments.
“It’s a terrible thing for you to squish time arbitrarily,” Smith said. “The lesson can’t be, ‘I’m going to fail every time if I don’t do something by this date.’ Sometimes students need more time, so teachers should give it to them.”
Kanda believes that another factor is the increasingly competitive academic culture. Students are expected to balance homework, standardized testing, extracurricular activities, and everything in between. As a result, some turn to using AI to save time.
“It’s the whole system, it’s the whole culture,” Kanda said. “Students think, ‘If having something written by AI saves me time, then I can spend that time working on something else. Then, maybe that’s a trade-off worth making.’”
To Hannon, part of the appeal is that many students don’t consider copying text from generative AI to be plagiarism at all.
“Stealing the idea of a person sitting next to me clearly feels wrong,” Hannon said. “But when it’s just this thing in my computer, it doesn’t feel like it has the same moral implications.”
The risks of plagiarizing from another student are arguably greater, as detectors like Turnitin flag and attribute copied sources with a high degree of certainty, leaving little room for ambiguity.
On the other hand, it’s difficult to detect AI-generated responses. The sophistication of AI-generated content makes it hard to distinguish from human-written work. As detection technology lags behind the capabilities of these advanced generative AI tools, teachers face an uphill battle in identifying and addressing AI misuse.
Student experiences
AI produces writing with advanced vocabulary and free of grammatical errors, often more polished than what many students can produce themselves. Among students, there are concerns that the prevalence of AI usage has inadvertently raised the standard for the work they’re expected to do.
“Everyone uses the same AI-sounding language and structure that feels very polished and correct, yet very impersonal,” said Jane. “I feel like AI sets the bar higher for teacher expectations because the writing produced by AI feels a lot better than the writing that sophomores tend to produce.”
“I think AI has made the threshold to reach an A higher,” said David. “It has such developed language that many teachers likely raise their standards when grading, resulting in a worse grade for those who did not use AI.”
“If you let a computer program get in the way of that and you start to critique your own work because it doesn’t sound ‘as good as’ a robot, that gets problematic,” English teacher Lisa Battle said. “Hearing that makes me very sad. I value what students have to say more than AI.”
Even when students are capable of completing assignments on their own, the temptation to save time by using generative AI is strong. With the pressures of school and extracurriculars piling up, students are drawn to AI as a quick time-saving solution.
“It’s much easier to have AI summarize my reading homework in ten minutes than to do it myself,” said Jane. “I’ve also seen more than one student ask ChatGPT to straight up write their narrative essay and major assignments for them and then just submit them without even paraphrasing.”
For some, the reliance on AI goes beyond convenience; it has become an integral part of their academic strategy.
“It’s easier to turn to ChatGPT when there’s a question I can’t answer or a paragraph that I don’t know how to write,” Ava said. “I can’t imagine getting through the school year without AI.”
Despite efforts by teachers to counteract such usage, AI-detection tools like Turnitin remain inconsistent. While some teachers rely on these tools or pay attention to whether a student’s work aligns with their usual voice, gaps persist. Inconsistencies across online AI-detection services compound the issue, leaving a significant portion of AI-generated work unnoticed.
For some students who avoid using AI on assignments, knowing that their peers get away with AI-generated work is also disheartening.
“It’s really frustrating to know that when I got an A on an assignment that I tried really hard on, that someone else also got an A even though Gemini had written their whole essay,” said Jane.
“I have classmates who tell me, ‘just use AI on assignments,’” Joanna said. “These classmates often get similar grades to mine, which is frustrating because I put in the effort and time — they don’t.”
Finding a balance
As a self-proclaimed technology enthusiast, Barker is a firm believer in using generative AI for practical reasons. As a result, he will be incorporating AI into Film Analysis for the first time this spring. He plans on introducing his students to NotebookLM, a new online summarizing and note-taking tool developed by Google Labs.
NotebookLM can create summaries and key points from documents. It can synthesize information from multiple sources, creating an easily digestible overview. When users don’t understand specific parts of the source material, they can ask it questions to fill the gaps in their understanding. Barker believes that it will be beneficial to students who might struggle with using databases to find articles.
“A lot of the research can be drudgery, but if we have AI streamline a lot of the process, it might be a scaffold for a lot of students,” Barker said. “There’s definitely a place for AI to replace the more mundane aspects of conducting research, but you still have to use critical thinking when you get the content.”
In the classroom, AI has proven capable of taking on roles traditionally reserved for critical thinking. While this can streamline workloads, it also raises important questions about the long-term impact on students’ development of problem-solving skills.
“We need to find a way for AI to do less,” librarian and former English teacher Gordon Jack said. “We have to figure out how to take advantage of everything that it can do, but not so much that you’re not doing the important work anymore.”
Kanda and Hannon stress the importance of practicing these skills because they ultimately build critical-thinking ability.
“In our world today, everything is about fast thinking,” Kanda said. “Reading is a slow activity that requires understanding the situation of a character. One of the most important human skills is empathy, and if we just use AI to do that thinking, then we’re not working on an important skill.”
“The ability to communicate effectively what we’re thinking, and have people understand what we mean, is so important,” Hannon said. “Just because a computer can do those things doesn’t mean that they are no longer important. We’ve lost if we just say we’re going to outsource that from now on.”
The challenge, then, is not just to efficiently incorporate AI into the classroom, but to ensure that students still develop the critical, creative, and empathetic skills that are crucial beyond the classroom.
Looking to the future: Our trajectory and its implications
It’s been nearly two years since generative AI tools were introduced to the public. AI misuse is already a problem — and there’s been minimal guidance for teachers and students, leaving them to deal with the issue on their own.
“Most school districts don’t have their arms around this,” Barker said. “We had a brief panel discussion last year, but haven’t spoken about it in a school-wide manner at all — in terms of weaving it into the curriculum, it’s really been left to the teachers.”
“Not a single teacher of mine has talked about AI usage in their class this year,” said Ava. “Either they’re ignoring how students are using it or they just don’t know it’s happening, but either way, it’s concerning.”
Every teacher has a different philosophy regarding education and AI usage. As students navigate between classes, they’re faced with inconsistent expectations. While one teacher may utilize AI tools in their class as a means to enhance creativity and efficiency, another might prohibit any use.
“If a student’s going from one teacher to another, and suddenly using AI is ethical in one class and not ethical in the next, that’s going to cause unnecessary disruption,” Barker said.
“We’re doing students a disservice by not being more clear about the boundaries of AI,” Kanda said. “When the boundaries aren’t set, it’s not a healthy situation to learn in.”
Without adequate support on assignments, students can be further inclined to turn to AI tools to bridge gaps in their learning. And while AI can serve as a powerful assistant, the lack of clear guidance and consistency has raised concerns about the future of education in a technology-driven age. Some educators, like Smith, stress the importance of creating an environment where students feel supported enough to prioritize understanding over mere completion of assignments.
“I can guide you through any assignment I give you, because for one, I designed it, and two, because I’ve done it,” Smith said. “As your teacher — as your tutor, primarily — let me help you and guide you through assignments with integrity.”
Additionally, teachers are finding ways to discourage AI use on assignments in general. One approach is stiffening penalties for academic integrity violations.
“I’m trying to protect those students who have academic integrity and aren’t willing to take the shortcut,” Smith said. “When I do have an academic violation, I tend to be pretty hard on that, because it’s not fair to the kid who did the assignment.”
Even at this early stage, there is also concern that AI usage could exacerbate technology gaps between students.
“People who are very astute at using tech tools, including AI, are going to move forward at a faster rate than the people who aren’t using those,” Jack said. “Just as we tried to help with information literacy and tech literacy, we need AI literacy to teach students how to use it in school and real life.”
From some students’ perspectives, the increasing reliance on AI has shifted the focus of education away from learning and toward merely staying afloat.
“AI has made school as a whole feel fake,” said Jane. “It becomes less about knowing how to write or comprehend or analyze, and more about who can tweak AI writing the best. It’s a relentless cycle of teachers expecting AI quality work and students feeling as if they have no choice but to use AI to keep up with the expectations.”
Teachers, too, are grappling with the pressures of integrating AI into the learning process. The time-consuming nature of policing AI usage has significantly impacted their experience.
“Quality and job satisfaction for me as a teacher has been affected in a bad way,” Kanda said. “These situations take a lot of time, and I didn’t become a teacher to police and question if someone’s done their work.”
“What is the role of the teacher in this situation?” Jack said. “It’s an existential crisis for us. That may explain some of our reluctance to move forward on AI usage as a district, because the easier thing to do is just say you can’t use it.”
The hesitation reflects a deeper uncertainty about the role of educators in an age where technology can increasingly fill students’ gaps in knowledge. The challenge lies in positioning AI as a tool that complements, rather than replaces, human instruction.
“When does AI stop being a tool and start becoming something that gets in the way of us practicing important skills?” Hannon said. “It’s just like how if you use a calculator all the time, your brain becomes lazy at arithmetic. At the same time, I’m excited about the possibilities that are available if we figure out how to use AI in a way that helps and empowers students.”
Outside the context of education, the broader implications of AI raise alarms. To Barker, this highlights a crucial question: what happens when we begin to outsource not just basic tasks, but our agency and judgment, to AI? The temptation to rely on such tools increases the risk of diluting creativity.
“My real fear is when we cross the line from deciding what we want AI to do for us, into a place where we outsource our decisions to AI,” Barker said. “LLMs tend to drive all their answers toward the mean, the average, the composite of what ‘people like me’ must want. What will that do to our sense of self?”
As a school, we need to have an honest conversation about the role of AI in classrooms, one that goes beyond drawing lines between acceptable and unacceptable AI use to finding solutions that preserve the core purpose of education: empowering students to think, question, and grow as individuals.