A few weeks ago, my high school chemistry class sat through an “AI training.” We were told it would teach us how to use ChatGPT responsibly. We worked on worksheets with questions like, “When is it permissible to use ChatGPT on written homework?” and “How can AI support and not replace your thinking?” Another asked, “What are the risks of relying too heavily on ChatGPT?”
Most of us just used ChatGPT to finish the worksheet. Then we moved on to other things.
Schools have rushed to regulate AI based on a hopeful fiction: that students are curious, self-directed learners who’ll use technology responsibly if given the right guardrails. But most students don’t use AI to brainstorm or refine ideas — they use it to get assignments done faster. And school policies, built on optimism rather than observation, have done little to stop it.
Like many districts across the country, our school's policy encourages students to use ChatGPT to brainstorm, organize and even generate ideas — but not to write. If we use generative AI to write the actual content of an assignment, we're supposed to get a zero.
In practice, that line is meaningless. Later, I spoke to my chemistry teacher, who confided that she'd started checking the Google Docs histories of papers she'd assigned and found that huge chunks of student writing were being pasted in: AI-generated slop, dropped in all at once with no edits, no revisions and no sign of actual work. "It's just disappointing," she said. "There's nothing I can do."
In Bible class, students quoted ChatGPT outputs verbatim during presentations. One student projected a slide listing the Minor Prophets alongside the sentence: "Would you like me to format this into a table for you?" Another spoke confidently about the "post-exilic" period — having earlier that week mispronounced "patriarchy." At one point, Mr. Knoxville paused during a slide and asked, "Why does it say BCE?" Then, chuckling, answered his own question: "Because it's ChatGPT using secular language." Everyone laughed and moved on.
It’s safe to say that in reality, most students aren’t using AI to deepen their learning. They’re using it to get around the learning process altogether. And the real frustration isn’t just that students are cutting corners, but that schools still pretend they aren’t.
That doesn’t mean AI should be banned. I’m not an AI alarmist. There’s enormous potential for smart, controlled integration of these tools into the classroom. But handing students unrestricted access with little oversight is undermining the core purpose of school.
This isn’t just a high school problem. At CSU, administrators have doubled down on AI integration with the same blind optimism: assuming students will use these tools responsibly. But widespread adoption doesn’t equal responsible use. A recent study from the National Education Association found that 72% of high school students use AI to complete assignments without really understanding the material.
“AI didn’t corrupt deep learning,” said Tiffany Noel, education researcher and professor at SUNY Buffalo. “It revealed that many assignments were never asking for critical thinking in the first place. Just performance. AI is just the faster actor; the problem is the script.”
Exactly. AI didn’t ruin education; it exposed what was already broken. Students are responding to the incentives the education system has given them. We’re taught that grades matter more than understanding. So if there’s an easy shortcut, why wouldn’t we take it?
This also penalizes students who don’t cheat. They spend an hour struggling through an assignment another student finishes in three minutes with a chatbot and a text humanizer. Both get the same grade. It’s discouraging and painfully absurd.
Of course, this is nothing new. Students have always found ways to lessen their workload, like copying homework, sharing answers and peeking during tests. But this is different because it’s a technology that should help schools — and under the current paradigm, it isn’t. This leaves schools vulnerable to misuse and students unrewarded for doing things the right way.
What to do, then?
Start by admitting the obvious: if an assignment is done at home, it will likely involve AI. If students have internet access in class, they’ll use it there, too. Teachers can’t stop this: they see phones under desks and tabs flipped the second their backs are turned. Teachers simply can’t police 30 screens at once, and most won’t try. Nor should they have to.
We need hard rules and clearer boundaries. AI should never be used to do a student’s actual academic work — just as calculators aren’t allowed on multiplication drills or Grammarly isn’t accepted on spelling tests. School is where you learn the skill, not where you offload it.
AI is built to answer prompts. So is homework. Of course students are cheating. The only solution is to make cheating structurally impossible. That means returning to basics: pen-and-paper essays, in-class writing, oral defenses, live problem-solving, source-based analysis where each citation is annotated, explained and verified. If an AI can do an assignment in five seconds, it was probably never a good assignment in the first place.
But that doesn’t mean AI has no place. It just means we put it where it belongs: behind the desk, not in it. Let it help teachers grade quizzes. Let it assist students with practice problems, or serve as a Socratic tutor that asks questions instead of answering them. Generative AI should be treated as a useful aid after mastery, not a replacement for learning.
Students are not idealized learners. They are strategic, social, overstretched, and deeply attuned to what the system rewards. Such is the reality of our education system, and the only way forward is to build policies around how students actually behave, not how educators wish they would.
Until that happens, AI will keep writing our essays. And our teachers will keep grading them.
•••
William Liang is a high school student and education journalist living in San Jose, California.
The opinions expressed in this commentary represent those of the author. EdSource welcomes commentaries representing diverse points of view. If you would like to submit a commentary, please review our guidelines and contact us.
Comments (5)
James, 1 month ago
This article provides no answers or solutions; it's just complaining. The sad reality is AI is too good now. AI detectors like Turnitin, which is used by almost every major educational institution, can be easily defeated by an AI "humanizing" a paper, so long as the user removes footers and prompts from the AI. You want real learning to occur? Then you don't pawn it off onto homework; you do it in class, with school internet firewalls that block all AI sites but still allow research to be done.
Worried about AI apps on phones? Easy day: seeing phone use is easy, and since phones are banned in most classes, you confiscate the ones you see out. Not permanently, just for the remainder of class; they can sit on your desk up front.
Replies
Wess, 4 weeks ago
I don’t think it’s the responsibility of a high school student to find solutions to one of the most difficult challenges educators have had to face. This student is offering an honest perspective that has a lot of value and is at least attempting to define the problem.
Your solutions seem to be to continue doing whatever was being done before, but with more policing and more repression, essentially pretending you can will the problem away. I think you're overlooking many aspects of the issue, particularly that: 1. Engagement is the most critical aspect of instruction, and it's very unlikely to be sustained in a repressive environment. And 2. Your approach completely ignores that we now have a tool that can allow students to achieve things we couldn't dream of before if it's used intelligently. It's opening so many doors to students who understand its potential, but they likely won't learn that in a class such as you describe.
You can pencil-and-paper it and even use cell phone scramblers all you want, sir, but please remember: if they’re not learning, you’re not teaching.
Mark Loundy, 1 month ago
I love this and I’m going to be sharing it with one of our school board members tomorrow. You’re right that AI has exposed a broken system, but the answer is not to revert to pen and paper. That would be giving up. The answer is to truly change education by moving into an experiential, constructionism mode, and away from instructionism. That would make AI “cheating” a non-sequitur.
Replies
Antoine, 1 month ago
I don't agree. Original and constructive work has its place in school, but realistically it won't ever make up the bulk of learning. Every art has its drills, and that's part of learning, building your brain. Pen and paper is fine. The only downside is organisations have to accept learning requires staff and time. The investors' dream of having 80% of learning done at home on a computer to cut costs is dead. You need people to talk, read, correct, etc.
Tom Courtney, 1 month ago
William, this was an articulate and well-written commentary for which I am very grateful. After the pandemic, many pieces were written to highlight the new uses of technology.
Meanwhile, the experience in my class, and that which I tried to share, was that human beings need one another in ways we should take note of in that moment.
With the rise of generative AI and so many folks lining up to make dollars, I think we're also not seeing something. You named it by sharing exactly what you see in class. As a teacher, I see it too. We must be careful what we call outdated or old-fashioned for the sake of new and shiny things. And there is an element that, without proper checks, students will use AI in ways we can all predict. Learning still needs to happen. Thank you for reminding us of this.
I hope those policymakers and state leaders reading this fine commentary are taking note of what this articulate and honest student is sharing with us all in this moment.
Keep writing William!