In her essay “The AI Takeover of Education Is Just Getting Started,” Lila Shroff argues that education has entered its Wild West phase, and she’s right in the way that makes administrators nervous and instructors tired. Our incoming college students are not stumbling innocents. They are veterans of four full years of AI high school. They no longer dabble in crude copy-and-paste plagiarism. That’s antique behavior. Today’s students stitch together outputs from multiple AI models, then instruct the chatbot to scuff the prose with a few grammatical missteps so it smells faintly human and slips past detection software. This is not cheating as shortcut; it is cheating as workflow optimization.
Meanwhile, high school teachers may be congratulating themselves for assigning Shakespeare, Keats, and Dostoevsky, but many are willfully ignoring the obvious. Students are using AI constantly—for summaries, study guides, feedback, and comprehension scaffolding. AI is CliffsNotes on growth hormones, and pretending otherwise is an exercise in institutional denial.
Educators, of course, are not standing outside the saloon wagging a finger. They are inside, ordering drinks. Shroff notes that teachers now use AI to design assignments, align curriculum to standards, grade against rubrics, and complete the paperwork that keeps schools legally hydrated. Nearly a third of K–12 teachers reported weekly AI use last year, and that number has only climbed as profession-specific tools like MagicSchool AI churn out rubrics, worksheets, and report-card comments on demand. The teacher as craftsman is quietly becoming the teacher as editor.
AI’s grip tightens most aggressively where schools are already bleeding resources. In districts short on tutors and counselors, AI steps in as a substitute for services that were never funded in the first place. It is not reform; it is triage. And once institutions get a taste of saving money by not hiring tutors and counselors, it is naïve to think that teaching positions will remain untouchable. Cost-saving rarely stops at the first ethical boundary it crosses.
That is why this feels like the Wild West. There is no shared map. Some schools welcome AI like a messiah. Others quarantine it like a contagious disease. Many simply shrug and admit they are baffled. Policy is reactive, inconsistent, and often written by people who do not understand the technology well enough to regulate it intelligently.
I see the consequences weekly in my college classroom. I read plenty of AI slop—essays with perfect grammar and no pulse, paragraphs that gesture toward ideas they never quite touch. Some students have clearly checked out, outsourcing not just sentences but thinking itself. And yet AI is also an undeniable equalizer. Students emerging from underfunded schools with sixth-grade literacy levels are submitting essays with clean syntax and logical structure. They are using AI to outline arguments, test thesis ideas, and stabilize skills they were never taught. The tool giveth and the tool holloweth out.
People like to invoke “too big to fail,” but the analogy is incomplete. We do not know which AI—ChatGPT, Gemini, Claude, or some yet-unseen contender—will dominate. What we do know is that AI is already embedded in education, culture, and the economy. There is no reversing this process. The toothpaste is not going back in the tube, no matter how sternly we lecture it.
So I tell my students the only honest thing left to say: we don’t know what we’re doing. Our roles are unsettled. Our identities are unstable. We are feeling our way through a dark cave without a map and without guarantees. There may be light ahead, or there may not. The only sane posture is humility—paired with curiosity, caution, and a sober gratitude that even a force this disruptive may yield benefits we are not yet wise enough to recognize.