Quick Take

Some local teachers are changing up assignments and returning to handwritten work as artificial intelligence tools threaten students' critical thinking development. But Santa Cruz County educators also see opportunities for AI to help students and teachers alike.

For years, Santa Cruz High School world history teacher Stacy Newsom Kerr has tested her students’ creative thinking skills by assigning them the same project: write a television script about the Enlightenment, using historical documents from the time.  

In the past, the unique nature of the assignment made it hard for students to cheat by plagiarizing or purchasing a copy of an existing paper. But as artificial intelligence quickly weaves its way into our daily lives, Newsom Kerr is finding that even her TV script assignment is no longer immune to the forces of modern technology. Now, AI can be prompted to produce work convincing enough to meet the assignment’s requirements.

With another school year underway across Santa Cruz County, local teachers say the widespread availability of AI tools and AI-generated content threatens to shift how they approach their curricula, assign homework and tackle the difficult task of imparting critical thinking skills. They say AI has already changed how students learn, think and reason.

Before OpenAI popularized ChatGPT — one of the first generative AI chatbots able to answer questions in fluent prose, handle follow-ups, admit mistakes and challenge premises — it would have been hard to imagine students outsourcing their thinking completely.

These days, it has become increasingly difficult to convince students that research without AI is essential, said Newsom Kerr. To discourage AI use, she and some of her colleagues have started having students handwrite papers and have moved required readings from digital to print. As the technology grows more capable, Newsom Kerr worries it will become even harder to persuade students to bring their own creativity and critical thinking to class assignments.

“As a teacher, I want you to be able to think. I want you to be able to understand the biases in the source,” she said. “I want you to be able to read something online, and go, that’s bulls–t.”

It’s easy to assume that the information we receive from AI chatbots is accurate and reliable. But AI technologies are making it challenging to distinguish what’s true from what’s false. This poses enormous problems for students, teachers say, even when they attempt to use the programs not to write their papers, but to learn about a subject or find sources for a project. 

“You ask a question, you get an answer, and you think, Oh, my God, this is amazing. And you read the answer, and it sounds like a well-written language,” said Richard Sonnenblick, Santa Cruz resident and chief data scientist at Planview, a business consulting firm that implements AI technologies. “But if you look a little bit more closely, then you realize that it’s created a title of a book and an author for that book that doesn’t actually exist.” 

AI consumes all information it has access to – both the “best and worst of humanity,” as Newsom Kerr describes it. 

AI models are built in stages, beginning with programmers feeding vast amounts of text into the system, said Sonnenblick. From that text, a model learns which words are likely to follow one another. But current models cannot distinguish a sequence of words that must appear exactly as written, like the title of a real book, from words that merely tend to show up together in similar contexts. That helps explain why AI models will sometimes generate made-up book titles in response to a prompt.
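Sonnenblick’s description can be illustrated with a toy sketch. The Python below is a deliberately simplified, hypothetical word-probability model — none of the words, titles or probabilities come from the article or any real system. It strings words together purely by how often they follow one another, so its output can just as easily be the title of a real book as of one that was never written.

```python
import random

# A toy bigram "language model": for each word, the words that tend to
# follow it, with made-up probabilities. (Hypothetical data for
# illustration; real models learn billions of such statistics from text.)
bigrams = {
    "The":     [("Age", 0.4), ("History", 0.35), ("Spirit", 0.25)],
    "Age":     [("of", 1.0)],
    "History": [("of", 1.0)],
    "Spirit":  [("of", 1.0)],
    "of":      [("Reason", 0.5), ("Enlightenment", 0.3), ("Europe", 0.2)],
}

def sample_next(word: str) -> str:
    """Pick a likely next word; the model knows probabilities, not facts."""
    words, probs = zip(*bigrams[word])
    return random.choices(words, weights=probs)[0]

def generate_title(start: str = "The", length: int = 4) -> str:
    """Chain probable words into something shaped like a book title."""
    title = [start]
    while len(title) < length:
        title.append(sample_next(title[-1]))
    return " ".join(title)

# "The Age of Reason" is a real book; "The Spirit of Europe" may not be.
# The model scores both as perfectly plausible; it has no way to tell.
print(generate_title())
```

Production models are vastly larger and work on sub-word tokens rather than whole words, but the underlying failure mode is the one Sonnenblick describes: fluent, plausible output with no grounding in whether the thing it names actually exists.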

During development, programmers and researchers tune the model to produce more accurate answers, Sonnenblick said. But it is unlikely that AI models will ever reach a point where there is no risk of generating false content, like inventing a book that doesn’t exist.

This makes generative AI programs, which students increasingly rely on, prone to producing and spreading misinformation. Newsom Kerr pointed to Grok, the chatbot for X (formerly Twitter), which in early July went on a now-deleted hate speech rant.

Newsom Kerr doesn’t place all the blame on students for AI’s spread into classrooms. It’s a combination of factors, she said. High-quality academic research platforms are being put behind paywalls, while AI programs are often free and plentiful. At the same time, students grew used to far more lenient consequences for cheating on tests and assignments during the pandemic. The result, she said, is a cohort of students who tend to seek out the easier path rather than challenge themselves.

Santa Cruz High School. Credit: Kevin Painchaud / Lookout Santa Cruz

Some students echo those worries, including the concern that teachers themselves might not be able to tell the difference between an assignment a student actually wrote and one generated with AI.

Julia Kennedy, a senior at Santa Cruz High and Lookout intern, said teachers are overestimating how much AI-generated content they’re able to catch: “It kind of sucks because I’ll put a lot of work into the paper or an assignment, and then someone says, ‘Oh, I just use ChatGPT,’ and then we get the same grade.”

Newsom Kerr notes that even AI “checkers,” such as Turnitin, a detection tool for educators, are often unreliable, leaving her to rely on her 15 years of teaching experience to spot plagiarism.

“Technology is moving really quickly,” said Matt Bruner, co-president of the Greater Santa Cruz Federation of Teachers. From the union’s perspective, when students submit work that a teacher is confident was completed with AI, it creates a gray area: How do we determine whether it is actually the student’s work, Bruner asked.

Sam Rolens, communications officer for Santa Cruz City Schools, said the district is working to balance the risks with the opportunities presented by AI, including how the technology can help teachers develop assignments and encourage students to express their thoughts and perspectives. 

Last year, Santa Cruz City Schools adopted a publicly available policy outlining the expectations for AI use.

“I think that we’re ahead of the curve in general compared to most of the school districts that we encounter,” Rolens said. 

The policy allows students to use AI for brainstorming, translation or creating digital images, provided they cross-reference and properly cite AI-generated content. It further specifies that misuse, including plagiarism, deepfakes or harassment, carries disciplinary consequences. Students must sign a pledge committing to responsible use.

The policy also encourages staff to use AI for educational planning and administrative work. Still, the district notes that AI cannot replace a teacher’s judgment, particularly when it comes to grading. Staff are prohibited from entering personal student information into AI systems and should exercise caution when using non-approved AI platforms, such as ChatGPT, due to privacy concerns.

Rolens pointed to tools like MagicSchool, an AI assistant designed specifically for educators that helps with tasks such as lesson planning, assessments and parent-teacher communication. He described the tools the district is implementing as focused on saving teachers time on mundane tasks, such as assigning table groups.

Other schools have taken their use of AI even further. Some, such as Alpha School, whose Austin, Texas, location was recently profiled by The New York Times, are experimenting with replacing teachers in the classroom with AI tools. The school uses AI tutors that work individually with students for only two hours a day, boasting that students learn twice as fast in this setting. The rest of the day is spent on other practical lessons that incorporate both AI and human guides who are not licensed teachers. 

Alpha School is now expanding to more locations across the U.S., including a new campus in San Francisco that is accepting enrollment for this fall. In late April, President Donald Trump also signed an executive order promoting more widespread use of AI in the classroom.

Rolens said he is skeptical of that approach. A student’s success is defined by the connection they have with their teachers, he said, making clear that the district has no interest in going down a similar path.

As Newsom Kerr welcomes another group of students to her world history class this month, she still worries that it’s a slippery slope when it comes to AI. Sometimes it can be helpful, but at other times it can rob students of the opportunity to strengthen their abilities, she said. 

“Education is very willing, usually, to jump on the next sort of, especially tech bandwagon. We leapfrog. And oftentimes, whatever the fancy shiny thing of the month is, it goes away, and then you leapfrog to the next,” said Newsom Kerr. “But that’s not going to be AI. It’s here.”
