By Razylin M. Avendano
Has the childish fantasy of a homework-doing robot finally become reality?
Possibly, and the emergence of the artificial intelligence chatbot ChatGPT has faculty and administrators at Southwestern College scrambling for boundaries.
ChatGPT is the latest artificial intelligence (AI) technology developed by San Francisco-based OpenAI. It is described as a “large language model” that uses an extensive natural language database to respond to user prompts in a conversational, human-like manner.
It learns much like a human does, too.
Dr. Joshua Davis, assistant professor of psychology, had experience with similar technology as a graduate student at the UCSD Language and Cognition Lab. He worked with early latent semantic analysis models which, like ChatGPT, operate by sampling a language database and figuring out relationships between words and how they come together, he said.
“ChatGPT and all these other models are based on cognitive psychology,” he said. “It relies on something called ‘reinforcement learning,’ where people will ask it questions and when it gives an answer people will say ‘yes that is good’ or ‘that is incorrect’ and (it learns) from what people have said.”
ChatGPT has become a topic of interest among academics because it can spit out essays, speeches or even programming code in a matter of seconds. Its responses can seem human-like, leading educators to worry that students will submit AI-written work as their own.
This concern has led some institutions like Oakland Unified School District to ban ChatGPT.
Other educators, however, argue that ChatGPT has potential deserving of consideration.
D’Angelo Silva, a business major, said cheating concerns are valid, but insisted that ChatGPT can be used by students as a legitimate tool to support learning.
“I have used it on some assignments where the questions (are) unclear and the professor did not explain it well in class,” he said. “When it is inconvenient to email them, I go to ChatGPT and ask it to reword (the question).”
Davis agreed, saying students can use ChatGPT to help them grasp challenging concepts. If a student struggles with a topic, they can ask the chatbot to explain it in a way they can understand.
Institutions of higher education should educate faculty and students on productive uses of ChatGPT, Davis said.
“I believe the more we can do to help students learn how to use this tool in a way that helps them learn and understand and achieve things in life, the better,” he said. “There are concerns about cheating, which are definitely warranted. But the world, I believe, is going to be fundamentally transformed by (tools like ChatGPT).”
The chatbot amassed 100 million monthly users within two months of its late-November launch, and Southwestern's Academic Senate decided to weigh in. At its March meeting, faculty expressed conflicting opinions about ChatGPT. A training workshop offered by the Professional Development Department showcased the learning and teaching potential of the chatbot. It was guided by Dr. Ryan Watkins, Professor of Educational Technology at the George Washington University.
Professional Development Coordinator Jonathan Henderson said faculty members have discretion on ChatGPT matters, though the college is likely to develop a policy defining what constitutes a breach of academic integrity.
Dr. Erika Behrmann, assistant professor of communication studies, teaches Oral Communications and plans to use ChatGPT as an educational tool in the classroom.
“I think that communication is always evolving and changing,” they said. “It is my job as an educator and scholar in the field to be okay with those changes and move along with them because they are going to happen either way.”
Behrmann acknowledged that educators in other fields of study might feel differently.
Behrmann expressed concerns over biases in the AI coding and the accuracy of the responses. For example, this technology — coded by White men — may generate “interesting responses” when writing speeches about the bodies of those assigned female at birth, Behrmann said.
Davis agreed, saying it is unwise to assume that all of ChatGPT's responses are accurate and unbiased. He also emphasized the importance of critical thinking.
“I think (ChatGPT) can be highly problematic if used incorrectly,” he said. “That is why I think it is important that we help students understand how it works, how they can use it and its dangers.”
In April the Academic Senate approved the use of ChatGPT in campus computer labs. Henderson said that regardless of how SC plans to regulate ChatGPT, students should be given the benefit of the doubt that they will use AI ethically.
“I think most students want to learn and they want to use ChatGPT in a way that will help them learn,” he said. “Any sort of policy that (SC) makes has to assume good intentions because our students are doing great.”