Task force evaluates generative AI technology in the classroom
As artificial intelligence grows in popularity, college campuses across the nation are weighing how best to integrate traditional education with constantly advancing technology.
In response to the release of ChatGPT in November 2022, faculty members formed an AI task force.
According to its position statement, the task force's goal is to “create a safe and supportive environment where both students and employees can leverage AI technology to enhance their educational and professional journeys while maintaining our institutional values.”
At first, the task force focused on understanding how AI works and how it might jeopardize academic integrity, said Pete Folliard, dean of the School of Music, who co-chairs the AI task force with Carl Olimb.
Using ChatGPT Teams licenses, the task force leads training sessions on building custom GPTs for faculty, staff and administrative leaders. Approximately 50 people from multiple departments have been trained so far, including academic affairs, the student success center, the registrar, strategic marketing and communications, and natural science.
“We need to move into a new phase where we’re not just going to talk about AI anymore. We’re going to do AI,” Folliard said.
Giving ChatGPT clear instructions, Folliard said, is like asking an assistant to do a specific task; it can work well if you program it correctly.
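In programming terms, that “assistant” framing roughly corresponds to a standing system instruction that sets up the model before it sees a specific request. The sketch below, written against OpenAI’s Python library, is an illustration of the idea only; the model name, prompts and tutoring scenario are hypothetical, not the task force’s actual configuration.

```python
# A minimal sketch of "programming" an AI assistant with clear instructions.
# Illustrative only: the model name and prompts are hypothetical examples,
# not the task force's actual setup.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        # The system message acts like the standing instructions given to an assistant.
        {
            "role": "system",
            "content": (
                "You are a study tutor for an undergraduate music theory course. "
                "Ask guiding questions instead of giving answers outright."
            ),
        },
        # The user message is the specific task the assistant is asked to do.
        {"role": "user", "content": "Help me review secondary dominants."},
    ],
)

print(response.choices[0].message.content)
```

The clearer and more specific the standing instructions, the more consistently the model behaves as the assistant it was asked to be.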
For students, AI may serve as a tutor they can study with or an adviser they can direct questions toward, Folliard explained.
Director of Instructional Technology and AI task force member Sharon Gray predicts that this new technology will eventually become like today’s calculator.
Senior Andrew Berntson is a computer science and biology major, and he isn’t sure he would trust ChatGPT to help him study.
“It’s often very wrong and will make stuff up,” Berntson said.
It’s up to the user to double-check AI’s work.
Students have turned in essays completely generated by ChatGPT to Professor of English Beth Boyens. It does not take her long to realize this, especially when she comes across quotes that do not exist in her assigned readings.
Berntson would rather trust his own judgment when solving a math problem; however, he is not opposed to consulting ChatGPT when struggling with writer’s block. It provides what he calls “a good place to start.”
Associate professor of English and AI task force member Sarah Rude agrees.
Rude is currently teaching two sections of ENGL 200, in which students use ChatGPT directly to help construct academic writing. She has found that ChatGPT helps students organize their essays.
“Right now, the field is undecided about the things we value when writing a paper,” Rude said. “What are the skills that we hope that our students come away with? And I think different people answer those questions differently.”
Rude teaches her students to make essay outlines using AI, but she also wants them to appreciate the time spent in class brainstorming. In her classes, students are expected to write their thesis statements before consulting AI like ChatGPT.
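That sequence, thesis first and outline second, can be pictured as a two-step script. The example below is purely illustrative and assumes OpenAI’s Python library; the thesis and prompt wording are made up and are not Rude’s assignment.

```python
# A sketch of the thesis-before-outline workflow described above.
# Illustrative only; not Rude's actual assignment or prompt wording.
from openai import OpenAI

client = OpenAI()

# Step 1: the student writes the thesis themselves, before any AI involvement.
thesis = "Campus AI policies should be set course by course rather than campus-wide."

# Step 2: only then is the model asked to suggest an outline built around that thesis.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "user",
            "content": (
                "Here is my thesis statement:\n"
                f"{thesis}\n"
                "Suggest a three-part outline that supports it. "
                "Do not rewrite the thesis or draft any paragraphs."
            ),
        }
    ],
)

print(response.choices[0].message.content)
```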
For Rude, the critical thinking that AI should not replace happens in the planning and drafting stages rather than in outlining, but not all professors agree.
When Rude shared her thoughts with Janet Blank-Libra, head of the journalism department, Blank-Libra “almost fainted.”
“In writing the outline, I prove to myself that I know what I’m doing,” Blank-Libra said. “How are we to learn to think better if we don’t practice the act and art of doing so?”
Boyens shares similar sentiments. She understands that there are ethical ways of using AI in a writing course, but since she isn’t sure what that looks like yet, she doesn’t allow students to use ChatGPT in her classes.
“From my point of view and the classes that I teach, I want students to think critically and generate their own ideas,” Boyens said.
Gray values the functionality of AI but also worries about the lessons it might take away from students.
“Here at Augustana, we are teaching students to be liberally educated, critical thinkers, creative writers and to find their own voice,” Gray said.
ChatGPT may be able to write well, but Gray feels that there is a difference between writing a well-structured essay and having a strong writing voice. In the age of generative AI, continuing to foster critical thinking and creative writing skills is increasingly important, Gray said.
“This is why I think Augie is in a particularly important position to form the people going out into the world because they’re going to need to think about these things, not just from an efficiency standpoint, but from a critical and ethical standpoint,” Gray said.
Folliard hopes that the new general education plan will include a digital literacy component that teaches students to use digital tools and maintain ethical integrity.
How much AI is integrated into classrooms depends on the professor. A student may be able to use ChatGPT to build an outline in one class but not use it at all in another.
As Folliard puts it, “There are many ways to slice the pie.”
The AI task force is assessing both the opportunities and the risks of AI.
“Let’s not put our heads in the sand like this isn’t happening, that these tools aren’t out there,” Folliard said. “My charge to everybody is, make your decision, but make an informed decision.”