The sudden rise of ChatGPT — the AI-powered chatbot released by OpenAI last fall — has educators across the country rethinking how they teach and assess student work. The University of Washington is among them, providing guidance to its faculty on how to navigate technology’s latest impact on education.
The UW’s Center for Teaching and Learning, which supports the advancement of the school’s teaching community, has issued strategies for instructors aimed at helping them communicate with students, set expectations, and develop assignments in the age of ChatGPT and other AI-based tools.
The strategies fall into six categories: set expectations; communicate the importance of college learning; acknowledge that struggle is part of learning; assess process as much as (or more than) product; design assignments that ask students to connect course content, class conversations, and lived experience; and consider teaching through AI-based tools.
ChatGPT has generated equal measures of intrigue and concern with its ability to quickly answer complicated questions and instantly produce content — including software code and student essays.
The bot builds on GPT natural language technology developed by OpenAI, the San Francisco-based organization co-founded by tech leaders Sam Altman and Elon Musk and backed by Microsoft, which helps power the back-end cloud computing for OpenAI products.
The New York Times reported this month on the potential “huge shift” in teaching and learning as educators at all levels react to the rise of AI and in some instances redesign courses entirely to stay ahead of the technology and students trying to use it.
The Times reported that at some universities, professors are phasing out take-home, open-book assignments that seem vulnerable to chatbots, opting instead for in-class assignments, handwritten papers, group work and oral exams.
Seattle Public Schools joined a growing number of school districts banning ChatGPT on all school devices, saying that the district “does not allow cheating and requires original thought and work from students.”
But while acknowledging the logistical and ethical questions, the UW says that such technology can be “a vital part of advancing knowledge.”
“In the classroom, AI-based tools like ChatGPT have the potential to either advance learning or shortchange students,” UW spokesperson Victor Balta told GeekWire. “Our instructors are exploring how AI-based tools can be used to facilitate learning and help students think critically about digital literacy and the accuracy of information.”
Balta acknowledged that at the same time, “students who use AI-based tools as shortcuts to complete assignments shortchange themselves.”
The UW says that because ChatGPT is constantly evolving, its use is very difficult to detect, but faculty are paying careful attention to student performance and have reported that some students “seem to be using the tool to complete work.”
The Center for Teaching and Learning says that instructors who prohibit the use of AI-based tools, such as ChatGPT, and suspect that a student has engaged in academic misconduct, can make a report to Community Standards and Student Conduct.