CU Denver Panel Discusses the Role of ChatGPT in Higher Education

February 14, 2023

A machine that can write an academic paper? A computer that can draft a novel? Technology that can generate an image of an event that never happened? It might sound like the plot of a blockbuster movie, but thanks to recent advances in machine learning and other forms of artificial intelligence (AI), it is now a reality.

These technologies—including OpenAI’s ChatGPT, which was released in November 2022—are dramatically changing the way we live, work, and learn. ChatGPT is a large language model (LLM). LLMs are deep learning models, a subset of machine learning, that can identify, condense, translate, predict, and produce text (along with other forms of content). LLMs can do this because they are “trained” on massive datasets, which in the case of ChatGPT consist of enormous swaths of text from the internet.
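To make the idea of “predicting and producing text” concrete, here is a minimal, purely illustrative Python sketch. It is not how ChatGPT works internally (real LLMs use neural networks with billions of parameters trained on far larger data), but it captures the basic principle of learning statistical patterns from training text and using them to guess the next word. The tiny corpus and the predict_next function are hypothetical examples invented for illustration.

```python
# A toy "language model": learn next-word frequencies from a tiny training
# corpus, then predict the most likely next word. (Hypothetical example;
# real LLMs like ChatGPT use large neural networks, not frequency tables.)
from collections import Counter, defaultdict

# "Training data": a tiny, made-up corpus.
corpus = "students write papers and students write essays and students read papers".split()

# "Training": count which word tends to follow each word.
next_word_counts = defaultdict(Counter)
for current_word, following_word in zip(corpus, corpus[1:]):
    next_word_counts[current_word][following_word] += 1

def predict_next(word):
    """Return the most frequently observed follower of `word`."""
    followers = next_word_counts.get(word)
    if not followers:
        return "<unknown>"
    return followers.most_common(1)[0][0]

print(predict_next("students"))  # "write" (seen twice, vs. "read" once)
print(predict_next("papers"))    # "and" (the only observed follower)
```

ChatGPT applies the same learn-then-predict idea, but with a far more sophisticated model and at a vastly larger scale.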

The technology has prompted swift legislative attention, numerous headlines, and fervent debate. At universities, some people have expressed concern about AI’s potential to undermine the education system or contribute to academic dishonesty if students use the technology in place of their own thoughts, opinions, and research. Others highlight potential positive uses.

On Feb. 7, Crystal Gasell (EdD, Director of Academic Technology and Training) of The Division for Teaching Innovation and Program Strategy (TIPS) and Dennis Debay (PhD, Clinical Assistant Professor, STEM Education, SEHD) of ThinqStudio brought CU Denver faculty and staff together for an overview of ChatGPT and other recent AI advances in the context of higher education. The discussion, led by Cameron Blevins (PhD, Associate Professor CCT, History, CLAS), featured a panel of university faculty: Ashis Biswas (PhD, Assistant Professor, Computer Science & Engineering, CEDC), Drew Bixby (CU Denver Writing Center), and Rachel Stein (PhD, Clinical Assistant Professor, SEHD).

The panelists took questions and discussed their experiences with ChatGPT, illustrating the issues and opportunities offered by this technology within their areas of expertise. Attendees were also encouraged to share their thoughts on the topic. 

Questions included: “What are opportunities you see for this technology within your specific discipline in terms of teaching and/or research?” and “What are the major concerns you have?” Others spurred dialogue about our community, such as “What is your department or unit doing, if anything, to address this?” and “How do you think CU Denver should address this technology from an institutional level?” 

Important Highlights of the Event

Overall, the panelists approached the technology with curiosity, examining potential uses while urging caution in how emerging technology is implemented. Questions often returned to students: “What do we want students to actually learn?” Some key takeaways from this discussion are as follows:

  • Busywork. Panelists highlighted ChatGPT’s potential for lessening the burden of paperwork or busywork. They hypothesized that the technology may be able to expedite or even automate some of this work, freeing time for higher-importance tasks. In the words of one panelist: “ChatGPT may create efficiencies for writing tasks that aren’t reflections of deep cognition, creativity, or humanity.”
  • Ideation. While not a perfect source of information, panelists conjectured that students could use ChatGPT for inspiration. They saw opportunities to use the technology to brainstorm and pose questions, which could help students develop their own ideas.
  • Equity. Panelists noted ChatGPT’s potential to increase equity around grammar, syntax, and general writing skills. The technology can revise writing and produce content with fewer grammatical mistakes. While this capability could certainly be misused by students, it was also identified as a way to increase equity in particular academic situations.
  • Process. As ChatGPT can significantly lower the burden of work on a range of academic assignments, faculty could have the opportunity to place more weight on a student’s process rather than just the final product. This means delving further into questions such as “What was your strategy?,” “How did you come to that conclusion?,” and “What was your experience like?” 
  • Limitations. While lots of hype has been generated around ChatGPT and tools like it, clear limitations still exist. The technology often produces errors or “hallucinations” that require attentive oversight and fact-checking. One panelist said that ChatGPT “is just a better Google search engine,” and suggested that, while the technology is impressive, excessive worry about it is misplaced.
  • Misinformation. One danger of ChatGPT that was emphasized by panelists is the potential for the spread of misinformation. The model is trained on vast amounts of text data from the internet, including a significant amount of false or misleading information. If students use ChatGPT as a source of information, they could be exposed to inaccurate information, negatively impacting their understanding. 
  • Bias. Another danger of ChatGPT underlined by panelists is the potential for bias. The model’s training data reflects the biases of the individuals and groups who produced the text it was trained on. If these biases are not addressed, they can be reflected in the model’s responses, which could perpetuate harmful stereotypes or discriminatory attitudes.

Mark your calendars: There will be a follow-up event, titled “ChatGPT: Where Do We Go From Here?,” on Tuesday, Mar. 7 from 9 a.m. to 10:30 a.m. Register for the event or reach out to the TIPS office with questions and for further information.