
AI at UR: Faculty and staff seize new learning opportunities

April 17, 2023

Research & Innovation

The rapid emergence of AI-based chatbot technology has led faculty, staff, and students to investigate its potential benefits — and its misuses. The widely used ChatGPT, in particular, can quickly produce various types of writing based on prompts from the user, including poetry and prose, as well as computer code.

Faculty are encouraged to determine a policy for AI-generated text and communicate that policy and expectations with students in their syllabi.

“In teaching, there was an initial concern about these chatbots impacting the academic integrity of the courses taught here,” said Andrew Bell, technology consultant and operations manager at the Teaching and Scholarship Hub. “We are still in the early days of these tools, but so far most of our conversations with faculty have been about how the tools can help students achieve their learning objectives.”


Some of the potential chatbot uses at UR, Bell said, include brainstorming ideas for discussion topics, summarizing student drafts, and developing new writing prompts.

Bell also said students need to realize that, while AI chatbots are trained to create text that looks like it was created by a human, the results can be flawed. “They aren’t oracles of truth,” he said. “And their output needs to be treated with skepticism.”

Joe Essid, director of the Writing Center, also sees potential opportunities for students to get assistance from AI.

“These tools may prove a starting point for baseline knowledge and locating sources,” Essid said. “Like Wikipedia, once shunned by academics, AI chatbots may come to be a source of help. We need something simpler than the search techniques in our current databases.”

ChatGPT can also help writers improve their essays by modeling structure, for example, or correcting grammar.

Essid says AI could assist students in learning to avoid repetition, vary sentence lengths, and perform other style checks. “That said, I don’t know when and how the software may cross the line from ‘helper’ to ‘doer.’ That would be dangerous indeed, ethically and pedagogically.”


In addition to writing, students — and alumni — may make use of AI chatbots in their career searches. Elizabeth Soady, associate director of professional development for Arts & Sciences, points out that the use of artificial intelligence in the career services world is not new. AI helps hiring managers find job candidates, and Career Services uses an AI tool called VMock to help Spider job seekers improve their resumes.

“AI can be a great starting place when looking for answers, and I see that as one of its uses at UR,” Soady says. “Whether a student needs help finding a resource, solving a problem, or getting connected, AI tools can provide instant answers.”

However, Soady advises students and grads to avoid relying on the technology exclusively.

“ChatGPT should not be used as a substitute for traditional career advising, as it does lack the human element that students can gain from meeting with an adviser.”

Shital Thekdi, associate professor of analytics and operations in the Robins School of Business, researches risk assessment and is helping students investigate AI technologies that are likely to influence their future professions.

“It is essential that we all know when to trust the information from these AI technologies and when to disregard or override these technologies by leveraging knowledge, experience, and ethical reasoning that students learn in the classroom at UR,” Thekdi said.

In her analytics courses, Thekdi discusses the credibility of various types of analysis, and analyses created by AI can lead to some interesting discussion topics.

“These algorithms are generally trained to fill in gaps when they don’t have sufficient knowledge to answer a question prompt,” she said. “Similarly, they fail to disclose sources and uncertainties when there is limited knowledge used to reinforce stances. This can be dangerous in high-stakes situations and can lead to the sharing of misinformation and poor decision-making.” 

AI tools also reinforce the need to quickly adapt in business, she said, since methods and practices need to be adjusted on the fly to respond to unforeseen events.

“The core of business principles remains tested and constant, but flexible enough for us to adapt to new technologies and the constantly changing environment,” she said. “We all need to remain well-versed in these changes.”