Ember Duke | Layout Editor
In light of recent and emerging advancements in artificial intelligence, the university has updated its guidelines regarding the use of AI and generative technology.
The update serves as a set of parameters rather than a strict policy change, intended to give students and faculty space to learn about AI within the limits of the school’s academic integrity policy, said David Dausey, executive vice president and provost. The university felt a guide was better suited than a formal policy because of how quickly AI changes.
“We are really encouraging both faculty and students to embrace AI. It is something that is not going away,” Dausey said. “For many disciplines, and for many areas, in many fields, you might need to be conversant in the use of artificial intelligence to be successful in your career, so we want to be training people to use it.”
The guide recommends three instructional models for implementing AI in the classroom: an open, a moderate and a restrictive approach, all at the discretion of the professor.
“I want them [students] to be able to feel comfortable with the use of AI within the guidelines established by the faculty in each course. And this is a matter of academic freedom for the faculty, and so we are not prescribing to faculty,” he said. “I think that you know, at least having a general understanding of the tools, what they’re capable of doing, and how they might be utilized, is absolutely critical.”
A faculty-based committee helped design the guidelines. The committee plans to meet regularly throughout the academic year to stay current on the topic, and it also consulted students during the guide’s development.
Jeff Lambert, assistant director for educational development, philosophy professor and member of the faculty AI committee, said the major policy changes concern data usage.
“The idea here is they’re just making it an addendum to the data governance guidelines saying generative AI restricted data may not be used in conjunction with any generative AI tool that the university is not licensed and or contracted for use,” Lambert said.
One written policy change that applies to students is an addition to administrative policy No. 26. It states that “users of Generative AI must ensure that computing resources are safeguarded. In addition, users of Generative AI are responsible for the accuracy, privacy, regulatory compliance, and ethical use of content developed using generative AI tools or services and used in campus communications and documentation including email.”
Preparing students to use AI responsibly and promoting AI literacy are the foundation of the new guidelines, Lambert said.
“One of the other benefits of going with a guideline approach is it allows us to be a lot more agile and dynamic in responding to consistent changes in technologies like generative AI,” he said.
James Purdy, English and writing studies professor and director of the university writing center, is teaching an Essential Questions seminar this semester titled “Will Generative AI Replace Writing?” The EQ class will engage students in a discussion about the constraints and ethical uses of AI, he said. Purdy is also a member of the faculty AI committee.
“What are the ways that are appropriate to use it within a context for this course, but also a framework that they can take into other writing situations and learning situations to make informed decisions,” he said.
In his classroom, he promotes AI as a “bookend” tool.
“At the beginning, for like invention or brainstorming, and at the end for proofreading and editing, rather than for like drafting a text, or for asking it to write something or summarize something for a writer,” he said. “The idea, of course, is that generative AI will augment learning, rather than replace or outsource learning.”
Athletic training professor Erica Beidler also takes a moderate approach to AI in the classroom. She uses it as a tool for research generation and writing improvement.
“It just became really apparent there’s no there’s not going to be a reality moving forward where AI doesn’t exist anymore,” she said. “So for me, it was the decision as an educator to embrace it, because it’s not going away, and how can we leverage it for good.”
Though the approach to AI in the classroom differs among faculty and courses, the general sentiment is to open a dialogue, said Purdy.
“There’s [not] only one way of thinking about it, other than it’s important to be responsible in articulating clear expectations for students and in modeling best practices from an instructor perspective,” he said. “Rather than kind of ignoring the advent of generative AI and pretending that it’s not there, we need to engage it responsibly and ethically … we need to be very clear for students, the context in which it’s appropriate and not, and that that can differ by different fields.”
Students can read the new guidelines on the Artificial Intelligence at Duquesne page of the university’s website.