Gillette College develops rules for student AI use

GILLETTE — Gillette College is in the process of approving new language to give students and teachers more direction on artificial intelligence, both as it relates to academic dishonesty and its potential uses in the classroom.

The additions to the college’s academic dishonesty policy and classroom syllabi were developed by the college’s AI task force, which was put together this spring.

The task force, led by Dean of Letters and Sciences Martin Fashbaugh, has been working throughout the summer on a slate of broad goals aimed at developing the college's game plan for when and how it should use AI.

While the language hasn’t received final approval, the additions to the academic dishonesty policy will define the college’s parameters for what it means to use AI to cheat.

“Because AI is such a common avenue for students who are engaging in academic dishonesty … we thought we needed a clear statement,” Fashbaugh said.

Besides deciding if and when to prohibit the use of AI, the task force is also looking at ways to incorporate instruction on using AI into the school’s curriculum. The task force will have a list of AI assignment ideas for instructors by the end of this fall, as well as AI certification ideas by the end of spring next year.

Fashbaugh said statistics show employers are significantly more interested in job candidates who can effectively use AI to research, write, and analyze data, and that the college is in a good position to meet that demand early.

Gillette College would be one of the first community colleges in Wyoming to offer any kind of certification in AI.

Direction on if and when students can use AI is still up in the air. Fashbaugh said classroom use will likely depend on the goals of each class and the skills students are expected to learn from course to course.

“If it’s a writing course that’s for business, and preparing students to enter industry, and the course outcomes suggest that some utilization of AI technology is acceptable, great,” Fashbaugh said. “If you’re in a composition class … and you’re trying to get students to practice invention … to develop authentic arguments while using their own voice, that’s a different scenario.”

He recommended Turnitin's AI detection feature as a tool teachers can use to identify AI use in classwork, but warned the program isn't immune to false positives.

The task force considered three versions of the AI policy language to be added to classroom syllabi: prohibit the use of AI by students, allow its use in limited circumstances, or permit students to use it freely.

The task force settled on a fourth option. The policy will direct students to ask their professors for specifics about how and when it’s appropriate to use AI in their classrooms, and instructors are free to add to the statement on their syllabi. The new language is currently under review.

“The statement acknowledges that AI is used in ethical and unethical ways,” Fashbaugh said. “(It) can really prompt instructors to add instruction on how it's used, or not at all.”

Fashbaugh said the growing role of AI in higher education was brought to his attention when he attended the Higher Learning Commission's annual conference with Gillette College president Janell Oberlander, where there were several panels on AI use in higher education.

“Going to any academic conference, any administrative conference, you’ll find a lot of panels on AI,” Fashbaugh said. “(It’s) important that we do our best to train our students to, in an ethical manner, use AI tools.”
