
AI: Guidelines for SIUC Faculty

A very brief introduction to AI and AI-powered tools for teaching and learning.

In July 2024, the SIUC Faculty Senate adopted a set of guidelines addressing AI use in teaching and research. The guidelines are general by design, allowing individual faculty members to choose the tools and policies best suited to their students and courses.

There are three major action items in the guidelines:

  1. Be Transparent about Costs to Students
  2. Include a Syllabus Statement on AI Use
  3. Educate Yourself on the Potential Harms of AI Requirements or Bans

What follows are resources to help you tackle these before the start of your course.

1. Be Transparent about Costs to Students

Instructors should proactively investigate the accessibility and cost of AI tools before requiring students to use them, and should follow the University IT procurement process (https://procurement.siu.edu/how-to/categories/compsoft.php) before requiring any AI tool.

To this end, instructors are encouraged to note any required AI tools in their course descriptions and syllabi, including the monetary cost to students.

This professor's opinion: Keep in mind that free versions of AI tools often have very limited feature sets compared to premium versions. Before requiring a tool, make sure its free version offers enough usage time and features for students to complete the necessary work, with an ample margin of safety. It's also important to remember that most AI tools require subscriptions rather than a one-time purchase, so their cost will vary with the length of time they'll be needed. Many vendors offer per-month discounts for longer subscriptions; while the per-month cost may be lower, the total cost of the subscription paid all at once may be more than a student can afford. Your students will appreciate it if you do the math to find them the best deal for the situation.
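For instance, using hypothetical prices purely for illustration: suppose a tool costs $20 per month billed month-to-month, or $180 for a year paid up front (about $15 per month). For a 16-week semester, four monthly payments come to $80, less than half the annual price, even though the annual plan's per-month rate is lower. The annual plan only pays off if the tool will be used for more than nine months ($180 ÷ $20 per month).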

2. Include a Syllabus Statement on AI Use

Instructors are encouraged to include clear guidelines regarding the citation and acknowledgment of AI-generated content in academic submissions in their course syllabi. It is vital that students understand how to use AI tools responsibly and recognize the boundaries of legitimate use from the start of a course. Emphasis should be placed on the importance of originality and honesty in using AI-generated content, reinforcing our institution's policies on academic integrity and honesty.

Below are some examples of syllabus statements about AI usage in university courses and the processes behind developing some of them. A search of the open web for "example ai syllabus statements" will lead you to more.


This professor's opinion: Complete bans on AI in courses run the risk of making work unacceptably difficult for conscientious students or encouraging them to dissemble about their AI use. AI is becoming an integral part of the software tools necessary for research. It's in open web search engines (Google, Bing), voice assistants on mobile devices (Siri), speech-to-text and text-to-speech applications for people needing accommodations, writing software (Word, the Grammarly app), online translators (Google Translate, DeepL), and soon even the operating systems powering computers (Microsoft Copilot). Even library-provided databases and the venerable library catalog now offer AI search assistants or are working to add them. A complete ban disallows all of these uses, so it may be wise to carve out limited exceptions or to limit the ban to AI-generated text or images.

3. Educate Yourself on the Potential Harms of AI Requirements or Bans

The Faculty Senate advocates for a culture where ethical considerations such as transparency, accountability, discrimination, bias, and safety are at the forefront of AI use in academic settings.

We should commit as an institution to ensuring that all AI technologies are accessible to every member of our community, including those with disabilities. This involves adopting AI tools that comply with accessibility standards and providing necessary accommodations to support their use.

This professor's opinion: All AI models have biases built in. Know what they are before requiring use of a tool. See the infographic at the bottom of this page for some general categories of known issues with AI tools; a search of the web with a general search engine or something like Google Scholar will surface research on known issues with specific tools. Discuss the known issues, and any your class discovers, with your students. Pay special attention to what a tool does with the data it's fed, especially with text-based tools. Does that data automatically go into the training pool, where it could be spat back out to other users? That's a major problem if you or your students are inputting personal or private information. Be sure to warn students about submitting personal information, even if the AI tool promises to safeguard inputs.

An absolute ban cuts students off from technologies that may help them succeed in your course, especially students requiring learning accommodations. AI underpins a growing number of assistive technologies on which students may rely. Create a general exception for assistive technologies, or include a statement that specific exceptions will be made for students requiring accommodations.

More on Potential Harms

Attribution and Sharing

AI Introduction and Tools by Cassie Wagner is licensed under CC BY 4.0