Generative AI is a digital technology that creates and alters content from existing data. AI is a rapidly changing tool that can help or hinder learning. Understanding how to use AI tools ethically and effectively is part of developing college-level digital literacy, and this process may vary by discipline. This resource, developed in conjunction with writing tutors and faculty from a range of academic backgrounds, provides general considerations about learning, policy, and practices related to AI and academic assignments. However, faculty should always tailor assignments to their students and academic contexts.
When considering whether to allow, encourage, or require students to use AI on an assignment, make sure the use of AI will not cause students to bypass learning important concepts or skills.
Learning typically requires time, engagement, and effort, so be mindful of how AI may impact these aspects of students’ learning processes or educational experiences.
When developing or revising an assignment, clearly state the policies for both AI use and academic integrity. Note that AI policies vary across academic contexts and may conflict with one another.
Not all uses of AI tools are the same. Establish the degree of AI use that is acceptable for each assignment and model appropriate use for students.
AI is a tool that cannot think or understand, so faculty should instruct students on how to check the accuracy of AI-generated content and determine its value and effectiveness. Faculty should help develop students’ ability to analyze, evaluate, and revise AI output since students are responsible for their work.
Faculty should teach students to cite or acknowledge AI use in appropriate and relevant ways.
Learning how to use AI ethically and effectively is an ongoing process. These guidelines, along with practice and experience, provide a starting point for developing best practices for student AI use.