AI & Academic Assignments (Faculty)

Generative AI is a digital technology that creates and alters content based on existing data. It is a rapidly changing tool that can help or hinder learning. Understanding how to use AI tools ethically and effectively is part of developing college-level digital literacy, and this process may vary by discipline. This resource, developed in conjunction with writing tutors and faculty from a range of academic backgrounds, provides general considerations about learning, policy, and practices related to AI and academic assignments. As faculty, however, always tailor assignments to your students and academic contexts.

Determine the Learning Purpose

When considering whether to allow, encourage, or require students to use AI on an assignment, make sure the use of AI will not cause students to bypass learning important concepts or skills.

  • Skills & Knowledge—What skills and knowledge could students develop by completing the assignment without AI? Will AI enhance or inhibit critical thinking and creativity?
  • Assignment Context—What is the role of this assignment? What is it building on?
  • Individual Learning Goals—How might AI use further a student’s personal or professional learning goals or processes?

Consider the Larger Picture or Learning Process

Learning typically requires time, engagement, and effort, so be mindful of how AI may impact these aspects of students’ learning processes or educational experiences.

  • Time—Do not assume that AI use will always save time. If AI allows students to move forward quickly without learning essential concepts or skills, when will they engage in that learning?
  • Learning Processes—The learning process is unique to each student. Consider how individual students learn so that AI use does not skip steps in their personal learning processes.
  • Proficiency—Mastering a skill or building knowledge and proficiency takes practice and problem solving. Encourage students to use AI to engage with learning, not avoid it.
  • Limitations—AI is not well suited to every task. It should be used for learning only when it is the right tool for the job.
  • AI Literacy—Using AI effectively requires its own learning process. The time and effort needed to develop that skillset may or may not overlap with course assignments and goals.

Use Relevant AI Policies

When developing or revising an assignment, state the policies related to both AI use and academic integrity. Note that AI policies are specific to different academic contexts and may conflict with each other.

  • Assignment Policy—Establish guidelines so students know how they may or may not use AI.
  • Course Policy—Outline general course guidelines for AI use.
  • Department Policy—Be consistent with AI policy from your department or program.
  • University Policy—Understand how university-level AI policies guide policy and practice across campus.
  • Platform Policy—Be aware of and comply with individual AI platform policies.
  • Disciplinary Conventions—Learn the expectations for AI use within your academic discipline.

Establish the Degree of AI Use

Not all uses of AI tools are the same. Establish and model the degree of AI use for students.

  • AI-Assisted—AI is used to brainstorm or plan, but AI content is not included in the assignment.
  • AI-Guided—AI is used to refine original content, including guiding revising and editing processes.
  • AI-Collaborative—AI is used frequently to prompt or expand ideas. The student’s original argument, voice, and direction are still prominent.
  • AI-Expanded—AI is used to develop more extensive notes or full paragraphs based on a prompt, assignment, or outline provided by a student.
  • AI-Generated—AI is used to generate full paragraphs, sections, or papers with little or no revision by the student. Without clear faculty permission or assignment requirements, this level of AI use by students may be seen as academic dishonesty.

Guide Students in Checking AI Output

AI is a tool that cannot think or understand, so faculty should instruct students on how to check the accuracy and determine the value and effectiveness of content created by AI. Because students are responsible for their work, help them develop the ability to analyze, evaluate, and revise AI output.

  • Accuracy—AI is not always accurate and can fabricate information (hallucinations).
  • Bias—AI is often biased because it is trained on large samples of online data containing bias.
  • Tone—AI output may have a tone of expertise and still be inaccurate.
  • Revision—Rather than accepting AI output as-is, encourage students to keep refining their prompts and revising the results.

Teach How to Cite or Acknowledge AI Use

Faculty should teach students to cite or acknowledge AI use in appropriate and relevant ways.

  • Disciplinary Norms—Different disciplines (e.g., business, history, math) have different norms for acknowledging AI use. In addition to establishing policy, make these norms explicit.
  • Citation Styles—Assign the citation style appropriate for the task, course, or discipline when asking students to cite content created with or by AI.
  • Acknowledgment—Encourage students to note or acknowledge AI use even if they are not using AI-generated content word-for-word.

Continue Learning Best Practices for AI Use

Learning how to use AI ethically and effectively is an ongoing process. These guidelines, along with practice and experience, provide a starting point for developing best practices for student AI use.

  • Understanding Purpose—AI output is only as good as a student’s prompt. Developing students’ understanding of assignment content and purpose is necessary for effective prompt engineering.
  • Disciplinary Evolution—AI will continue to evolve, so stay current on how AI is used in your field.
  • Ethics—Be aware of the big picture and how AI use impacts labor, privacy, the environment, etc.
  • Digital Divides—Not all students have the same access to AI tools. Avoid assumptions about students’ experience with or understanding of AI.
  • Critical Thinking—Continue thinking critically about AI and student learning as you develop or revise assignments.