Instructional Resources

Although the technology and conversations around the impacts of Artificial Intelligence (AI) are evolving rapidly, it is already clear that AI will have a significant impact on educational institutions, along with many other aspects of society. For instructors navigating this new space, there is a lot to know, with questions that range from AI and Academic Integrity (link includes an example syllabus statement) to Privacy Policies and Ethics. A recent article in the Washington Post (August 13, 2023) summed up the mix of fear about cheating and excitement about new opportunities being felt by instructors across the country.

For the Fall 2023 semester, UNM has some AI detection capabilities built into Turnitin Similarity (a plagiarism detection tool that is integrated with UNM Canvas). However, you should be aware that AI detection tools can generate false positives, and ubiquitous access to AI is not going away. Setting clear expectations with your class on when and how to ethically incorporate AI into their studies is an excellent best practice, and you may want to look at this crowd-sourced list of AI Classroom policies to help you identify the right approach for your class. The UNM LibGuide also has some great suggestions for thinking through how you structure assignments in the context of ubiquitous AI. This library of Creative Commons-licensed syllabus icons also provides an interesting way to frame your thinking around how AI might be used in your course. If you do allow the use of AI, make sure your students know how to properly cite it.

Not sure where to start? Try this Research Guide of materials curated by UNM Libraries, or check out the training and workshops we've identified.

Below is a recording of the recent College of Arts & Sciences event: Teaching and Learning with AI, with José Antonio Bowen.

Teaching and Learning with AI


We're not alone

In April 2023, WCET (the WICHE Cooperative for Educational Technologies) undertook a survey of how and why institutions are leveraging AI in education. The findings indicate that the use of AI to support teaching and learning is in its infancy and that there is not yet a centralized response on many college campuses. If you'd like to learn more or participate in a community of practice around this topic, we'd love to have you join us.

A note about FERPA

Under the Family Educational Rights and Privacy Act of 1974 (FERPA), students have the right to inspect and review most education records maintained about them by the University of New Mexico and, in many cases, to decide whether a third party can obtain information from those records. Many online services that have not been licensed by UNM or reviewed by UNM's Information Security and Privacy Office may have terms and conditions that do not protect the privacy rights associated with educational records. For this reason, it is important that students and instructors understand what limitations on privacy they accept when they use online services, including AI engines. It is also always a good idea to consult with UNM's Data Stewards before sharing UNM data with external services. Students cannot be required to waive their FERPA rights to complete an assignment, and faculty and staff should avoid submitting information that could include private student information to services that have not been reviewed and secured for that purpose by the University of New Mexico.


Addressing AI Use in the Classroom Mindfully: How to approach students about AI misconduct 

It is important to develop and communicate a clear policy on the use of AI in coursework, including examples of acceptable and unacceptable uses. Here are some ways you can incorporate the policy into your course:  

  • Provide students with a syllabus statement outlining the policy on AI and academic integrity (e.g., AI and Academic Integrity, which includes an example syllabus statement). You can also share a separate document with your stance on Generative AI that outlines all the details of how AI may be used in your course. Dr. Vanessa Svihla, from the OILS department, has generously shared an example here. 
  • Community agreements are another way to provide a clear policy for students. Instructors can work collaboratively with students to create a policy for the course, and involving students in creating the policy can foster a sense of ownership. Instructors have used collaborative tools such as Canvas Discussion Boards, Office 365 Shared Docs, and editable Canvas Pages for students to share their input. Dr. Natalie Kubasek, Adjunct English Faculty at UNM Valencia, has shared one of her course’s community agreements and the assignment the students use to arrive at the final agreements and policies for the class: Community Agreements Example - Dr. Natalie Kubasek 
    • An example statement for a community agreement: "We agree to use AI ethically and responsibly, ensuring it enhances rather than replaces our learning efforts."  
  • Regularly discuss the ethical use of AI and its implications in academic work.

  • Use real-life scenarios to illustrate appropriate and inappropriate uses of AI. For example: 

    • “If you use an AI tool to brainstorm ideas for your essay, that's fine. However, using an AI to write your essay without any input from you is not acceptable.”  

  • Foster an environment where students feel comfortable discussing their use of AI tools. Reassure them that questions and concerns about AI use are welcome. 
    • “I want to talk about how we use AI tools in our coursework. AI can be incredibly helpful, but we need to use it in ways that support our learning. If you're ever unsure about whether your use of AI is appropriate, please come and talk to me.” 

    •  “Let’s discuss some scenarios where AI might be used. For instance, using AI for grammar checking is great, but generating entire paragraphs probably crosses a line. What do you think?” 

    • “How do you feel about using AI tools in your assignments? Do you have any concerns or questions about what’s acceptable?” 
  • Explain how academic dishonesty can hinder students’ learning and their professional development.  
    • “Relying too heavily on AI can prevent you from developing the critical thinking and problem-solving skills that are essential in your field.” 
  • Discuss potential consequences, including damage to students’ future reputation and career prospects. Explain in your own disciplinary terms how limiting it will be for students if their work is only to the standard of what AI can produce. This may limit their ability to compete for the positions they want or to demonstrate their professional value for advanced opportunities. 
    • “Using AI to do your work for you can seem like a shortcut, but it shortchanges your learning. Let's talk about how it might impact your skills in the long run.” 
    • “Think about the skills employers are looking for. If you're only using what AI delivers to your queries, it could hurt your job prospects. Let’s find ways to use AI that enhance rather than replace your work.” 

If you have a clear AI policy for your course and suspect that a student has used AI in a way that is out of line with your expectations, use the same practices you would to address any academic integrity concern. 

  • Gather evidence of suspected AI misuse thoughtfully and objectively. 
  • Schedule a meeting with the student, or send a direct message or email, to discuss the assignment, focusing on specific passages or elements of concern. 
  • Approach the discussion with a supportive attitude, aiming to understand the student's perspective, with the end goal of helping them learn. Avoid accusatory statements. Prompt students to explain their reasoning and understanding of the assignment in a controlled environment. 
    • “I noticed some parts of your assignment that seem different from your usual work. Can you walk me through your process for this assignment?”
  • Ask the student to explain their understanding and reasoning behind their work. 
    • “I have some questions about specific parts of your assignment. Could you explain how you arrived at these points? This will help me understand your process better.” 
    • “It seems like there might have been some misunderstanding about how to use AI tools for this assignment. Let’s talk about what happened and how we can address it.” 
  • Students may not get it right the first time. Students are also still learning how AI fits within their academic journey, so they may not have misused it purposefully. 
  • Frame the conversation around the learning goals of the assignment and how improper use of AI may undermine these objectives. 
    • “The goal of this assignment is to help you develop your analytical skills. Therefore, using AI improperly can hinder your learning since you will not have a chance to analyze the issue yourself.” 
  • Offer guidance on how to properly integrate AI tools to enhance learning rather than replace critical thinking and originality.  
    • “AI can be a great assistant, but it's important to make sure your work reflects your own thinking. Tools like Grammarly can help you refine your writing without taking over the process.” 
    • “I understand that integrating AI into your work can be tricky. Let’s go over how you can use these tools to support your learning while maintaining your own voice in your assignments.” 
    • “If you’re struggling with the balance of using AI, we can look at some tools together that can help with editing and refining your work without taking away from your originality.” 
    • “Remember, AI should enhance your learning, not replace it. Let’s talk about how you can use these tools effectively while ensuring your work remains your own.” 

By implementing these actionable steps and using these scripts and conversation starters, instructors can effectively address AI misconduct while fostering a positive and ethical learning environment. 

Instructors have found that rethinking some of their assignment prompts and structures helps curb the overuse of AI by students. Here are some examples of how faculty can structure the use of AI in their classrooms, along with examples of assignments that are difficult for AI to replicate: 

  • Ask ChatGPT to answer your assignment or discussion prompt, then review the generated responses with your students to identify inconsistencies and errors. This exercise not only serves as a learning tool for students but also helps you understand the nature of AI-generated responses. Additionally, it allows you to adjust your assignment instructions to make it more challenging for AI to generate responses. 
  • Another alternative is to ask students to submit rough drafts of their work as part of the writing process, or to submit smaller segments of larger projects as they work on them. This gives you, as the instructor, a way to see, evaluate, and provide feedback on in-progress work, and it creates opportunities to incorporate peer review and collaboration. 

  • Requiring students to use and cite specific sources or data sets can be a useful way to make assignments difficult for AI to answer accurately.  

  • Asking students to write reflection pieces that connect the content to their individual experiences, as well as incorporating peer review and collaboration, are effective strategies to limit AI use. Since AI cannot authentically reflect on individual experiences, these methods help ensure genuine student engagement. 

  • Ask ChatGPT to help refine your assignment instructions and prompts. Here are some helpful ChatGPT prompt examples compiled by the University of Michigan-Flint: Assessments in the AI Era 
  • Lang, J. M. (2013). Cheating lessons: Learning from academic dishonesty. Harvard University Press.