How can I help students use AI to support their learning?
AI seems to be everywhere in our current moment, and in higher education we encounter an avalanche of suggestions about how to use artificial intelligence in our courses. In this guide, we cut through the confusion to focus on three sound pedagogical strategies for productively and thoughtfully integrating AI into assignments and activities as a kind of partner in the learning process. We explore how instructors and students can use AI as a thinking partner, a quizzing partner, and a fallible partner. These strategic partnerships, however, are designed to help students develop their own thinking and critically reflect on their experiences using AI.
For each strategy, you will first read examples from different disciplines, followed by a short analysis of the roles that the instructor and the students play in those examples. We then invite you to respond to reflection questions about how you might use the strategy to help students productively use AI in your own assignments or activities. We encourage you to stop and respond to these questions in writing as you work through this guide.
Strategy 1: Using AI as a thinking partner
When students are working on assignments and activities for our courses, slowing down and regulating their thinking can benefit their learning. When prompted appropriately, AI is a patient and fairly well-prepared thinking partner that students can use to monitor their thinking process and consider it more fully. This use of AI should help students internalize a mindful set of practices that allows them to develop their own thinking.
Let’s look at three examples of this use of AI.
Example: Students use AI to get started on an assignment
An assignment in a public speaking course requires students to develop a speech over a period of weeks and then deliver the speech in class. The first step is to come up with a topic for a persuasive speech. Students have read about the key elements of a persuasive speech and the key areas that a speaker should consider when choosing a topic.
Working in small groups, students use prompts based on key course concepts to ask AI to generate a list of 20 possible topics that might interest their peers. The students then workshop those topics, deciding, based on their previous readings and discussions, which ones might be of interest and which might not. Each student then draws on these conversations and their own ideas to decide which persuasive speech topic they want to continue exploring. Students spend some time writing an individual journal entry about what they think they want to speak about, why that topic will be effective, and the ways that thinking with AI helped or hindered their decision-making process. Students also write critically about which AI-suggested topics they found particularly helpful, bizarre, or indicative of the limitations and biases that can shape AI-generated text.
Example: Students use AI for suggestions
Students in a statistics course are working on an assignment that requires them to make a set of decisions to help a fictional statistician. They are required to explain their suggestions in their own words to the statistician.
The instructor tells students that as they work with concepts from the course (e.g., “degrees of freedom” or “p-value”), they can prompt AI to generate examples or explanations that will help them understand these concepts, but they should use their own words when they write up the assignment. The instructor asks them to share their conversations with AI as an appendix to the assignment, along with a brief written statement about how the conversation changed their thinking about specific concepts.
Example: Students use AI for “feedback”
Students in a marketing research course are working on a semester-long marketing research assignment where they work with a community partner. One of the early steps in this assignment is the development of survey questions.
The instructor asks students to work in small groups, drawing on their individual efforts to draft 10 questions for a client survey based on survey principles and frameworks they have read about previously. Students then prompt AI to evaluate the questions for construct reliability in relation to two key constructs and work in their groups to analyze the feedback they receive. If a suggestion is valid, they must be able to say why; if a suggestion is not useful, they must explain why they are rejecting it. Students then spend some time reflecting on the strengths and weaknesses of AI as a “thinking partner.” The annotated new list of survey questions is submitted to the instructor for feedback, along with the reflections on the use of AI at this stage of the marketing research process.
Analyzing this strategy
It’s important to note that in all these examples, the instructor and the student have essential roles in this partnership with AI. These roles are active and help ensure that the student is deeply engaged in meaningful disciplinary thinking. AI is simply predicting a sequence of words in relation to the prompts it is given; AI is never doing the work for the student.
The instructor’s role is to help students create prompts using course concepts that elicit suggestions or specific feedback from AI. The instructor also helps students decide the relative value of the suggestions or feedback using course concepts. Finally, the instructor helps students make changes to their work or thinking as a result of the “conversation” they are having with AI about their thinking.
The students’ role is to make changes to their thinking, planning, writing, and so on by drawing on course concepts and the experiences they had with AI. The students also critically reflect on the contributions of AI to their thinking.
Reflection: How could you use this strategy?
Now that you’ve read and explored these examples, make some notes in response to these questions:
- What ideas for using AI as a thinking partner in the examples look potentially helpful for your students?
- In what other ways might your students use AI as a thinking partner as they work on activities or assignments in your course?
- What would you want your students to do with the suggestions or feedback AI gives them?
- How could you ask your students to reflect on the experience they had with AI as a thinking partner?
Strategy 2: Using AI as a quizzing partner
When students have completed readings, reviewed their notes, or participated in a class meeting, they often assume that they have learned. But the learning really comes when students must retrieve, explore, and use the content they’ve encountered. When prompted appropriately, AI is a patient and fairly well-prepared partner that students can use to test or quiz their emerging abilities in our courses. This use of AI helps students see more clearly what they do and don’t understand; it also helps students discover the value of self-quizzing. As your students prompt AI to quiz them, they also internalize a mindful set of practices that allows them to think about what they are learning and how to develop their thinking.
Let’s look at three examples of this use of AI.
Example: Students use AI to prepare for a test
Students in a phonetics course have a test coming up on the phonetic characteristics of consonants. The instructor gives them the following prompt as a model of how they might ask AI to quiz them on their knowledge of the test material.
“I am an undergraduate student taking a course in phonetics and we have a test coming up on phonetic characteristics of consonants. Ask me questions about this topic to help me see what I understand and where I may need to study more. Ask me one question at a time and make each question short answer or multiple choice. After I answer a question, give me feedback on my answer, then ask another question. When I get a question wrong, make the next question a little easier. When I get a question right, make the next question a little harder. Keep asking me questions one by one until I ask to stop. When I ask you to stop, give me a summary of which topics I need to study more.”
After this experience, students write a brief summary of what new learning took place during the quizzing session and how they can use AI in the future as a quizzing partner.
Example: Students use AI to practice key skills in a course
Students in an art history class are focusing on the post-impressionists. The instructor realizes that students need a lot more practice in making finer discriminations between works from this period in terms of form and content. The instructor gives students the following prompt as a model of how they might ask AI to quiz them:
“I am an undergraduate student taking an introduction to art history class. I would like you to quiz me. I want to develop my ability to compare and contrast the styles and the work of post-impressionist artists. Please give me pairs of paintings (show me the images) and ask me difficult questions that require me to compare and contrast them in relation to both the technique the artists use as well as the subject matter of the paintings. Keep giving me pairs of paintings until I ask to stop. When I ask you to stop, give me a summary of what aspects of post-impressionism I understand well and what aspects I have confusion about.”
After this experience, students write a brief summary of what new learning took place during the quizzing session and how they can use AI in the future as a quizzing partner.
Example: Students use AI to quiz them about a reading
Students in an ecology course are learning to analyze research articles. The instructor gives them the following prompt as a model of how they might ask AI to quiz them on a reading:
“I am an undergraduate ecology student and I am trying to improve my research analysis skills. I read this article [insert link or file]. Please ask me questions about the methods, findings, and limitations in this article. Ask me one question at a time and make each question short answer or multiple choice. After I answer a question, give me feedback on my answer, then ask another question. When I get a question wrong, make the next question a little easier. When I get a question right, make the next question a little harder to go beyond memorization and comprehension to evaluation and synthesis. Keep asking me questions one by one until I ask to stop. When I ask you to stop, give me a summary of which parts of the article I understand best and which parts I have confusion about.”
After this experience, students write a brief summary of what new learning took place during the quizzing session and how they can use AI in the future as a quizzing partner.
Analyzing this strategy
Note again that in all these examples, the instructor and the student have essential roles in this partnership with AI. These roles are active and help ensure that the student is deeply engaged in meaningful disciplinary thinking. AI is simply predicting a sequence of words in relation to the prompts it is given; AI is never doing the work for the student.
The instructor’s role is to help students create prompts using course concepts that elicit the right level of quizzing from AI.
The students’ role is to evaluate their own progress using course concepts, make a plan for next steps in learning, and critically reflect on the experience of using AI as a quizzing partner.
Reflection: How could you use this strategy?
Now that you’ve read and explored these examples, make some notes in response to these questions:
- What ideas for using AI as a quizzing partner in the examples look potentially helpful for your students?
- In what other ways might your students use AI as a quizzing partner to practice key skills or prepare for assessments in your course?
- What would you want your students to do with the feedback AI provides about their performance?
- How could you ask them to reflect on the experience they had with AI as a quizzing partner?
Strategy 3: Investigating AI as a fallible partner
AI may be a tool used in your field, and it may be important for your students to learn how to critically use and evaluate work generated by AI. Students need practice reflecting on, critiquing, and discovering the limits and biases of a large language model. This use of AI should help students develop habits of critical thinking about AI.
Let’s look at three examples.
Example: Students critique AI-generated work in a social sciences discipline
Students in an education course prompt AI to create lesson plans for the grade level they intend to teach. They then evaluate those lesson plans using course principles such as culturally responsive pedagogical practices, skills differentiation, or alignment with key standards. Students formalize their critique in a short essay in which they also describe what they have learned about the course principles they applied, thoughtfully consider the limitations of AI in the discipline of education, and make plans for how they will use AI in their field in the future.
Example: Students critique AI-generated work in a humanities discipline
Students in a sculpture class are learning about installation art. They prompt AI to design a museum space and an installation within that space. They then evaluate the installation space generated by AI using aesthetic, cultural, and critical theories they are studying in their course. Students formalize their evaluation in writing, describing their new understandings of both course concepts and the limitations and potential of AI for artists and for those who design spaces for them.
Example: Students critique AI-generated work in a health care-related discipline
Students in a public health course learn about a commercial algorithm that is used to make health care decisions in the U.S. health care system. They then estimate the level of bias that might be built into the system, share ideas with each other, and reach consensus on that estimate. The instructor debriefs their ideas and then reveals the findings of one study of the system. The abstract is provided here:
The U.S. health care system uses commercial algorithms to guide health decisions. Obermeyer et al. find evidence of racial bias in one widely used algorithm, such that Black patients assigned the same level of risk by the algorithm are [in reality] sicker than White patients. The authors estimated that this racial bias reduces the number of Black patients identified for extra care by more than half. Bias occurs because the algorithm uses health costs as a proxy for health needs. Less money is spent on Black patients who have the same level of need, and the algorithm thus falsely concludes that Black patients are healthier than equally sick White patients. Reformulating the algorithm so that it no longer uses costs as a proxy for needs eliminates the racial bias in predicting who needs extra care. (Obermeyer et al., 2019)
Students then work to determine how this bias might have come about and how the field of public health can put precautions in place when training AI to work with large data sets.
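To make the mechanism concrete, here is a minimal simulation sketch in Python, using made-up numbers rather than the study’s data: when an algorithm ranks patients by cost, and less is spent on one group at the same level of need, that group is systematically under-flagged for extra care.

```python
# Minimal sketch with hypothetical numbers (not the Obermeyer et al. data):
# ranking patients by observed cost under-flags a group for whom less is
# spent at the same level of true health need.
import random

random.seed(0)

def simulate_patient(spending_rate):
    """Return (true_need, observed_cost) for one simulated patient."""
    need = random.uniform(0, 10)   # true need; same distribution for both groups
    cost = need * spending_rate    # observed cost reflects spending, not need
    return need, cost

# Assumption: only 60% as much is spent on group B patients with the same need.
patients = [("A", *simulate_patient(1.0)) for _ in range(1000)]
patients += [("B", *simulate_patient(0.6)) for _ in range(1000)]

# The "algorithm" flags the 10% highest-cost patients for extra care.
patients.sort(key=lambda p: p[2], reverse=True)
flagged = patients[:200]

share_b = sum(1 for group, _, _ in flagged if group == "B") / len(flagged)
print(f"Group B share of flagged patients: {share_b:.0%}")
# Prints a share far below the 50% that true need alone would predict.
```

Students could experiment with a toy model like this one, for example by ranking on `need` instead of `cost`, to see how the choice of proxy drives the bias.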
Analyzing this strategy
Again we can see that in all these examples, the instructor and the student have essential roles in this partnership with AI. These roles are active and help ensure that the student is deeply engaged in meaningful disciplinary thinking. AI is simply predicting a sequence of words in relation to the prompts it is given; AI is never doing the work for the student.
The instructor’s role is to require students to prompt AI to generate work in ways that are aligned with its use in the discipline. In addition, the instructor structures the students’ thoughtful reflection and critique of the work AI generates.
The students’ role is to critique and evaluate the work AI generates. In addition, students thoughtfully critique the role of AI in the field and in practice, and they make plans to use AI effectively, ethically, and critically in the future.
Reflection: How could you use this strategy?
Now that you’ve read and explored these examples, make some notes in response to these questions:
- What ideas for investigating AI as a fallible partner in the examples look potentially helpful for your students?
- How is AI used in your discipline, in the field, or in practice? What kind of work does AI produce that your students could investigate?
- How might you have students evaluate or critique the work that AI generates?
- How could you ask them to reflect on and make use of the experience they had with AI as a fallible partner?
When using AI is unproductive for our students
Now that you’ve explored three pedagogically productive strategies for using AI to support student learning, we pause to consider uses of AI that are not productive. AI creates a combination of excitement and worry for instructors, so it is easy to get overwhelmed by the many ideas we encounter about teaching with AI. But as with any technology that instructors encounter, it is good to slow down and ask, “Will this help my students do the learning that I want them to do?” As you think about that question, you might decide that AI is appropriate in a course you teach, that it isn’t, or that it might be appropriate for one assignment in your course (e.g., one in which you want students to use AI to generate code because they need to learn how to review and use AI-generated code in your course) but inappropriate for another assignment (e.g., one in which you want students to know how to review lines of code without AI assistance).
Deciding if and how to use AI effectively as an instructor takes some time. But you can avoid wasting that time by recognizing that it is unproductive for students to use AI when doing so robs them of the opportunity to learn a skill. Let’s take a look at some examples of unproductive uses of AI and the problems behind them.
Example: AI “de-skilling” students by doing work for them
An instructor wants students to make decisions about the primary sources they will draw on to create their history projects and to find themes in those sources to help focus their interpretations. She tells students to use AI to find five relevant primary sources for a history paper and to create an annotated bibliography of those sources. She also tells students to report the prompts they used when they turn in their annotated bibliographies. Her rationale is that historians can use AI to do this work.
The problem: If the instructor wants students to learn the skills of choosing and annotating sources, the students must build those skills themselves. Offloading this work to AI means she is “de-skilling” her students. Historians might use AI to do some of this work, but they would also critically explore what AI generates. Rather than having students use AI to do the work for them, the instructor could have students review and critique the work AI does.
Example: AI “de-skilling” students by simplifying work for them
An instructor in an introductory genetics course assigns a research paper each week that aligns with key concepts and topics in order to demonstrate their relevance. She wants students to know about this research, but she feels that freshmen are not prepared to read and understand the papers, so she tells students to feed the papers to AI and have it summarize them.
The problem: If the instructor wants AI to do this reading for her students, then reading the research may not be a necessary part of her course; offloading the work to AI signals that the work itself is unnecessary. The instructor should reconsider what students can do with the summaries AI produces, or reconsider the assignment itself.
Example: Students doing work with AI that is entertaining but pedagogically unsound
An instructor wants students to learn to identify, create, and use figures of speech to good effect in public speaking. He has students prompt AI to create twenty novel metaphors and asks them to identify the most unusual and unexpected ones.
The problem: If the instructor wants students to create metaphors, simply seeing metaphors generated by AI will not help them. The instructor must design work that lets students develop the skills to use figures of speech themselves. There is little to no connection between this use of AI and the aims he has for his students’ learning.
References
Alby, C. (2023). ChatGPT: A must-see before the semester begins. Faculty Focus. https://www.facultyfocus.com/articles/teaching-with-technology-articles/chatgpt-a-must-see-before-the-semester-begins/
Bowen, J. A., & Watson, C. E. (2024). Teaching with AI: A Practical Guide to a New Era of Human Learning. Johns Hopkins University Press.
Bruff, D. (Host). (2024, June 4). AI’s impact on learning (No. 40) [Audio podcast episode]. In Intentional Teaching. UPCEA. https://intentionalteaching.buzzsprout.com/2069949/episodes/15157479-ai-s-impact-on-learning-with-marc-watkins
Obermeyer, Z., Powers, B., Vogeli, C., & Mullainathan, S. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science, 366(6464), 447–453. https://doi.org/10.1126/science.aax2342
Stachowiak, B. (Host). (2020, June 27). Toward a more critical framework for AI use (No. 524) [Audio podcast episode]. In Teaching in Higher Ed Podcast. Teaching in Higher Ed. https://teachinginhighered.com/podcast/toward-a-more-critical-framework-for-ai-use/