Teach with Generative AI
Resources for faculty
Overview
When it comes to the future of education, virtually no recent technology has sparked as much debate as generative AI (GenAI) and large language models (LLMs). Some have seen this technology as destructive, with some school districts initially banning its use, while others have touted its potential to change the game for educators and students alike.
Harvard has consistently tried to embrace new technology across our classrooms, residential and virtual. GenAI has been no different.
Faculty and students have access to a range of tools. Some of these tools, such as the Harvard Sandbox, are free and open to all faculty, students, and staff behind Harvard Key; other tools require a license or approval.
As our faculty and students have engaged with these technologies, we have invited faculty to reflect on questions such as the following:
- What is the challenge you were trying to address?
- How did you use generative AI tools to tackle it?
- What did you learn?
Out of this work, several lessons worth considering have emerged:
- GenAI tools have raised concerns about how they may compromise student assessments, promote academic dishonesty, and facilitate “lazy learning.” Our faculty colleagues who experimented with these tools were not oblivious to these concerns; indeed, many share them. At the same time, faculty are looking to understand how GenAI tools can enhance the educational experience and build more vibrant classrooms.
- Several colleagues are leveraging LLM features that everyone should keep in mind:
- Beyond text: Generative tools can produce more than prose, and faculty are exploring outputs beyond text alone.
- Prompt design: There’s an old saying: “garbage in, garbage out.” The output of LLMs is only as good as the input, and it’s essential to learn (and perhaps teach) how to write a prompt that works. Faculty discussions highlight the critical role of deliberate prompt formulation, including engaging students in debate about the prompts themselves. A new collection of effective prompts is available for educators.
- Interrogate hallucinations: Errors arise not just because of algorithmic or data limitations but, importantly, because LLMs are fundamentally probabilistic. Faculty have found that such errors can be reduced through careful prompt design and by checking outputs against trusted sources.
- Some consistent patterns and learnings emerge from how GenAI has been implemented for use by our colleagues:
- Going beyond the simple question-and-answer interface: Sal Khan popularized the idea of using LLMs to ask questions of a student, not just answer them. Several faculty colleagues take this further, illustrating how LLMs can be used to simulate virtually any persona and to ask anything of them.
- More than the “first draft”: GenAI needn’t compromise student creativity; in fact, it can augment it. Some colleagues are using it to help students develop and refine their own work.
- Work alongside what you already have: Many faculty used LLMs to improve different (and sometimes mundane) aspects of existing teaching and learning “workflows.”
- Identify, and overcome, hidden or invisible barriers: Students and educators sometimes confront hidden prerequisites that present barriers for teaching and learning. GenAI can assist with overcoming these skill gaps, whether foreign languages for research or art skills for building visual aids.
- Reimagining the classroom: While we’re still in the early days of GenAI, some of these examples already start to surface more profound questions: What does a class with GenAI at its core look like? Ultimately, what is the role of a teacher?
- Questions also arise around GenAI’s efficacy for learning: can we use GenAI tools (specifically tutor bots) to improve the way students learn? One faculty member created a tutor bot to study exactly this. Beyond such research on students’ interactions with GenAI tools, it might be helpful to imagine how GenAI can help you and your students now in other ways as well: by increasing task efficiency, improving student engagement, increasing their confidence, or even improving learning outcomes.
- The risks of LLMs present valid concerns. While popular debates often focus on “big picture” concerns like algorithmic biases, digital divides, and fake content, some faculty explore the risks within our classrooms, such as hallucinations or superficial thinking, to understand these issues more deeply.
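The prompt-design point above can be made concrete. The sketch below is a hypothetical illustration (not a Harvard tool): it assembles a teaching prompt from explicit parts, a role, a task, constraints, and course context, rather than leaving the model to guess from a one-line request.

```python
def build_prompt(role, task, constraints, context):
    """Assemble a structured prompt from explicit parts.

    A vague prompt ("write a quiz") leaves the model guessing;
    spelling out role, task, constraints, and context usually
    yields far more usable output.
    """
    lines = [
        f"You are {role}.",
        f"Task: {task}",
        "Constraints:",
    ]
    lines += [f"- {c}" for c in constraints]
    lines.append(f"Course context: {context}")
    return "\n".join(lines)


# Example: a structured request for quiz questions.
prompt = build_prompt(
    role="a teaching assistant for an introductory statistics course",
    task="Write three multiple-choice questions on sampling bias.",
    constraints=[
        "Each question has exactly four options and one correct answer.",
        "Target first-year undergraduates; avoid jargon.",
        "Label the correct answer on a separate line.",
    ],
    context="Students have covered random sampling but not regression.",
)
print(prompt)
```

The same checklist works when writing prompts by hand: if a colleague could not complete the task from your prompt alone, neither can the model.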
Video interviews that fed these reflections are featured here in the first Harvard GenAI Library for Teaching and Learning. And across Harvard there have been, and will continue to be, convenings large and small bringing together faculty, students, staff, and administrators. As you think about what’s relevant for your course, support exists across Harvard to help you and your team experiment as well.
Frequently asked questions
The following information offers advice for educators interested in using generative AI tools in their teaching and course preparation. As this technology is constantly evolving, this page will be updated frequently with new resources and advice.
Creating materials for courses—syllabi, lesson plans, assignments—takes time. Generative AI tools aren’t just useful at broad, general prompts, but, as our colleagues have shown, they are useful in tackling the preparations before a student even arrives in the classroom:
- Preparing to teach: Starting a syllabus or a lesson plan from a blank page is daunting. AI can help you get started while producing content that fits your course.
- Assignments: Reusing the same assignments across multiple years can be time efficient, but it creates challenges for assessments. Some faculty have explored how AI tools can help write, modify, or create question sets: given details about the structure and concepts you want covered, these tools can draft fresh questions each year.
- Course content: Classes often include large amounts of content, from case studies to readings to slideshow text. Just a few sentences of a prompt can suggest readings or videos you might want to include for students, topics to discuss, and even what your materials should cover. Some faculty have gone a step further, using their course materials to train a teaching assistant chatbot and learning from students’ interactions with this “faculty copilot.”
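One low-tech complement to the assignment ideas above: parameterizing a question template lets you regenerate fresh numeric variants each term, even before an LLM is involved. The sketch below is an illustrative example (the problem text and ranges are invented); an LLM could then be prompted to vary the wording as well.

```python
import random

def make_variant(seed):
    """Generate one numeric variant of a fixed word problem.

    Seeding makes each student's (or each year's) version
    reproducible, so grading keys can be regenerated on demand.
    """
    rng = random.Random(seed)
    n = rng.randint(30, 120)          # sample size
    mean = rng.randint(60, 80)        # sample mean
    sd = rng.choice([5, 8, 10, 12])   # standard deviation
    return (
        f"A sample of {n} students has a mean exam score of {mean} "
        f"with a standard deviation of {sd}. Construct a 95% "
        f"confidence interval for the population mean."
    )

# One variant per student ID; the same ID always gets the same problem.
print(make_variant(seed=2024))
```

Seeding by student ID or by year means the assignment changes for every cohort while remaining fully reproducible for the grader.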
How to engage your students in the classroom is an age-old question. Some faculty have used GenAI’s real-time responses and range of outputs to help offer contemporary solutions, such as a content generator, a data analyst, or a personal tutor bot.
- Activity leader: Creating interactive classes is easy to advocate for, but hard to do. Some of our faculty have used GenAI to stimulate critical thinking, to make students feel heard, or even to help build classroom activities, no coding needed!
- Personal tutor: You (or your TA) can’t be everywhere at once, but GenAI might be able to. Feeding it your syllabus, lectures, and an in-depth prompt can help make a personal tutor bot for your course.
- Custom reviewer: LLMs can be used to provide initial personalized feedback to your students, so you can focus on the big picture. Some faculty used them to give students preliminary feedback before office hours.
- Skill leveler: Classes often have hidden prerequisites: familiarity with coding, texts available only in foreign languages, or even art skills. GenAI tools can be leveraged creatively to help students overcome such barriers, like analyzing data without requiring every MBA student to learn code, or translating sources without having to do so manually.
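The “personal tutor” idea above typically boils down to a carefully built system prompt sent with every request. The sketch below is hypothetical (the actual API call is omitted; function and field names are illustrative): it combines course materials and a Socratic instruction into the message list that most chat-style LLM APIs accept.

```python
def tutor_messages(syllabus, lecture_notes, student_question):
    """Build a chat message list for a course tutor bot.

    The system message carries the course context and the tutoring
    style; the user message carries the student's question. The
    resulting list can be passed to a chat-completion API.
    """
    system = (
        "You are a patient tutor for this course. "
        "Guide the student with questions instead of giving answers outright. "
        "Use only the course materials below; say so if they don't cover a topic.\n\n"
        f"SYLLABUS:\n{syllabus}\n\nLECTURE NOTES:\n{lecture_notes}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": student_question},
    ]


msgs = tutor_messages(
    syllabus="Week 1: sampling. Week 2: estimation.",
    lecture_notes="Estimation: point estimates vs. intervals.",
    student_question="Why do we use confidence intervals?",
)
```

Because the system message travels with every request, the bot stays “in character” across a whole conversation without retraining anything.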
Since LLMs can write essays, respond to readings, and finish problem sets, how can one not be concerned about misuse? For you and your students, addressing this concern means first making sure we know what GenAI can and can’t do—then creating assessments that emphasize skills where AI tools fall short.
Here are some strategies faculty have found useful:
- Human-based learning: Design tasks and assessments that require creativity, practical application of concepts, and critical thinking. For example, instead of asking your students to summarize perspectives on a given issue, you may ask them to critically analyze which perspective is most convincing to them and explain why; to relate their answers to class discussions; or to assess their peers’ performance during a live problem-solving session.
- Process-based assessments: Another approach is to test intermediate steps in the learning process, instead of just the final product. (It’s easy to fudge your report card to your parents; it’s harder to fudge not having gone to school for the past two months.) Testing evidence of original thinking, planning, peer-to-peer conversations, etc. can make relying on genAI less attractive.
- Establishing norms: Emphasize original work and academic honesty, and at a minimum provide clear guidelines about the use of AI-generated content in assignments and assessments.
GenAI tools also carry risks. Here are the most important ones to keep in mind:
- AI models can make mistakes: We’ve all had an incident (or two!) where a GenAI tool seems to have lost its mind, yielding garbled or entirely made-up answers. These are called AI hallucinations. It’s tempting to think these will get eliminated over time as technology improves. But since LLMs are fundamentally probabilistic rather than deterministic, this may not be the case.
- AI models can be biased: AI adopts the biases of the material and data it was trained on. Good AI use involves being aware of, checking for, and making efforts to correct such biases.
- AI models can violate privacy: AI is very good at doing what you want, but it is also very bad at knowing whether what you want it to do is allowed. Personal data is not supposed to be fed to GenAI models. Make sure you are aware of Harvard’s relevant information security and privacy policies.
- AI models can be misused: Of course, AI could be used to plagiarize assignments. Unless you are interested in grading robots, you should shift the kind of assignments you are providing students (see above) as well as enforce academic honesty policies.
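The hallucination risk above suggests a simple classroom-grade mitigation: because LLM output is probabilistic, asking the same factual question several times and keeping only answers that agree can filter out one-off fabrications. A minimal sketch, with a stand-in sampler in place of a real model call:

```python
from collections import Counter

def consensus_answer(ask_model, question, n=5, threshold=0.6):
    """Ask the same question n times and accept only a majority answer.

    ask_model: any callable that queries an LLM and returns a string.
    Returns the majority answer, or None when no answer is frequent
    enough to trust, which is a cue to verify by hand.
    """
    answers = [ask_model(question) for _ in range(n)]
    top, count = Counter(answers).most_common(1)[0]
    return top if count / n >= threshold else None


# Stand-in "model": mostly consistent, with one stray answer.
fake_replies = iter(["1969", "1969", "1970", "1969", "1969"])
result = consensus_answer(lambda q: next(fake_replies),
                          "When did Apollo 11 land on the Moon?")
print(result)  # "1969" appears 4/5 times, above the 0.6 threshold
```

Agreement across samples is not proof of correctness (a model can be confidently wrong five times in a row), so this is a screen, not a substitute for checking sources.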
GenAI’s ability to meet learners where they are, both in terms of prior knowledge and learning progress, is promising, as are AI-powered educational games, particularly in STEM courses. One fascinating study pitted students tutored by AI against students taught in the traditional classroom setting to compare learning outcomes.
It is natural for instructors, particularly successful ones, to wonder: GenAI may be useful for the average educator. But my classes are great; why would I need it?
One way to think about this is in terms of the efficiency benefits of GenAI tools—they can save time, facilitate meaningful non-classroom learning experiences, and make classroom discussions more interactive. For example:
- Use GenAI to build a tool that gives students an unlimited number of interactive statistical problems.
- Challenge students with DIY exercises created in short order, and without any coding prerequisite!
- Empower students to experiment with image generation with just a few minutes of prompt engineering in DALL-E.
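The “unlimited interactive statistical problems” idea above rests on pairing each generated problem with a machine-checkable answer. The sketch below is an invented illustration of that pattern (problem type and tolerance are assumptions, not a description of any specific faculty tool):

```python
import random

def stats_drill(seed):
    """Produce one practice problem and its expected answer.

    Generating the answer alongside the question is what makes
    an 'unlimited problems' tool automatically gradeable.
    """
    rng = random.Random(seed)
    data = [rng.randint(1, 9) for _ in range(5)]
    question = f"What is the mean of {data}?"
    answer = sum(data) / len(data)
    return question, answer

def check(student_value, answer, tol=0.01):
    """Grade a numeric response with a small tolerance."""
    return abs(student_value - answer) <= tol

# Each seed yields a distinct, reproducible problem-answer pair.
q, a = stats_drill(seed=42)
print(q)
```

A GenAI layer can then wrap each drill in varied wording or a conversational interface, while the seeded generator keeps the underlying mathematics verifiable.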
- Where to start experimenting: If you’re ready to jump right in, the best place to start is the Harvard Sandbox. Like any sandbox, it’s a great place to play around with tools; unlike any sandbox, it has five large language models to choose from and is accessible behind Harvard Key.
- Where to modify images: If you’re looking to use GenAI to manipulate and create images, Adobe Firefly is available for download to Harvard affiliates.
- Where to learn more: If you want to level up your GenAI knowledge before you start creating, check out resources like the Bok Center’s page on GenAI.
- Where to find higher level tools: API access to tools like Azure OpenAI, Google Vertex, and AWS services is available by request from HUIT. If you don’t know what these are, that’s okay; teachers have made remarkable tools just using ChatGPT and a well-crafted prompt!
When using large language models (LLMs) in your classroom, it’s essential to be aware of the guidelines designed to ensure responsible and effective use of these technologies and to refer to School-specific policies and resources. It is also important to remember that other existing Harvard policies also apply to GenAI and the use of LLMs.
Across Harvard, there’s a strong emphasis on using LLMs responsibly. The use of generative AI must align with the principles of honesty, respect, and responsibility, ensuring that students’ work remains original and reflective of their understanding and skills. In crafting a response to the use of LLMs in the classroom, it’s crucial to strike a balance between leveraging these tools for educational enhancement and ensuring that they do not compromise educational objectives.
At the course level, Schools within Harvard allow for the use of generative AI tools, provided these uses are consistent with School policy. Each School encourages innovative and thoughtful experimentation. Above all, students and faculty are encouraged to be transparent about the use of generative AI in academic work. This includes proper attribution when AI-generated content or assistance is utilized in the creation of academic materials. Consider co-creating course-specific norms around the use of generative AI with your students.
For any faculty who simply wish to explore these tools, we encourage you to engage with peers and organizations across Harvard that are interested in these topics.
School-based resources
Visit your School’s website for the latest policies and guidance around using GenAI in the classroom. This list will be updated as further School-specific guidance becomes available.