Rethinking Summative Assessment in the GenAI Context
Written by Omelia Tennant and Jill Ballard
February 9, 2024 • 6 minute read
In the context of generative AI (GenAI), many instructors are reimagining their summative assessments to ensure academic integrity and AI literacy, a process that often leads to better assessment overall.
By definition, summative assessment offers top-level evidence that students have met course learning objectives. In higher education, such assessments are often completed outside the classroom, where students have access to a variety of GenAI tools and the opportunity to leverage untraceable AI-assisted output. Although this reality applies across a course's learning activities, it particularly challenges established forms of summative assessment: simply put, a term paper on its own may not provide a clear view of students' authentic performance. But it's not only a question of mitigating GenAI use. To support students' AI literacy and professional readiness, quality summative assessment should also address the critical thinking required to manage GenAI applications strategically.
There are many ways to adapt summative assessments with GenAI tool use in mind, but one size certainly does not fit all. To customize assessment solutions for your course, consider meeting with an instructional designer.
Start Here
To begin rethinking summative assessment, consider these foundational points:
- Revisit your course learning objectives. Because these objectives don't prescribe a particular assignment, there are many ways to show evidence of learning beyond the assessments currently in use.
- Consider assessment as a process, not just a product. Assessing both the development process and the final submission better ensures academic integrity and provides a fuller picture of students' progression as a whole.
- View assessment through the lens of equity. Considering why students might want to rely entirely on AI-assisted output, rather than use it to support their own work, can provide insight into the student experience and suggest possible adaptations to assessment practices.
- Lean into creating a course that helps students value their authentic voice, build AI literacy, and develop a stronger critical perspective overall.
From There
Consider these approaches for adapting or creating summative assessments:
- For writing-based assessments, such as term papers and projects:
  - Integrate a graded component that asks students to show evidence of their development along the way, explicitly noting if, when, how, and why GenAI tools were employed. Where appropriate, allow students to leverage GenAI tools for brainstorming, draft analysis, and feedback, among other applications. Consider including an additional self-reflection component to gain a personal perspective on the development process.
  - What can be written can also be spoken and shown. Consider adapting existing written assessments to include other components, such as a video presentation, debate, or explained imagery, in addition to or in place of the written submission.
  - Consider personalizing essay topics to individualize the content and make it specific, details that GenAI tools are less able to produce convincingly.
- Prioritize critical thinking as part of any summative assessment, and determine how this component will be demonstrated and evaluated.
- Use authentic assessments that draw on real-world applications and formats, and whenever possible, localize the topic or context.
For any summative assessment approach, prioritize integrity from the outset by setting clear parameters for GenAI tool use at the course and/or assessment level. In addition, comprehensive rubrics can help students distinguish which development and performance criteria are important, including how GenAI tools may be employed.
To customize summative assessments and rubrics, USF Instructional Design can help you determine solutions that will work best for your course. Visit our scheduling page to make an appointment with an Instructional Designer.
Across the world, instructors are rethinking their summative assessments, whether as a standalone change or as part of a broader course restructuring.
- Dr. Joy Lopez (MA in Educational Technology program) restructured her graduate course, Navigating the Digital Divide: Digital Leadership, to focus on GenAI and the challenges and opportunities it poses to education and society. At the summative level, students were assessed on their ability to critically analyze and interrogate GenAI tools in order to create AI-focused professional development training and resources for their current educational roles.
- Dr. Jennifer Cromley, University of Illinois, Urbana-Champaign: Facing the Challenge of Generative AI: One Faculty's Innovative Approach to Teaching Statistics
Explore Generative AI Tools
To learn how to use ChatGPT and other GenAI technologies, schedule a training session with Instructional Technologies & Training.
Additional Resources
- 10 Best Practices for AI Assignments in Higher Ed
- Reimagining Your Assessments in Light of AI
- Rethinking Assessment in the Age of AI Content Generators: Moving Beyond Traditional Papers
- How Do I (Re)design Assignments and Assessments in an AI-Impacted World?
- Six Approaches from the Reassessing Assessment in the Age of GenAI, USF Instructional Design workshop, fall 2023