UAlbany Researcher Part of Team Building AI Tool to Improve Math Instruction

By Bethany Bump
ALBANY, N.Y. (Feb. 13, 2025) — A University at Albany professor is collaborating with researchers at the University of Virginia to develop and test an artificial intelligence tool that can be used to improve classroom instruction in mathematics.
The project, “Artificial Intelligence for Advancing Instruction at Scale (AI2S),” received a $1.4 million grant from the Bill & Melinda Gates Foundation and is being led by Peter Youngs at UVA’s School of Education and Human Development and Scott Acton at UVA’s School of Engineering and Applied Science. Jonathan Foster, an assistant professor of mathematics education at UAlbany’s School of Education, is a collaborator on the project.
The research team is working on an AI model that will be able to analyze video and audio recordings of classroom instruction, along with a dashboard that will provide efficient, actionable feedback to teachers looking to improve their instruction.
“I think teachers are innovators,” said Foster. “They think deeply about their practice and they make inquiries about their practice. And so I see this tool as supporting what they already naturally do — innovate — and as a way to support them in their inquiry and growth, and then hopefully, as a way to keep teachers in the profession.”
A former high school math teacher, Foster got his start on the project while working as a postdoctoral research associate at UVA, where his research explored teacher education and artificial intelligence in STEM education.
While at UVA, Foster worked with Youngs and Acton to develop a proof-of-concept AI model that could detect and summarize activities happening in a classroom video. After successfully piloting the model with a small group of teachers, the researchers began exploring ways to pair audio and video together to provide more meaningful feedback to teachers.
“There are limitations with current AI models in the classroom,” Foster said. “A lot of them are just audio-based, so it will do some speech-to-text recognition and analyze the resulting classroom transcript. But it’s often missing other modalities, like the video itself. So if a teacher asks a question and students are raising their hands or doing some non-verbal communication, you’re missing out on some of those dimensions.”
The multimodal tool Foster and his team are developing will use synchronized video and audio to evaluate classroom dynamics, capturing details such as the quality of teacher-student dialogue, the cognitive rigor of lessons and instructional patterns.
Teachers have long relied on observation and feedback to improve their classroom instruction, whether through video-based professional learning communities, in which fellow subject-area teachers meet to review footage from a classroom lesson and offer feedback, or through more personalized instructional coaching.
“A concern we hear from the community is, well, is this going to replace the instructional coach?” Foster said. “Our standpoint is, no, we don’t want to replace the instructional coach because we know what a benefit that partnership is. What we’re offering and proposing is what if you had another thought partner in that equation? What could be offloaded to AI, and could that then support the coach’s role and also the teacher’s analysis of their own classroom practice?”
Researchers are partnering with R3 Collaboratives, the education technology company behind Edthena, to help build the technology. Teachers were invited to help design the tool to ensure it can meet a wide range of needs across diverse classrooms, Foster said.
The goal is to pilot the technology with a small cohort of teachers later this summer and distribute it for wider use by the project’s second or third year. The technology could eventually be adapted for other grade levels and subject areas as well, he said.
“A lot of early career teachers go into the field and don't always feel supported,” he said. “But having a tool like this could be a way for them to feel connected to their practice. They can collect some data, and then maybe reach out to an instructional coach and say, hey, can you help me make sense of this? And so having that third partner of the AI may help facilitate some of those conversations and next steps for their practice.”