PhysicsAssistant: An LLM-Powered Interactive Learning Robot for Physics Lab Investigations
This paper presents PhysicsAssistant, a multimodal LLM-powered robot designed to assist 8th-grade students during physics lab experiments by combining YOLOv8 object detection with GPT-3.5-turbo for natural language interaction. The study evaluates the system through a user study with 10 students, comparing its performance against GPT-4 and human expert ratings using Bloom's taxonomy.
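The described pipeline couples visual perception with language-based tutoring. As a minimal sketch of that idea, the hypothetical function below turns object-detection output into a prompt for an LLM; the actual YOLOv8 and GPT-3.5-turbo API calls are omitted, and the detection list, labels, and prompt wording are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (hypothetical): combining object-detection output with a
# student question to build an LLM prompt. The detection list stands in for
# YOLOv8 output; the real system would add a GPT-3.5-turbo call.

def build_prompt(question: str, detections: list[dict]) -> str:
    """Combine a student's spoken question with detected lab objects."""
    scene = ", ".join(f"{d['label']} ({d['conf']:.2f})" for d in detections)
    return (
        "You are PhysicsAssistant, helping in an 8th-grade physics lab.\n"
        f"Objects visible on the bench: {scene}.\n"
        f"Student question: {question}"
    )

# Example detections, as YOLOv8 might report them (label + confidence).
detections = [
    {"label": "inclined plane", "conf": 0.91},
    {"label": "toy car", "conf": 0.88},
]
prompt = build_prompt("Why does the car speed up?", detections)
print(prompt)
```

The resulting prompt string would then be sent to the chat model, grounding its answer in what the camera currently sees.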
Robot systems in education can leverage large language models' (LLMs) natural language understanding capabilities to provide assistance and facilitate learning. This paper proposes PhysicsAssistant, a multimodal interactive robot that combines YOLOv8 object detection, cameras, speech recognition, and an LLM-based chatbot to assist students in physics labs. We conduct a user study with ten 8th-grade students to empirically evaluate the performance of PhysicsAssistant with a human expert. The