PhysicsAssistant: An LLM-Powered Interactive Learning Robot for Physics Lab Investigations
This paper presents PhysicsAssistant, a multimodal robot that combines YOLOv8 object detection with GPT-3.5-turbo to provide real-time interactive assistance to 8th-grade students during physics lab experiments. The system is empirically evaluated through a user study with 10 students, in which human experts use Bloom's taxonomy to rate the quality of its responses against those of GPT-4.
Robot systems in education can leverage the natural language understanding capabilities of large language models (LLMs) to provide assistance and facilitate learning. This paper proposes PhysicsAssistant, a multimodal interactive robot that combines YOLOv8 object detection, cameras, speech recognition, and an LLM-based chatbot to assist students during physics lab experiments. We conduct a user study with ten 8th-grade students to empirically evaluate the performance of PhysicsAssistant alongside a human expert. The experts rate the quality of PhysicsAssistant's responses, compared with those of GPT-4, according to Bloom's taxonomy.
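To make the described architecture concrete, the following is a minimal sketch of such a perception-speech-LLM loop. It is not the paper's implementation: the library choices (ultralytics for YOLOv8, SpeechRecognition for transcription, OpenCV for capture, the OpenAI client for GPT-3.5-turbo) and the function names `perceive_scene`, `listen_to_student`, and `answer` are illustrative assumptions.

```python
# Illustrative sketch only; the paper does not publish its implementation.
# Assumed libraries: ultralytics (YOLOv8), SpeechRecognition, OpenCV, openai.
import cv2
import speech_recognition as sr
from ultralytics import YOLO
from openai import OpenAI

yolo = YOLO("yolov8n.pt")      # assumed pretrained YOLOv8 checkpoint
llm = OpenAI()                 # requires OPENAI_API_KEY in the environment
recognizer = sr.Recognizer()

def perceive_scene(camera_index: int = 0) -> str:
    """Capture one frame and summarize detected lab objects as text."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return "no visual input available"
    result = yolo(frame)[0]
    labels = {yolo.names[int(box.cls)] for box in result.boxes}
    return ", ".join(sorted(labels)) or "no objects detected"

def listen_to_student() -> str:
    """Transcribe the student's spoken question."""
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)

def answer(question: str, scene: str) -> str:
    """Ground the LLM response in the detected lab objects."""
    response = llm.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "You are a physics lab assistant for 8th-grade students. "
                        f"Objects currently visible on the bench: {scene}."},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    scene = perceive_scene()
    question = listen_to_student()
    print(answer(question, scene))
```

The key design point this sketch illustrates is that the vision output is injected into the LLM prompt as text, so the language model can refer to the apparatus the student is actually using without being multimodal itself.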