Measuring the Impact of Student Gaming Behaviors on Learner Modeling
This paper investigates how student gaming behaviors (hint abuse, random guessing, rapid clicking) contaminate the data that knowledge tracing (KT) models in adaptive learning systems rely on. The authors conceptualize these behaviors as data poisoning attacks, simulate diverse gaming patterns to systematically assess KT model vulnerabilities, and explore unsupervised approaches for detecting the contaminated interactions.
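To make the setup concrete, the sketch below shows one way gaming behaviors could be injected into an interaction log before it is fed to a KT model. This is an illustrative assumption, not the paper's actual simulation procedure: the `Interaction` record, the `inject_gaming` function, and all rates and thresholds are hypothetical.

```python
import random
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Interaction:
    student_id: int
    item_id: int
    correct: bool          # whether the response was scored correct
    response_time: float   # seconds spent on the item
    used_hint: bool

def inject_gaming(log, behavior="random_guessing", rate=0.2, seed=0):
    """Return a copy of the interaction log with a fraction of interactions
    replaced by simulated gaming behavior (an illustrative poisoning model)."""
    rng = random.Random(seed)
    poisoned = []
    for step in log:
        if rng.random() >= rate:
            poisoned.append(step)  # leave genuine interaction intact
            continue
        if behavior == "random_guessing":
            # answer is effectively a guess, delivered quickly
            step = replace(step, correct=rng.random() < 0.25,
                           response_time=min(step.response_time, 3.0))
        elif behavior == "hint_abuse":
            # hint chain exhausted until the answer is revealed, then scored correct
            step = replace(step, correct=True, used_hint=True)
        elif behavior == "rapid_clicking":
            # near-instant responses that are usually wrong
            step = replace(step, correct=False, response_time=0.5)
        poisoned.append(step)
    return poisoned
```

Under this kind of setup, a KT model would be trained or evaluated on both the clean and the poisoned log, and the gap in predictive performance would quantify how damaging each gaming pattern is.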
The expansion of large-scale online education platforms has made vast amounts of student interaction data available for knowledge tracing (KT). KT models estimate students' concept mastery from interaction data, but their performance is sensitive to input data quality. Gaming behaviors, such as excessive hint use, may misrepresent students' knowledge and undermine model reliability. However, systematic investigations of how different types of gaming behaviors affect KT remain scarce, and existing