Intelligent Descriptive Answer Evaluation System
This paper proposes an automated Descriptive Answer Evaluation System (DAES) that uses NLP and machine learning to evaluate student-written descriptive answers, with the aim of reducing the manual grading workload and providing faster feedback. The system assesses answers through semantic similarity, word order, sentence sequence, and spell-checking.
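To make the scoring components concrete, the following is a minimal sketch of how a reference answer and a student answer could be compared, assuming a TF-IDF cosine-similarity back end for the semantic component and a simple relative-ordering measure for word order. The blending weights (0.8 / 0.2) and the 0-10 scale are illustrative assumptions, not values taken from the proposed system.

```python
# Sketch: score a student answer against a reference answer.
# Semantic term: cosine similarity over TF-IDF vectors (assumed representation).
# Word-order term: fraction of shared words appearing in the same relative order.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def semantic_similarity(reference: str, answer: str) -> float:
    """Cosine similarity between TF-IDF vectors of the two texts."""
    tfidf = TfidfVectorizer().fit_transform([reference, answer])
    return float(cosine_similarity(tfidf[0], tfidf[1])[0, 0])


def word_order_similarity(reference: str, answer: str) -> float:
    """Fraction of adjacent shared-word pairs that keep the reference order."""
    ref_tokens = reference.lower().split()
    ans_tokens = answer.lower().split()
    shared = [w for w in ans_tokens if w in ref_tokens]
    if len(shared) < 2:
        return 1.0 if shared else 0.0
    in_order = sum(
        1
        for a, b in zip(shared, shared[1:])
        if ref_tokens.index(a) <= ref_tokens.index(b)
    )
    return in_order / (len(shared) - 1)


def score_answer(reference: str, answer: str,
                 w_semantic: float = 0.8, w_order: float = 0.2) -> float:
    """Weighted blend of the two similarities, scaled to a 0-10 mark."""
    blended = (w_semantic * semantic_similarity(reference, answer)
               + w_order * word_order_similarity(reference, answer))
    return round(10 * blended, 2)


if __name__ == "__main__":
    model_answer = "Photosynthesis converts light energy into chemical energy in plants."
    student_answer = "Plants use photosynthesis to turn light energy into chemical energy."
    print(score_answer(model_answer, student_answer))
```

In a full pipeline, a spell-checking pass and a sentence-sequence check would be applied before or alongside these two terms; they are omitted here to keep the sketch focused on the core similarity scoring.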
Assessing the quality of responses in educational and professional settings is a critical task, especially for subjective questions that demand descriptive answers. Traditional multiple-choice assessments fall short in evaluating a student's ability to express complex ideas and demonstrate critical thinking. In this paper, we highlight the growing need for a Descriptive Answer Evaluation System (DAES) that addresses the shortcomings of current evaluation methods. The primary goal of the proposed DAES is to automate the evaluation of descriptive answers, reducing the manual grading workload while providing faster and more consistent feedback.