Enhancing Automated Grading in Science Education Through LLM-Driven Causal Reasoning and Multimodal Analysis

Abstract

Automated assessment of open responses in K–12 science education poses significant challenges due to the multimodal nature of student work, which often integrates textual explanations, drawings, and handwritten elements. Traditional evaluation methods that focus solely on textual analysis fail to capture the full breadth of student reasoning and are susceptible to biases such as handwriting neatness or answer length. In this paper, we propose a novel LLM-augmented multimodal evaluation framework that addresses these limitations through a comprehensive, bias-corrected grading system. Our approach leverages LLMs to generate causal knowledge graphs that encapsulate the essential conceptual relationships in student responses, and grades by comparing the graph derived from each submission against the graph derived automatically from the rubric. Experimental results demonstrate that our framework improves grading accuracy and consistency over deep supervised learning and few-shot LLM baselines.
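The graph-comparison step described in the abstract can be illustrated with a minimal sketch. The paper does not publish its scoring function here, so the edge representation and the F1-style overlap metric below are illustrative assumptions, not the authors' actual implementation: a causal knowledge graph is modeled as a set of directed cause→effect edges, and a student graph is scored against a rubric graph by the harmonic mean of edge precision and recall.

```python
# Illustrative sketch (not the paper's implementation): score a student's
# causal graph against a rubric graph by F1 overlap of directed edges.
# An edge is a (cause, effect) tuple; node names are hypothetical examples.

def edge_f1(rubric_edges, student_edges):
    """F1 overlap between the rubric's causal edges and the student's."""
    rubric, student = set(rubric_edges), set(student_edges)
    if not rubric or not student:
        return 0.0
    tp = len(rubric & student)          # edges the student got right
    precision = tp / len(student)       # fraction of student edges that are correct
    recall = tp / len(rubric)           # fraction of rubric edges the student covered
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical rubric and student response for a photosynthesis item.
rubric = {("sunlight", "photosynthesis"), ("photosynthesis", "glucose")}
student = {("sunlight", "photosynthesis"), ("water", "glucose")}
score = edge_f1(rubric, student)  # one of two rubric edges matched
```

In practice the matching would need to tolerate paraphrased node labels (e.g. via embedding similarity) rather than exact string equality, but the exact-match version shows the structure of the comparison.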

Cite

Text

Zhu et al. "Enhancing Automated Grading in Science Education Through LLM-Driven Causal Reasoning and Multimodal Analysis." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/1150

Markdown

[Zhu et al. "Enhancing Automated Grading in Science Education Through LLM-Driven Causal Reasoning and Multimodal Analysis." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/zhu2025ijcai-enhancing/) doi:10.24963/IJCAI.2025/1150

BibTeX

@inproceedings{zhu2025ijcai-enhancing,
  title     = {{Enhancing Automated Grading in Science Education Through LLM-Driven Causal Reasoning and Multimodal Analysis}},
  author    = {Zhu, Haohao and Li, Tingting and He, Peng and Zhou, Jiayu},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {10352--10360},
  doi       = {10.24963/IJCAI.2025/1150},
  url       = {https://mlanthology.org/ijcai/2025/zhu2025ijcai-enhancing/}
}