Iterative Teacher-Aware Learning

Abstract

In human pedagogy, teachers and students can interact adaptively to maximize communication efficiency. The teacher adjusts her teaching method for different students, and the student, after becoming familiar with the teacher’s instruction mechanism, can infer the teacher’s intention and learn faster. Recently, the benefits of integrating this cooperative pedagogy into machine concept learning in discrete spaces have been demonstrated by multiple works. However, how cooperative pedagogy can facilitate machine parameter learning has not been thoroughly studied. In this paper, we propose a gradient-optimization-based teacher-aware learner that can incorporate the teacher’s cooperative intention into its likelihood function and provably learn faster than the naive learning algorithms used in previous machine teaching works. We prove theoretically that the iterative teacher-aware learning (ITAL) process leads to local and global improvements. We then validate our algorithm with extensive experiments on a variety of tasks, including regression, classification, and inverse reinforcement learning, using both synthetic and real data. We also show the advantage of modeling teacher awareness when agents learn from human teachers.
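The abstract's central mechanism is a learner that folds a model of the teacher's example selection into its own likelihood, so each teaching example is informative both through its label and through the fact that a cooperative teacher chose it. The snippet below is a minimal sketch of that idea for linear regression; the Gaussian data likelihood, the softmax ("soft-rational") teacher model, the finite-difference gradient, and every name and constant in it are illustrative assumptions, not the paper's exact ITAL formulation.

```python
# Minimal sketch of teacher-aware likelihood learning (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

w_true = np.array([2.0, -1.0])                 # parameter the teacher wants to convey
pool_x = rng.normal(size=(50, 2))              # candidate teaching examples
pool_y = pool_x @ w_true

SIGMA, BETA, LR_NAIVE, LR_LEARN = 0.5, 5.0, 0.1, 0.05

def data_loglik(w, x, y):
    """Plain Gaussian log-likelihood of one (x, y) pair under parameter w."""
    return -0.5 * ((y - x @ w) / SIGMA) ** 2

def naive_step(w, x, y):
    """Where a teacher-agnostic gradient step on data_loglik would move w."""
    return w + LR_NAIVE * (y - x @ w) / SIGMA ** 2 * x

def teacher_logits(target, w_prev):
    """Soft preference of a teacher aiming at `target`: examples whose naive
    update moves a learner at w_prev closest to the target score highest."""
    next_ws = np.array([naive_step(w_prev, x, y) for x, y in zip(pool_x, pool_y)])
    return -BETA * np.sum((next_ws - target) ** 2, axis=1)

def log_softmax_at(logits, idx):
    m = logits.max()
    return logits[idx] - (m + np.log(np.exp(logits - m).sum()))

def teacher_pick(w_prev):
    """Teacher samples one example from a softmax over its preferences."""
    logits = teacher_logits(w_true, w_prev)
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return rng.choice(len(pool_x), p=p)

def teacher_aware_grad(w, w_prev, idx, eps=1e-3):
    """Finite-difference gradient of the teacher-aware log-likelihood:
    log p(y | x, w) + log p(teacher picks this example | target = w, learner at w_prev)."""
    def objective(w_):
        return (data_loglik(w_, pool_x[idx], pool_y[idx])
                + log_softmax_at(teacher_logits(w_, w_prev), idx))
    g = np.zeros_like(w)
    for i in range(len(w)):
        d = np.zeros_like(w)
        d[i] = eps
        g[i] = (objective(w + d) - objective(w - d)) / (2 * eps)
    return g

w_est = np.zeros(2)
for _ in range(200):
    idx = teacher_pick(w_est)                  # teacher shows one example per round
    w_est = w_est + LR_LEARN * teacher_aware_grad(w_est, w_est, idx)

print("teacher-aware estimate:", w_est, "target:", w_true)
```

The design choice to score candidate parameters by how likely the teacher would have chosen the observed example for them is what separates this teacher-aware update from a plain maximum-likelihood step on the same data.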

Cite

Text

Yuan et al. "Iterative Teacher-Aware Learning." Neural Information Processing Systems, 2021.

Markdown

[Yuan et al. "Iterative Teacher-Aware Learning." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/yuan2021neurips-iterative/)

BibTeX

@inproceedings{yuan2021neurips-iterative,
  title     = {{Iterative Teacher-Aware Learning}},
  author    = {Yuan, Luyao and Zhou, Dongruo and Shen, Junhong and Gao, Jingdong and Chen, Jeffrey L and Gu, Quanquan and Wu, Ying Nian and Zhu, Song-Chun},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/yuan2021neurips-iterative/}
}