RoTaR: Efficient Row-Based Table Representation Learning via Teacher-Student Training (Short Paper)
Abstract
We propose RoTaR, a row-based table representation learning method, to address the efficiency and scalability issues faced by existing table representation learning methods. The key idea of RoTaR is to generate query-agnostic row representations that can be reused via query-specific aggregation. In addition to the row-based architecture, we introduce several techniques to improve the performance of the RoTaR model: cell-aware position embedding, an AutoEncoder objective in transformer models, a teacher-student training paradigm, and selective backward.
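To make the key idea concrete, the sketch below illustrates query-specific aggregation over cached, query-agnostic row embeddings. It is a minimal illustration, not the paper's actual architecture: the class name `QuerySpecificAggregator`, the single linear query projection, and the random stand-in embeddings are all assumptions made for this example.

```python
import torch
import torch.nn as nn

class QuerySpecificAggregator(nn.Module):
    """Combine precomputed, query-agnostic row embeddings into one table
    representation using attention weights conditioned on a query embedding."""

    def __init__(self, dim: int):
        super().__init__()
        self.query_proj = nn.Linear(dim, dim)

    def forward(self, row_embs: torch.Tensor, query_emb: torch.Tensor) -> torch.Tensor:
        # row_embs: (num_rows, dim), query_emb: (dim,)
        scores = row_embs @ self.query_proj(query_emb)   # one score per row
        weights = torch.softmax(scores, dim=0)           # attention over rows
        return weights @ row_embs                        # (dim,) table representation

# Row embeddings would normally come from a shared row encoder and be cached
# once per table; random tensors stand in for them here.
dim = 64
row_embs = torch.randn(100, dim)    # 100 cached row representations
query_emb = torch.randn(dim)        # embedding of the downstream query
table_repr = QuerySpecificAggregator(dim)(row_embs, query_emb)
print(table_repr.shape)             # torch.Size([64])
```

Because the row embeddings do not depend on the query, they can be computed once and reused across many queries, which is where the claimed efficiency gain comes from.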
Cite
Text
Chen et al. "RoTaR: Efficient Row-Based Table Representation Learning via Teacher-Student Training (Short Paper)." NeurIPS 2022 Workshops: TRL, 2022.
Markdown
[Chen et al. "RoTaR: Efficient Row-Based Table Representation Learning via Teacher-Student Training (Short Paper)." NeurIPS 2022 Workshops: TRL, 2022.](https://mlanthology.org/neuripsw/2022/chen2022neuripsw-rotar/)
BibTeX
@inproceedings{chen2022neuripsw-rotar,
  title     = {{RoTaR: Efficient Row-Based Table Representation Learning via Teacher-Student Training (Short Paper)}},
  author    = {Chen, Zui and Cao, Lei and Madden, Samuel},
  booktitle = {NeurIPS 2022 Workshops: TRL},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/chen2022neuripsw-rotar/}
}