SNIP: Bridging Mathematical Symbolic and Numeric Realms with Unified Pre-Training
Abstract
In scientific inquiry, symbolic mathematical equations play a fundamental role in modeling complex natural phenomena. Leveraging the power of deep learning, we introduce SNIP, a Multi-Modal Symbolic-Numeric Pre-training framework. By employing joint contrastive learning between symbolic and numeric domains, SNIP enhances their mutual alignment in pre-trained embeddings. Latent space analysis reveals that symbolic supervision significantly enriches the embeddings of numeric data, and vice versa. Evaluations across diverse tasks, including symbolic-to-numeric and numeric-to-symbolic property prediction, demonstrate SNIP's superior performance over fully supervised baselines. This advantage is particularly pronounced in few-shot learning scenarios, making SNIP a valuable asset in situations with limited available data.
Cite
Text
Meidani et al. "SNIP: Bridging Mathematical Symbolic and Numeric Realms with Unified Pre-Training." NeurIPS 2023 Workshops: AI4Science, 2023.
Markdown
[Meidani et al. "SNIP: Bridging Mathematical Symbolic and Numeric Realms with Unified Pre-Training." NeurIPS 2023 Workshops: AI4Science, 2023.](https://mlanthology.org/neuripsw/2023/meidani2023neuripsw-snip/)
BibTeX
@inproceedings{meidani2023neuripsw-snip,
  title     = {{SNIP: Bridging Mathematical Symbolic and Numeric Realms with Unified Pre-Training}},
  author    = {Meidani, Kazem and Shojaee, Parshin and Reddy, Chandan K. and Farimani, Amir Barati},
  booktitle = {NeurIPS 2023 Workshops: AI4Science},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/meidani2023neuripsw-snip/}
}