UniCorn: A Unified Contrastive Learning Approach for Multi-View Molecular Representation Learning
Abstract
Recently, a noticeable trend has emerged in developing pre-trained foundation models in the domains of CV and NLP. However, for molecular pre-training, no universal model exists that applies effectively to the various categories of molecular tasks, since existing prevalent pre-training methods are effective only for specific types of downstream tasks. Furthermore, the lack of a profound understanding of existing pre-training methods, including 2D graph masking, 2D-3D contrastive learning, and 3D denoising, hampers the advancement of molecular foundation models. In this work, we provide a unified comprehension of existing pre-training methods through the lens of contrastive learning. From this perspective, their distinctions lie in clustering different views of molecules, each of which is shown to benefit specific downstream tasks. To achieve a complete and general-purpose molecular representation, we propose a novel pre-training framework, named UniCorn, that inherits the merits of the three methods, depicting molecular views at three different levels. SOTA performance across quantum, physicochemical, and biological tasks, along with a comprehensive ablation study, validates the universality and effectiveness of UniCorn.
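The contrastive-learning lens described in the abstract can be illustrated with a minimal InfoNCE-style loss between two views of the same molecule (e.g., a 2D graph embedding and a 3D conformer embedding). This is a generic sketch of the standard technique, not UniCorn's actual implementation; all names and the toy data are illustrative:

```python
import numpy as np

def info_nce_loss(view_a, view_b, temperature=0.1):
    """Generic InfoNCE contrastive loss.

    view_a, view_b: (batch, dim) embeddings of two views of the same
    molecules; row i of view_a is the positive pair of row i of view_b,
    and all other rows serve as in-batch negatives.
    """
    # Normalize embeddings so that dot products are cosine similarities.
    a = view_a / np.linalg.norm(view_a, axis=1, keepdims=True)
    b = view_b / np.linalg.norm(view_b, axis=1, keepdims=True)
    # Pairwise similarity logits, scaled by temperature.
    logits = a @ b.T / temperature
    # Softmax cross-entropy with positives on the diagonal.
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))
# Identical views: positives dominate, so the loss is low.
loss_aligned = info_nce_loss(x, x)
# Unrelated views: positives are no better than negatives, so the loss is higher.
loss_random = info_nce_loss(x, rng.normal(size=(8, 16)))
```

Intuitively, minimizing such a loss pulls the paired views of each molecule together and pushes apart views of different molecules, which is the clustering behavior the paper attributes to all three pre-training families.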
Cite
Text
Feng et al. "UniCorn: A Unified Contrastive Learning Approach for Multi-View Molecular Representation Learning." International Conference on Machine Learning, 2024.
Markdown
[Feng et al. "UniCorn: A Unified Contrastive Learning Approach for Multi-View Molecular Representation Learning." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/feng2024icml-unicorn/)
BibTeX
@inproceedings{feng2024icml-unicorn,
title = {{UniCorn: A Unified Contrastive Learning Approach for Multi-View Molecular Representation Learning}},
author = {Feng, Shikun and Ni, Yuyan and Li, Minghao and Huang, Yanwen and Ma, Zhi-Ming and Ma, Wei-Ying and Lan, Yanyan},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {13256--13277},
volume = {235},
url = {https://mlanthology.org/icml/2024/feng2024icml-unicorn/}
}