Pre-DyGAE: Pre-Training Enhanced Dynamic Graph Autoencoder for Occupational Skill Demand Forecasting

Cite

Text

Chen et al. "Pre-DyGAE: Pre-Training Enhanced Dynamic Graph Autoencoder for Occupational Skill Demand Forecasting." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/222

Markdown

[Chen et al. "Pre-DyGAE: Pre-Training Enhanced Dynamic Graph Autoencoder for Occupational Skill Demand Forecasting." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/chen2024ijcai-pre/) doi:10.24963/ijcai.2024/222

BibTeX

@inproceedings{chen2024ijcai-pre,
  title     = {{Pre-DyGAE: Pre-Training Enhanced Dynamic Graph Autoencoder for Occupational Skill Demand Forecasting}},
  author    = {Chen, Xi and Qin, Chuan and Wang, Zhigaoyuan and Cheng, Yihang and Wang, Chao and Zhu, Hengshu and Xiong, Hui},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {2009--2017},
  doi       = {10.24963/ijcai.2024/222},
  url       = {https://mlanthology.org/ijcai/2024/chen2024ijcai-pre/}
}