icsPLMs: Exploring Pre-Trained Language Models in Intelligent Customer Service (Student Abstract)
Abstract
Pre-trained language models have shown high performance on text processing tasks in intelligent customer service platforms. However, these models do not leverage domain-specific information. In this paper, we propose icsPLMs, optimized for intelligent customer service at both the word and sentence levels. Our experimental results demonstrate that targeted strategies can further improve the performance of pre-trained language models in this field.
Cite
Text
Liu et al. "icsPLMs: Exploring Pre-Trained Language Models in Intelligent Customer Service (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I21.30475
Markdown
[Liu et al. "icsPLMs: Exploring Pre-Trained Language Models in Intelligent Customer Service (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/liu2024aaai-icsplms/) doi:10.1609/AAAI.V38I21.30475
BibTeX
@inproceedings{liu2024aaai-icsplms,
title = {{icsPLMs: Exploring Pre-Trained Language Models in Intelligent Customer Service (Student Abstract)}},
author = {Liu, Shixuan and Wang, Chao and Song, Shuangyong},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2024},
pages = {23565-23566},
doi = {10.1609/AAAI.V38I21.30475},
url = {https://mlanthology.org/aaai/2024/liu2024aaai-icsplms/}
}