Long-Form Speech Generation with Spoken Language Models

Abstract

We consider the generative modeling of speech over multiple minutes, a requirement for long-form multimedia generation and audio-native voice assistants. However, textless spoken language models struggle to generate plausible speech past tens of seconds, due to the high temporal resolution of speech tokens causing loss of coherence, to architectural issues with long-sequence training or extrapolation, and to memory costs at inference time. From these considerations we derive SpeechSSM, the first speech language model family to learn from and sample long-form spoken audio (e.g., 16 minutes of read or extemporaneous speech) in a single decoding session without text intermediates. SpeechSSMs leverage recent advances in linear-time sequence modeling to greatly surpass current Transformer spoken LMs in coherence and efficiency on multi-minute generations while still matching them at the utterance level. As we found current spoken language evaluations uninformative, especially in this new long-form setting, we also introduce: LibriSpeech-Long, a benchmark for long-form speech evaluation; new embedding-based and LLM-judged metrics; and quality measurements over length and time. Speech samples, the LibriSpeech-Long dataset, and any future code or model releases can be found at https://google.github.io/tacotron/publications/speechssm/.

Cite

Text

Park et al. "Long-Form Speech Generation with Spoken Language Models." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Park et al. "Long-Form Speech Generation with Spoken Language Models." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/park2025icml-longform/)

BibTeX

@inproceedings{park2025icml-longform,
  title     = {{Long-Form Speech Generation with Spoken Language Models}},
  author    = {Park, Se Jin and Salazar, Julian and Jansen, Aren and Kinoshita, Keisuke and Ro, Yong Man and Skerry-Ryan, RJ},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {48245--48261},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/park2025icml-longform/}
}