Differentiable Short-Time Fourier Transform: A Time-Frequency Layer with Learnable Parameters
Abstract
We present a differentiable version of the short-time Fourier transform (STFT), enabling gradient-based optimization of its parameters. This approach integrates with neural networks, allowing joint learning of both the STFT parameters and the network weights. Tests on simulated and real data demonstrate improved time-frequency representations and enhanced performance on downstream tasks, illustrating the potential of our method as a standard approach for setting spectrogram parameters automatically.
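The core idea in the abstract can be illustrated with a minimal sketch: treat a window parameter (here, the width sigma of a Gaussian window) as a continuous variable, so that a loss computed on the spectrogram can be minimized by gradient descent. This is not the authors' implementation; the Gaussian window, the peakiness loss, and the finite-difference gradient are illustrative choices (in the paper's setting, an autodiff framework would supply the gradient exactly).

```python
import numpy as np

def spectrogram(x, sigma, n_fft=128, hop=32):
    """Magnitude STFT of x using a Gaussian window of width sigma."""
    t = np.arange(n_fft) - n_fft / 2
    window = np.exp(-0.5 * (t / sigma) ** 2)  # smooth in sigma, hence differentiable
    frames = np.stack([x[i:i + n_fft]
                       for i in range(0, len(x) - n_fft + 1, hop)])
    return np.abs(np.fft.rfft(frames * window, axis=-1))

def loss(sigma, x):
    """Negative peakiness: lower values mean a more concentrated spectrogram."""
    S = spectrogram(x, sigma)
    return -(S.max() / (S.sum() + 1e-12))

# A pure tone: for a stationary signal, a wider window concentrates
# its energy into a narrower frequency peak.
x = np.sin(2 * np.pi * 200 * np.linspace(0, 1, 1024))

sigma, lr, eps = 8.0, 50.0, 1e-3
for _ in range(100):
    # Central finite difference standing in for an autodiff gradient.
    g = (loss(sigma + eps, x) - loss(sigma - eps, x)) / (2 * eps)
    sigma -= lr * g  # gradient step on the window width itself
```

In a neural-network setting, the same gradient would be backpropagated from a task loss, so the time-frequency resolution is tuned jointly with the downstream model rather than fixed by hand.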
Cite
Text
Leiber et al. "Differentiable Short-Time Fourier Transform: A Time-Frequency Layer with Learnable Parameters." ICML 2024 Workshops: Differentiable_Almost_Everything, 2024.
Markdown
[Leiber et al. "Differentiable Short-Time Fourier Transform: A Time-Frequency Layer with Learnable Parameters." ICML 2024 Workshops: Differentiable_Almost_Everything, 2024.](https://mlanthology.org/icmlw/2024/leiber2024icmlw-differentiable/)
BibTeX
@inproceedings{leiber2024icmlw-differentiable,
title = {{Differentiable Short-Time Fourier Transform: A Time-Frequency Layer with Learnable Parameters}},
author = {Leiber, Maxime and Marnissi, Yosra and Barrau, Axel},
booktitle = {ICML 2024 Workshops: Differentiable_Almost_Everything},
year = {2024},
url = {https://mlanthology.org/icmlw/2024/leiber2024icmlw-differentiable/}
}