Optimized Random Features for the Neural Tangent Kernel (Student Abstract)
Abstract
The neural tangent kernel (NTK) has emerged as an important tool in recent years, both for developing a theoretical understanding of deep learning and for various applications. Even though recursive closed-form expressions have been derived for computing the NTK, these become computationally expensive as the complexity of a network increases. Recent papers have looked at reducing this complexity using various sketching techniques along with random features. Building on these techniques, we propose an additional optimization step which results in a better approximation of the NTK.
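The abstract does not spell out the construction, but the random-features idea it builds on can be sketched generically. The snippet below is a minimal, hypothetical illustration of random Fourier features approximating a Gaussian kernel; the paper's actual method targets the NTK and adds sketching and an optimization step, none of which are shown here.

```python
import numpy as np

def random_fourier_features(X, num_features=2048, gamma=1.0, seed=0):
    """Map rows of X to features z(x) with z(x) @ z(y) ~= exp(-gamma * ||x - y||^2).

    Illustrative only: the paper approximates the NTK, not a Gaussian kernel.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # For the Gaussian kernel, frequencies are drawn from N(0, 2*gamma*I).
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

# Compare the random-feature inner products against the exact kernel matrix.
rng = np.random.default_rng(1)
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X)
K_exact = np.exp(-np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))
K_approx = Z @ Z.T
max_err = np.max(np.abs(K_exact - K_approx))
```

With a few thousand features the elementwise error is small; methods like the one proposed here aim to reduce the number of features needed for a given approximation quality.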
Cite
Text
Das and Maity. "Optimized Random Features for the Neural Tangent Kernel (Student Abstract)." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I28.35244
Markdown
[Das and Maity. "Optimized Random Features for the Neural Tangent Kernel (Student Abstract)." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/das2025aaai-optimized/) doi:10.1609/AAAI.V39I28.35244
BibTeX
@inproceedings{das2025aaai-optimized,
title = {{Optimized Random Features for the Neural Tangent Kernel (Student Abstract)}},
author = {Das, Shrutimoy and Maity, Binita},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
pages = {29340--29342},
doi = {10.1609/AAAI.V39I28.35244},
url = {https://mlanthology.org/aaai/2025/das2025aaai-optimized/}
}