Interpreting the Second-Order Effects of Neurons in CLIP
Abstract
We interpret the function of individual neurons in CLIP by automatically describing them with text. Analyzing the direct effects (i.e., the flow from a neuron through the residual stream to the output) or the indirect effects (overall contribution) fails to capture the neurons' function in CLIP. Therefore, we present the "second-order lens", which analyzes the effect flowing from a neuron through the later attention heads directly to the output. We find that these effects are highly selective: for each neuron, the effect is significant for <2% of the images. Moreover, each effect can be approximated by a single direction in the text-image space of CLIP. We describe neurons by decomposing these directions into sparse sets of text representations. These sets reveal polysemantic behavior: each neuron corresponds to multiple, often unrelated, concepts (e.g., ships and cars). Exploiting this neuron polysemy, we mass-produce "semantic" adversarial examples by generating images with concepts spuriously correlated to the incorrect class. Additionally, we use the second-order effects for zero-shot segmentation, outperforming previous methods. Our results indicate that an automated interpretation of neurons can be used for model deception and for introducing new model capabilities.
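The sketch below illustrates the sparse decomposition step described in the abstract: given a neuron's second-order direction in CLIP's joint text-image space and a pool of candidate text embeddings, greedily pick a few descriptions whose span best reconstructs the direction. This is not the authors' implementation; the function name `sparse_text_decomposition` and the inputs `direction`, `text_embeds`, and `descriptions` are hypothetical, and the greedy procedure is a generic orthogonal-matching-pursuit-style approximation under those assumptions.

```python
import torch


def sparse_text_decomposition(direction, text_embeds, descriptions, k=5):
    """Greedily select k text embeddings whose span best approximates
    `direction` (a sketch, not the paper's exact algorithm).

    direction:    (d,) second-order direction of a neuron (assumed given)
    text_embeds:  (n, d) CLIP text embeddings of candidate descriptions
    descriptions: list of n candidate description strings
    """
    direction = direction / direction.norm()
    text_embeds = text_embeds / text_embeds.norm(dim=-1, keepdim=True)

    residual = direction.clone()
    chosen = []
    for _ in range(k):
        # Score every candidate by |cosine similarity| with the part of the
        # direction not yet explained by the selected descriptions.
        scores = (text_embeds @ residual).abs()
        if chosen:
            scores[chosen] = -1.0  # never pick the same description twice
        chosen.append(int(scores.argmax()))

        # Least-squares projection of the direction onto the chosen
        # embeddings; the residual drives the next greedy selection.
        basis = text_embeds[chosen]                                   # (m, d)
        coeffs = torch.linalg.lstsq(basis.T, direction.unsqueeze(1)).solution
        residual = direction - (basis.T @ coeffs).squeeze(1)

    return [descriptions[i] for i in chosen]
```

In practice, the candidate pool would be encoded with CLIP's text encoder, and the returned set of phrases serves as the neuron's textual description; a neuron whose set mixes unrelated phrases (e.g., ships and cars) exhibits the polysemantic behavior discussed above.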
Cite
Text
Gandelsman et al. "Interpreting the Second-Order Effects of Neurons in CLIP." International Conference on Learning Representations, 2025.
Markdown
[Gandelsman et al. "Interpreting the Second-Order Effects of Neurons in CLIP." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/gandelsman2025iclr-interpreting/)
BibTeX
@inproceedings{gandelsman2025iclr-interpreting,
title = {{Interpreting the Second-Order Effects of Neurons in CLIP}},
author = {Gandelsman, Yossi and Efros, Alexei A and Steinhardt, Jacob},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/gandelsman2025iclr-interpreting/}
}