Jul 28, 2025
Idempotent GPTs may actually provide robustness by design
Jamilya Erkenova, Sergei Kudriashov
Idempotence is a central concept in quantum physics, describing an operator that, applied twice, produces the same result as a single application: f(f(x)) = f(x). Enforcing idempotence in generative deep learning can be interpreted as constraining the model to act as a projector onto the manifold of the train-time target distribution, which Shocher et al. (2023) explored for image generation models. Idempotent test-time training has been predicted to be a valuable approach for uncertainty quantification and adaptation to distribution shifts (Durasov et al., 2025). We find that although language models are iterative refiners of token predictions, they struggle to preserve idempotence. We therefore train a small-scale GPT model that is idempotent by design, with the expected qualities, and provide proof-of-concept code for evaluations. Further development of the project may yield more robust and adaptable models and lower the probability of catastrophic risks.
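As a rough illustration of what such a constraint could look like for a language model, the sketch below adds an idempotence penalty via a second forward pass: the model's next-token distributions are mapped back into input space as soft embeddings (a softmax-weighted mix of embedding vectors) and the model is applied again, penalizing any change in its prediction. TinyGPT, idempotence_loss, and the soft-embedding feedback are assumptions made for this sketch, not the project's actual code or architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGPT(nn.Module):
    """A toy causal transformer that consumes embeddings directly,
    so the model can be re-applied to its own soft output."""
    def __init__(self, vocab=256, dim=64, heads=4, layers=2, ctx=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.pos = nn.Embedding(ctx, dim)
        block = nn.TransformerEncoderLayer(dim, heads, 4 * dim,
                                           batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(block, layers)
        self.head = nn.Linear(dim, vocab)

    def forward(self, x):  # x: (batch, seq, dim) embeddings
        t = x.size(1)
        x = x + self.pos(torch.arange(t, device=x.device))
        mask = nn.Transformer.generate_square_subsequent_mask(t).to(x.device)
        return self.head(self.blocks(x, mask=mask))  # (batch, seq, vocab)

def idempotence_loss(model, input_ids):
    # First pass: ordinary forward over hard token embeddings.
    logits1 = model(model.embed(input_ids))
    probs1 = logits1.softmax(dim=-1)
    # Second pass: re-apply the model to the soft embedding of its own output.
    logits2 = model(probs1 @ model.embed.weight)
    # Idempotence penalty: KL divergence between the detached first-pass
    # distribution and the second-pass prediction, so f(f(x)) is pulled
    # toward f(x).
    return F.kl_div(logits2.log_softmax(dim=-1), probs1.detach(),
                    reduction="batchmean")

ids = torch.randint(0, 256, (2, 16))
print(idempotence_loss(TinyGPT(), ids))

For comparison, the image-domain formulation of Shocher et al. (2023) pairs the idempotence term with reconstruction and tightening terms; the sketch above keeps only the idempotence term, detaching the first-pass distribution so that only the second application is adjusted.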
Cite this work
@misc{erkenova2025idempotent,
  title={Idempotent GPTs may actually provide robustness by design},
  author={Jamilya Erkenova and Sergei Kudriashov},
  date={2025-07-28},
  organization={Apart Research},
  note={Research submission to the research sprint hosted by Apart.},
  howpublished={https://apartresearch.com}
}