Posts by Collection

Portfolio

Publications

Transformers are Provably Optimal In-context Estimators for Wireless Communications

Published in AISTATS, 2025

This paper introduces the concept of in-context estimation (ICE), where pre-trained transformers adapt to new tasks by leveraging limited prompts without explicit optimization. It proves that single-layer softmax attention transformers (SATs) can optimally solve ICE problems for a subclass of cases and demonstrates that multi-layer transformers efficiently handle broader ICE problems, outperforming standard approaches. The study highlights that transformers achieve near-optimal performance with minimal context examples, rivaling estimators with perfect knowledge of the latent context.
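The core idea can be illustrated with a minimal sketch (not the paper's construction): a softmax-attention read-out over a window of context examples behaves like a kernel-style estimator of the query label. The toy linear model `y = h*x + noise`, the Gaussian-kernel scores, and all names below are illustrative assumptions, standing in for the channel-estimation setting studied in the paper.

```python
import numpy as np

# Illustrative assumption: the latent context is a scalar gain h, and the
# transformer sees noisy (x_i, y_i) pairs drawn from y = h*x + noise.
rng = np.random.default_rng(0)

h = 0.8                                    # latent context (e.g., channel gain)
n_ctx = 64                                 # number of in-context (pilot) pairs
x = rng.normal(size=n_ctx)                 # context inputs
y = h * x + 0.05 * rng.normal(size=n_ctx)  # noisy context observations


def attention_estimate(x_q, x_ctx, y_ctx, bandwidth=0.2):
    """Softmax-weighted average of context labels (Gaussian-kernel scores)."""
    scores = -((x_q - x_ctx) ** 2) / (2.0 * bandwidth**2)
    w = np.exp(scores - scores.max())      # numerically stable softmax
    w /= w.sum()
    return float(w @ y_ctx)               # attention read-out = label estimate


x_q = 1.0                                  # query input; true target is h * x_q
y_hat = attention_estimate(x_q, x, y)      # approaches h * x_q as context grows
```

As the number of context examples grows, the estimate concentrates around the value an estimator with perfect knowledge of `h` would produce, which is the near-optimality phenomenon the paper analyzes for softmax attention.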

Recommended citation: Vishnu Teja Kunde, Vicram Rajagopalan, Chandra Shekhara Kaushik Valmeekam, Krishna Narayanan, Srinivas Shakkottai, Dileep Kalathil, and Jean-Francois Chamberland. "Transformers are Provably Optimal In-context Estimators for Wireless Communications." arXiv preprint arXiv:2311.00226, 2025.

Talks

Teaching
