STBLLM: Breaking the 1-Bit Barrier with Structured Binary LLMs
Published in The Thirteenth International Conference on Learning Representations (ICLR), 2025
STBLLM introduces structured binary large language models that push weight quantization beyond the 1-bit barrier. By combining structured binary weight formats with tailored training techniques, STBLLM maintains competitive accuracy while substantially reducing memory footprint and computational cost.
Recommended citation: Peng Dong, Lin Li, Yuke Zhong, Dazhen Du, **Ruibo Fan**, Yuxin Chen, et al., "STBLLM: Breaking the 1-Bit Barrier with Structured Binary LLMs," in *Proceedings of the 13th International Conference on Learning Representations (ICLR)*, 2025.
Download Paper | Code | Download BibTeX
