2025

[ArXiv-2503] QArtSR: Quantization via Reverse-Module and Timestep-Retraining in One-Step Diffusion based Image Super-Resolution
Libo Zhu, Haotong Qin, Kaicheng Yang, Wenbo Li, Yong Guo, and 3 more authors
Under Review, Mar 2025
arXiv | PDF | Code

[ICML-2502] BiMaCoSR: Binary One-Step Diffusion Model Leveraging Flexible Matrix Compression for Real Super-Resolution
Kai Liu†, Kaicheng Yang†, Zheng Chen, Zhiteng Li, Yong Guo, and 3 more authors
In ICML, Feb 2025
arXiv | PDF | Code

[ArXiv-2509] RobuQ: Pushing DiTs to W1.58A2 via Robust Activation Quantization
Kaicheng Yang†, Xun Zhang†, Haotong Qin, Yucheng Lin, Kaisen Yang, and 1 more author
Under Review, Sep 2025
arXiv | PDF | Code

[ArXiv-2509] Explore-Execute Chain: Towards an Efficient Structured Reasoning Paradigm
Kaisen Yang†, Lixuan He†, Rushi Shah, Kaicheng Yang, Qinwei Ma, and 2 more authors
Under Review, Sep 2025
arXiv | PDF | Code

[ArXiv-2510] PT^2-LLM: Post-Training Ternarization for Large Language Models
Xianglong Yan†, Chengzhu Bao†, Zhiteng Li, Tianao Zhang, Kaicheng Yang, and 4 more authors
Under Review, Oct 2025
arXiv | PDF | Code

[ArXiv-2512] TreeQ: Pushing the Quantization Boundary of Diffusion Transformer via Tree-Structured Mixed-Precision Search
Kaicheng Yang, Kaisen Yang, Baiting Wu, Xun Zhang, Qianrui Yang, and 3 more authors
Under Review, Dec 2025
arXiv | PDF | Code