PUBLICATIONS
You can also find my articles on my Google Scholar profile.
An asterisk (*) denotes equal contribution.
Preprints
Kangqiao Liu and Jie Gu,
Dynamical activity universally bounds precision of response,
[arXiv:2410.20800] (2024.10.28).

Kangqiao Liu, Masaya Nakagawa, and Masahito Ueda,
Maxwell’s Demon for Quantum Transport,
[arXiv:2303.08326] (2023.03.15).
Peer-reviewed
Jiale Yu*, Shiyu Wang*, Kangqiao Liu, Chen Zha, Yulin Wu, Fusheng Chen, Yangsen Ye, Shaowei Li, Qingling Zhu, Shaojun Guo, Haoran Qian, He-Liang Huang, Youwei Zhao, Chong Ying, Daojin Fan, Dachao Wu, Hong Su, Hui Deng, Hao Rong, Kaili Zhang, Sirui Cao, Jin Lin, Yu Xu, Cheng Guo, Na Li, Futian Liang, Yong-Heng Huo, Chao-Yang Lu, Cheng-Zhi Peng, Kae Nemoto, W. J. Munro, Xiaobo Zhu, Jian-Wei Pan, and Ming Gong,
Experimental Demonstration of a Maxwell’s Demon Quantum Battery in a Superconducting NISQ Processor,
Physical Review A 109, 062614 (2024) (2024.06.20).

Takashi Mori, Liu Ziyin, Kangqiao Liu, and Masahito Ueda,
Power-law escape rate of SGD,
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:15959-15975, 2022 (ICML 2022) (2022.07.15)
selected for Spotlight.
[arXiv:2105.09557].

Liu Ziyin*, Kangqiao Liu*, Takashi Mori, and Masahito Ueda,
Strength of Minibatch Noise in SGD,
The 10th International Conference on Learning Representations (ICLR 2022) (2022.01.29)
selected for Spotlight (top 5% of all submissions).
[arXiv:2102.05375].

Kangqiao Liu*, Liu Ziyin*, and Masahito Ueda,
Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent,
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:7045-7056, 2021 (ICML 2021) (2021.07.01)
[arXiv:2012.03636].

Kangqiao Liu, Zongping Gong, and Masahito Ueda,
Thermodynamic Uncertainty Relation for Arbitrary Initial States,
Physical Review Letters 125, 140602 (2020) (2020.09.29)
[arXiv:1912.11797].