PUBLICATIONS
You can also find my articles on my Google Scholar profile.
An asterisk (*) denotes equal contribution.
Preprints
- Kangqiao Liu, Masaya Nakagawa, and Masahito Ueda,
Maxwell’s Demon for Quantum Transport,
[arXiv:2303.08326].
Peer-reviewed
- Takashi Mori, Liu Ziyin, Kangqiao Liu, and Masahito Ueda,
Power-law escape rate of SGD,
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:15959-15975, 2022 (ICML 2022)
[arXiv:2105.09557].
- Liu Ziyin*, Kangqiao Liu*, Takashi Mori, and Masahito Ueda,
Strength of Minibatch Noise in SGD,
The 10th International Conference on Learning Representations (ICLR 2022)
selected for Spotlight (5% of all submissions)
[arXiv:2102.05375].
- Kangqiao Liu*, Liu Ziyin*, and Masahito Ueda,
Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent,
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:7045-7056, 2021 (ICML 2021)
[arXiv:2012.03636].
- Kangqiao Liu, Zongping Gong, and Masahito Ueda,
Thermodynamic Uncertainty Relation for Arbitrary Initial States,
Physical Review Letters 125, 140602 (2020)
[arXiv:1912.11797].