Stochastic Gradient Descent and Anomaly of Variance-Flatness Relation in Artificial Neural Networks
Abstract
Stochastic gradient descent (SGD), a widely used algorithm in deep-learning neural networks, has attracted continuing research interest for the theoretical principles behind its success. A recent work reported an anomalous (inverse) relation between the variance of neural weights and the flatness of the loss-function landscape under SGD [Feng Y and Tu Y, Proc. Natl. Acad. Sci. USA 118, e2015617118 (2021)]. To investigate this seeming violation of statistical-physics principles, we analyze the properties of SGD near fixed points with a dynamic decomposition method. Our approach recovers the true "energy" function under which the universal Boltzmann distribution holds. In general it differs from the cost function, which resolves the paradox raised by the anomaly. The study bridges the gap between classical statistical mechanics and the emerging discipline of artificial intelligence, with potential for better algorithms in the latter.
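The inverse variance-flatness relation at issue can be reproduced in a toy simulation. The sketch below is illustrative only, not the paper's decomposition method: it runs SGD near a quadratic minimum and assumes, purely as a hypothetical noise model, that the mini-batch gradient-noise covariance scales as the square of the Hessian. All parameters (curvatures, learning rate, step count) are made up for the example.

# Minimal toy sketch (assumed parameters, not the paper's method):
# simulate SGD near a quadratic loss minimum and compare the weight
# variance with the landscape flatness along each Hessian eigen-direction.
import numpy as np

rng = np.random.default_rng(0)

# Hessian at the minimum: one "sharp" and one "flat" eigen-direction.
H = np.diag([10.0, 0.1])

# Hypothetical gradient-noise covariance; assuming D ~ H^2 is one
# scenario in which variance and flatness become inversely related.
D = H @ H
L_noise = np.linalg.cholesky(D)  # to draw correlated gradient noise

eta = 1e-3            # learning rate
steps = 200_000
w = np.zeros(2)
samples = []

for t in range(steps):
    xi = L_noise @ rng.standard_normal(2)
    w -= eta * (H @ w + xi)      # stochastic gradient step near the minimum
    if t >= steps // 2:          # discard the transient, then sample
        samples.append(w.copy())

samples = np.array(samples)
var = samples.var(axis=0)        # weight variance per eigen-direction
flatness = 1.0 / np.diag(H)      # flatter direction = smaller curvature

for i in range(2):
    print(f"direction {i}: flatness={flatness[i]:.2f}, variance={var[i]:.2e}")

For this discretized dynamics the stationary variance along eigen-direction i is approximately eta * D_ii / (2 * H_ii); with the assumed D ~ H^2 this gives variance ~ H_ii, the inverse of the flatness 1/H_ii. The flat direction thus shows the smaller weight variance, opposite to the equipartition-style expectation, which is the anomaly the paper addresses.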
Article Text
About This Article
Cite this article:
Xia Xiong, Yong-Cong Chen, Chunxiao Shi, Ping Ao. Stochastic Gradient Descent and Anomaly of Variance-Flatness Relation in Artificial Neural Networks[J]. Chin. Phys. Lett., 2023, 40(8): 080202. DOI: 10.1088/0256-307X/40/8/080202