Suboptimal Local Minima Exist for Wide Neural Networks with Smooth Activations

Publication type:
Article; Early Access
Authors:
Ding, Tian; Li, Dawei; Sun, Ruoyu
Affiliations:
Chinese University of Hong Kong; University of Illinois System; University of Illinois Urbana-Champaign
Journal:
MATHEMATICS OF OPERATIONS RESEARCH
ISSN/ISBN:
0364-765X
DOI:
10.1287/moor.2021.1228
Publication date:
2022
Keywords:
optimality
Abstract:
Does a large width eliminate all suboptimal local minima for neural networks? An affirmative answer was given by a classic result published in 1995 for one-hidden-layer wide neural networks with a sigmoid activation function, but this result has not been extended to the multilayer case. Recently, it was shown that, with piecewise linear activations, suboptimal local minima exist even for wide networks. Given the classic positive result for a smooth activation and the negative result for nonsmooth activations, an interesting open question is: does a large width eliminate all suboptimal local minima for deep neural networks with smooth activations? In this paper, we give a largely negative answer to this question. Specifically, we prove that, for neural networks with generic input data and smooth nonlinear activation functions, suboptimal local minima can exist no matter how wide the network is (as long as the last hidden layer has at least two neurons). Therefore, the classic result that wide one-hidden-layer networks have no suboptimal local minima does not extend to this broader setting. Whereas the classic result assumes a sigmoid activation, our counterexample covers a large set of activation functions (dense in the set of continuous functions), indicating that the limitation is not an artifact of a specific activation. Together with recent progress on piecewise linear activations, our result indicates that suboptimal local minima are common for wide neural networks.
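The sketch below is not taken from the paper; it is only a minimal illustration of the setting the abstract describes: a wide one-hidden-layer network with a smooth activation (tanh is assumed here), generic random input data, and plain gradient descent run from several random initializations. Comparing the loss values at which different runs plateau is a crude empirical probe for distinct local minima; it does not reproduce the paper's construction, and heavily overparameterized runs may all reach near-zero loss. The dataset size, width, learning rate, and iteration count are arbitrary choices made for illustration.

```python
# Rough empirical probe (not the paper's construction): train a wide
# one-hidden-layer tanh network on a tiny fixed dataset from several
# random initializations and compare the final losses. Different plateau
# values would be consistent with -- though not proof of -- distinct
# local minima.
import numpy as np

rng_data = np.random.default_rng(0)
n, d, width = 8, 3, 50                     # samples, input dim, hidden width
X = rng_data.standard_normal((n, d))       # generic input data
y = rng_data.standard_normal(n)            # arbitrary targets

def loss_and_grads(W, v, b):
    """Mean-squared loss of f(x) = v^T tanh(W x + b) and its gradients."""
    H = np.tanh(X @ W.T + b)               # (n, width) hidden activations
    pred = H @ v                            # (n,) network outputs
    r = pred - y                            # residuals
    loss = 0.5 * np.mean(r ** 2)
    dpred = r / n                           # dL/dpred
    dv = H.T @ dpred
    dH = np.outer(dpred, v) * (1.0 - H ** 2)   # back through tanh
    dW = dH.T @ X
    db = dH.sum(axis=0)
    return loss, dW, dv, db

for seed in range(5):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((width, d))
    v = rng.standard_normal(width)
    b = rng.standard_normal(width)
    lr = 0.1
    for _ in range(20000):                  # plain gradient descent
        loss, dW, dv, db = loss_and_grads(W, v, b)
        W -= lr * dW
        v -= lr * dv
        b -= lr * db
    print(f"seed {seed}: final loss {loss:.6f}")
```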