In this paper, we enhance the robustness and accuracy of deep neural networks (DNNs) by introducing the L2,∞ normalization of the weight matrices of a DNN with ReLU as the activation
function. It is proved that the L2,∞ normalization leads to large dihedral angles between
adjacent faces of the graph of the DNN function and hence to smoother DNN functions, which reduces
over-fitting of the DNN. A measure is proposed for the robustness of a classification DNN:
the volume of the union of the maximal robust spheres centered at the sample points. A
lower bound for this robustness measure in terms of the L2,∞ norm is given. Finally, an upper
bound for the Rademacher complexity of DNNs with L2,∞ normalization is given. An algorithm
is given to train a DNN with the L2,∞ normalization, and experimental results show
that the L2,∞ normalization is effective in improving the robustness and accuracy of the DNN.
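As a concrete illustration, the following is a minimal PyTorch sketch of how such a normalization step could be applied during training, assuming the L2,∞ norm of a weight matrix is the maximum of the L2 norms of its rows; the function names l2_inf_project_ and normalize_model_ and the unit norm bound are illustrative choices, not the paper's exact algorithm.

```python
import torch
import torch.nn as nn

def l2_inf_project_(weight: torch.Tensor, max_norm: float = 1.0) -> None:
    """Rescale each row of `weight` so its L2 norm is at most `max_norm`,
    i.e. project onto the set {W : ||W||_{2,inf} <= max_norm}, where
    ||W||_{2,inf} is assumed to be the maximum row-wise L2 norm."""
    with torch.no_grad():
        row_norms = weight.norm(dim=1, keepdim=True).clamp_min(1e-12)
        # Rows whose norm is already <= max_norm are left unchanged.
        weight.mul_((max_norm / row_norms).clamp_max(1.0))

def normalize_model_(model: nn.Module, max_norm: float = 1.0) -> None:
    """Apply the L2,inf projection to every linear layer's weight matrix."""
    for module in model.modules():
        if isinstance(module, nn.Linear):
            l2_inf_project_(module.weight, max_norm)
```

In a training loop, normalize_model_(model) would be called after each optimizer step, so the weight matrices remain inside the L2,∞ ball throughout training.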