Improve the Robustness and Accuracy of Deep Neural Network with L_2,∞ Normalization

10/10/2020
by Lijia Yu, et al.

In this paper, the robustness and accuracy of deep neural networks (DNNs) with ReLU activation are enhanced by introducing the L_2,∞ normalization of the DNN's weight matrices. It is proved that the L_2,∞ normalization leads to large dihedral angles between adjacent faces of the polyhedron graph of the DNN function, and hence to a smoother DNN function, which reduces over-fitting. A robustness measure for classification DNNs is proposed: the average radius of the maximal robust spheres centered at the sample data. A lower bound for this robustness measure is given in terms of the L_2,∞ norm, and an upper bound for the Rademacher complexity of DNNs with L_2,∞ normalization is also given. Finally, an algorithm to train a DNN with L_2,∞ normalization is presented, and experimental results show that the L_2,∞ normalization is effective in improving both robustness and accuracy.
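As background for the abstract, the L_2,∞ norm of a weight matrix is the maximum L2 norm taken over its rows, and normalizing means rescaling the matrix so this norm does not exceed a chosen bound. The following is a minimal NumPy sketch of that operation; the function names and the bound parameter `c` are illustrative, not taken from the paper's implementation.

```python
import numpy as np

def l2inf_norm(W):
    """L_{2,inf} norm of a matrix: the largest L2 norm among its rows."""
    return float(np.max(np.linalg.norm(W, axis=1)))

def l2inf_normalize(W, c=1.0):
    """Rescale W so that its L_{2,inf} norm is at most c.

    If the norm is already within the bound, W is returned unchanged;
    otherwise every entry is scaled by c / ||W||_{2,inf}.
    """
    n = l2inf_norm(W)
    return W if n <= c else W * (c / n)

# Example: the first row [3, 4] has L2 norm 5, the second row [0, 1] has norm 1,
# so the L_{2,inf} norm is 5; after normalization with c=1 it becomes 1.
W = np.array([[3.0, 4.0],
              [0.0, 1.0]])
Wn = l2inf_normalize(W, c=1.0)
```

In training, such a projection would typically be applied to each layer's weight matrix after every gradient step, which is one common way to enforce a norm constraint.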
