Flexible, Non-parametric Modeling Using Regularized Neural Networks

12/18/2020
by Oskar Allerbo, et al.

Non-parametric regression methods, such as generalized additive models (GAMs), are able to capture complex data dependencies in a flexible, yet interpretable way. However, choosing the form of the additive components often requires non-trivial data exploration. Here, we propose an alternative to GAMs, PrAda-net, which uses a one-hidden-layer neural network trained with proximal gradient descent and the adaptive lasso. PrAda-net automatically adjusts the size and architecture of the neural network to capture the complexity and structure of the underlying data-generating model. The compact network obtained by PrAda-net can be translated into additive model components, making it suitable for non-parametric statistical modeling with automatic model selection. We demonstrate PrAda-net on simulated data, where we compare its test error performance, variable importance, and variable subset identification properties to those of other lasso-based approaches. We also apply PrAda-net to the massive U.K. black smoke data set to demonstrate its capability as an alternative to GAMs. In contrast to GAMs, which often require domain knowledge to select the functional forms of the additive components, PrAda-net requires no such pre-selection while still yielding interpretable additive components.
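The abstract describes the core training loop only at a high level. As a rough illustration (not the authors' implementation), the combination of proximal gradient descent with an adaptive lasso penalty on a one-hidden-layer network can be sketched as follows. The toy data, hyperparameters, and the use of the initial weight magnitudes as the adaptive-lasso pilot estimates are all illustrative assumptions; the paper's actual pilot estimator and pruning rules may differ.

```python
# Sketch of proximal gradient descent with an adaptive l1 penalty on the
# output weights of a one-hidden-layer network. Hidden units whose output
# weight is shrunk exactly to zero are effectively pruned, which is the
# mechanism by which an architecture is "selected".
import numpy as np

rng = np.random.default_rng(0)

# Illustrative additive toy data: y = sin(x1) + x2^2 + noise
n, d, h = 200, 2, 20                       # samples, inputs, hidden units
X = rng.uniform(-2, 2, (n, d))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

W1 = rng.standard_normal((d, h)) * 0.5     # input -> hidden weights
b1 = np.zeros(h)                           # hidden biases
w2 = rng.standard_normal(h) * 0.5          # hidden -> output weights

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ w2

def soft_threshold(w, t):
    """Proximal operator of the (elementwise-weighted) l1 penalty."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

lr, lam, eps = 0.05, 0.01, 1e-3
# Adaptive lasso penalizes each weight inversely to a pilot estimate of
# its magnitude. For simplicity we use the initial magnitudes here; a
# proper pilot would come from an unpenalized fit.
pilot = np.abs(w2) + eps

for step in range(2000):
    H, pred = forward(X)
    resid = pred - y
    # Gradients of the smooth squared-error loss
    g_w2 = H.T @ resid / n
    g_hidden = np.outer(resid, w2) * (1 - H ** 2)
    g_W1 = X.T @ g_hidden / n
    g_b1 = g_hidden.mean(axis=0)
    # Gradient step on the smooth part ...
    W1 -= lr * g_W1
    b1 -= lr * g_b1
    w2 -= lr * g_w2
    # ... followed by the proximal step for the adaptive l1 penalty.
    w2 = soft_threshold(w2, lr * lam / pilot)

active = int(np.sum(np.abs(w2) > 1e-8))
print(f"active hidden units: {active} / {h}")
```

Because each surviving hidden unit is a function of the inputs feeding into it, a sparse trained network of this form can be read off as a sum of interpretable component functions, which is the sense in which the paper relates the network to additive models.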
