Classification with neural networks with quadratic decision functions
Leon Frischauf, Otmar Scherzer, and Cong Shi
Abstract
Neural networks with quadratic decision functions have been introduced as alternatives to standard neural networks with affine linear ones. They are advantageous when the objects or classes to be identified are compact and of basic geometries, such as circles, ellipses, etc. In this paper, we investigate the use of such ansatz functions for classification. In particular, we test and compare the algorithm on the MNIST dataset for classification of handwritten digits and for classification of subspecies. We also show that the implementation can be based on the neural network structure in the software packages TensorFlow and Keras, respectively.
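To make the idea concrete, the following is a minimal sketch (not the chapter's reference code) of how a quadratic decision function can be realized as a custom Keras layer: each unit evaluates f_k(x) = x^T A_k x + b_k^T x + c_k instead of the usual affine-linear map. The layer name `QuadraticDense`, the layer sizes, and the small MNIST classifier around it are illustrative assumptions, not taken from the chapter.

```python
# Illustrative sketch of a quadratic decision-function layer in Keras.
# Each output unit k computes  f_k(x) = x^T A_k x + b_k^T x + c_k.
import tensorflow as tf


class QuadraticDense(tf.keras.layers.Layer):
    """Dense-like layer with one quadratic form per output unit (illustrative)."""

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        d = int(input_shape[-1])
        # One d x d matrix A_k per unit, plus the usual linear weights and bias.
        self.A = self.add_weight(shape=(self.units, d, d),
                                 initializer="glorot_uniform", name="A")
        self.b = self.add_weight(shape=(d, self.units),
                                 initializer="glorot_uniform", name="b")
        self.c = self.add_weight(shape=(self.units,),
                                 initializer="zeros", name="c")

    def call(self, x):
        # Quadratic term: for each unit k, x^T A_k x.
        quad = tf.einsum("bi,kij,bj->bk", x, self.A, x)
        # Affine-linear term: x b + c.
        lin = tf.matmul(x, self.b) + self.c
        return quad + lin


# Example (assumed setup): a small classifier for flattened 28 x 28 MNIST digits.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(784,)),
    QuadraticDense(64),
    tf.keras.layers.Activation("relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```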
Chapters in this book
- Frontmatter I
- Preface V
- Contents VII
- Part I: Mathematical aspects of data-driven methods in inverse problems
- On optimal regularization parameters via bilevel learning 1
- Learned regularization for inverse problems 39
- Inverse problems with learned forward operators 73
- Unsupervised approaches based on optimal transport and convex analysis for inverse problems in imaging 107
- Learned reconstruction methods for inverse problems: sample error estimates 163
- Statistical inverse learning problems with random observations 201
- General regularization in covariate shift adaptation 245
- Part II: Applications of data-driven methods in inverse problems
- Analysis of generalized iteratively regularized Landweber iterations driven by data 273
- Integration of model- and learning-based methods in image restoration 303
- Dynamic computerized tomography using inexact models and motion estimation 331
- Deep Bayesian inversion 359
- Utilizing uncertainty quantification variational autoencoders in inverse problems with applications in photoacoustic tomography 413
- Electrical impedance tomography: a fair comparative study on deep learning and analytic-based approaches 437
- Classification with neural networks with quadratic decision functions 471
- Index 495