
A central focus is the Lipschitz constant of the neural network, a key measure of its sensitivity to input perturbations. To estimate tight upper bounds, a scalable method based on semidefinite programming is proposed that leverages the layer-wise structure of general feedforward architectures, including convolutional and pooling layers.
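As a concrete illustration of the estimation step, the following is a minimal sketch in the spirit of LipSDP (Fazlyab et al., 2019), a closely related semidefinite formulation, for a hypothetical one-hidden-layer ReLU network. It is not the implementation from the thesis; the network sizes, random weights, and the use of CVXPY are assumptions made for this example.

```python
import numpy as np
import cvxpy as cp

# Hypothetical one-hidden-layer network f(x) = W1 @ relu(W0 @ x + b0) + b1.
rng = np.random.default_rng(0)
n, h, m = 4, 16, 2                        # input / hidden / output widths
W0 = rng.standard_normal((h, n)) / np.sqrt(n)
W1 = rng.standard_normal((m, h)) / np.sqrt(h)

# ReLU is slope-restricted on [alpha, beta] = [0, 1].
alpha, beta = 0.0, 1.0

T = cp.Variable(h, nonneg=True)           # diagonal multiplier for the activations
rho = cp.Variable(nonneg=True)            # rho = L**2, the squared Lipschitz bound
Td = cp.diag(T)

# LipSDP-style linear matrix inequality: M <= 0 (negative semidefinite)
# certifies that sqrt(rho) is an l2 Lipschitz bound for f.
M = cp.bmat([
    [-2 * alpha * beta * (W0.T @ Td @ W0) - rho * np.eye(n),
     (alpha + beta) * (W0.T @ Td)],
    [(alpha + beta) * (Td @ W0),
     -2 * Td + W1.T @ W1],
])

# Symmetrize explicitly so the semidefinite constraint is well posed.
problem = cp.Problem(cp.Minimize(rho), [(M + M.T) / 2 << 0])
problem.solve()
print("certified Lipschitz bound:", float(np.sqrt(rho.value)))
```

Minimizing rho subject to the linear matrix inequality yields the squared bound; the scalable variants the abstract refers to exploit the layer-wise structure so that such programs remain tractable for deeper networks.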
Beyond analysis, the thesis introduces training methods for neural networks with prescribed Lipschitz bounds by incorporating semidefinite constraints into the optimization problem. Efficient algorithms and network parameterizations are developed to ensure constraint satisfaction throughout training; a simplified illustration of training under a prescribed bound is sketched below. Finally, robust control techniques are employed to certify closed-loop stability for systems with neural components. In particular, dynamic integral quadratic constraints are used to describe nonlinear activation functions, enabling a less conservative stability analysis than existing approaches. Overall, this work bridges the gap between control theory and deep learning by providing scalable tools for safely integrating neural networks into feedback control and safety-critical applications.
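To make the idea of a prescribed Lipschitz bound concrete, the sketch below uses per-layer spectral normalization (Miyato et al., 2018), a simple and well-known baseline rather than the SDP-based parameterizations developed in the thesis; the bound L, the architecture, and the use of PyTorch are assumptions for this example.

```python
import torch
import torch.nn as nn

L = 2.0  # prescribed Lipschitz bound (an assumption for this sketch)

def lipschitz_mlp(sizes: list[int]) -> nn.Sequential:
    """MLP whose linear layers are spectrally normalized to operator norm ~1."""
    layers: list[nn.Module] = []
    for i in range(len(sizes) - 1):
        linear = nn.utils.parametrizations.spectral_norm(
            nn.Linear(sizes[i], sizes[i + 1])
        )
        layers.append(linear)
        if i < len(sizes) - 2:
            layers.append(nn.ReLU())  # ReLU is 1-Lipschitz
    return nn.Sequential(*layers)

net = lipschitz_mlp([4, 16, 16, 2])

def f(x: torch.Tensor) -> torch.Tensor:
    # Each normalized layer has spectral norm ~1 (a power-iteration estimate),
    # and ReLU is 1-Lipschitz, so the composition is ~1-Lipschitz; scaling the
    # output by L prescribes an overall Lipschitz bound of L.
    return L * net(x)

y = f(torch.randn(8, 4))
```

The per-layer product bound enforced here holds at every training step but is typically conservative for the composed network, which is precisely what motivates the tighter semidefinite constraints described above.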