• ECTS

    5 credits

  • Training structure

    Faculty of Science

Description

This course is a continuation of the optimization course from the second semester of L3.

After reviewing theoretical results and numerical methods for first- and second-order optimization problems, both unconstrained and subject to equality and inequality constraints, the course turns to topics of current interest in industrial optimization, in particular robust and multi-criteria optimization in the presence of uncertainty.

The course then illustrates the role of optimization in the main machine learning algorithms. These issues are illustrated with examples of classification and regression problems in supervised learning. These examples provide an opportunity to discuss metrics and procedures for evaluating learning, validation, and inference (cross-validation, overfitting, etc.).
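As a minimal sketch of the evaluation procedures mentioned above (an illustrative example with synthetic data, not course material), k-fold cross-validation with scikit-learn:

```python
# Illustrative sketch: 5-fold cross-validation on a synthetic
# classification problem (data and model choice are assumptions).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
model = LogisticRegression(max_iter=1000)

# Train on 4 folds, score on the held-out fold, 5 times in turn.
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean())  # mean accuracy over the 5 held-out folds
```

Comparing the mean held-out score with the training score is one simple way to detect overfitting.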

The course presents the different types of learning: unsupervised, supervised, transfer learning, reinforcement learning, incremental learning, etc.

Issues surrounding database management are addressed: generation, allocation, visualization, and segmentation.

The course presents the links between transfer learning and numerical simulation to address issues such as synthetic database generation, imputation, non-intrusive prediction, rapid inference, etc.

The course includes a substantial computing-project component with continuous assessment. All sessions take place in a computer lab, allowing immediate implementation of the theoretical concepts.


Objectives

Make the connection between numerical optimization and machine learning. Discover machine learning through concrete examples.


Teaching hours

  • Optimization - Lectures (CM): 21h
  • Optimization - Tutorials (TD): 21h

Mandatory prerequisites

Fundamentals of analysis, numerical solutions of ordinary differential equations, numerical linear algebra, programming experience in interpreted languages.


Recommended prerequisites: L3 semester 2 optimization course. Programming in Python.


Syllabus

-Optimization without constraints

-First-order methods: Gradient Descent, Conjugate Gradients

-Separable functions, Stochastic gradient, Coordinate descent

-Second-order methods: Newton, Quasi-Newton (BFGS, L-BFGS).

-Techniques for evaluating the gradient of a functional (finite differences, complex variables, adjoint, automatic differentiation).

-How to avoid computing the Hessian

-Optimization under equality and inequality constraints

-Lagrangian

-Interpretation of Lagrange multipliers

-Primal-Dual Problem

-Quadratic minimization under linear constraints

-Lagrangian saddle point

-Uzawa algorithm

-Comparison of penalty methods/Primal-Dual/Uzawa, augmented Lagrangian

-KKT conditions

-Complementarity conditions

-Projected gradient algorithm, projected Uzawa, external penalty

-Global optimization, momentum methods, ADAM, RMSprop

-Multi-criteria optimization, Pareto front

-Robust optimization over an interval and in the presence of uncertainty

-Regularization techniques (L1, L2, etc.)

-Illustrations in Python
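As a minimal Python illustration of the first-order methods listed above (an assumed example, not course material), gradient descent on a strictly convex quadratic, whose minimizer solves a linear system:

```python
import numpy as np

# Sketch: gradient descent on f(x) = 1/2 x^T A x - b^T x,
# whose unique minimizer is the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])

x = np.zeros(2)
step = 0.1  # fixed step size, small enough for convergence here
for _ in range(500):
    grad = A @ x - b   # gradient of f at x
    x = x - step * grad

print(x)  # close to np.linalg.solve(A, b)
```

For this quadratic, a Newton step with the exact Hessian A would reach the minimizer in one iteration, which is the contrast the second-order methods above exploit.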

 

-Optimization for machine learning: linear models, logistic regression, large-margin classifiers, decision trees and random forests, neural networks.

-Dimensionality reduction: principal component analysis, singular value decomposition

-Databases and imputation

-The Python scikit-learn library.
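As a hedged sketch of the dimensionality-reduction topic above (synthetic data, illustrative only), principal component analysis with scikit-learn on points scattered along a line:

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic 2-D data: points along the line y = 2x with small noise.
rng = np.random.default_rng(0)
t = rng.normal(size=200)
X = np.column_stack([t, 2.0 * t + 0.1 * rng.normal(size=200)])

# Project onto the first principal axis (internally computed via SVD).
pca = PCA(n_components=1)
Z = pca.fit_transform(X)

print(pca.explained_variance_ratio_)  # close to 1: one axis captures
                                      # almost all of the variance
```

The near-unit explained-variance ratio shows why a single component suffices for this data, the basic idea behind using PCA/SVD for dimensionality reduction.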


Additional information

Hourly volumes:

            CM (lectures): 21

            TD (tutorials): 21

            TP (practicals): 0

            Fieldwork: 0
