• ECTS: 5 credits

• Component: Faculty of Science

Description

This course is a continuation of the optimization course given in the second semester of L3.

After a review of theoretical results and numerical methods of first and second order for optimization problems, both unconstrained and subject to equality and inequality constraints, the course focuses on issues of current interest in industrial optimization, in particular robust and multi-criteria optimization in the presence of uncertainty.

The course then illustrates the role of optimization in the main machine learning algorithms, using examples of classification and regression problems from supervised learning. These examples provide an opportunity to discuss metrics and procedures for evaluating training, validation, and inference (k-fold cross-validation, overfitting, etc.).
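As a hedged illustration of these evaluation procedures, the sketch below runs 5-fold cross-validation with scikit-learn on a synthetic classification problem; the dataset and model are placeholders, not the course's own examples.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score, train_test_split

    # Synthetic binary classification problem (placeholder data).
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)

    # Hold out a test set to estimate generalization after model selection.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    model = LogisticRegression(max_iter=1000)

    # 5-fold cross-validation on the training set guards against
    # overfitting to a single train/validation split.
    scores = cross_val_score(model, X_train, y_train, cv=5)
    print("CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))

    # Final fit and held-out test accuracy.
    model.fit(X_train, y_train)
    print("Test accuracy: %.3f" % model.score(X_test, y_test))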

The course introduces the different classes of learning: unsupervised, supervised, transfer, reinforcement, incremental, etc.

Database management issues are addressed: generation, imputation, visualization, slicing.

The course introduces the links between transfer learning and numerical simulation to address issues such as synthetic database generation, imputation, non-intrusive prediction, rapid inference and more.

The course includes a substantial amount of programming project work. All sessions take place in a computer lab, so the theoretical material can be implemented immediately.

Objectives

Establish the link between numerical optimization and machine learning. Discover machine learning through concrete examples.

Necessary prerequisites

Basic analysis, numerical solution of ordinary differential equations, numerical linear algebra, and programming experience in an interpreted language.

Recommended prerequisites: the L3 semester 2 optimization course; Python programming.

Syllabus

Unconstrained optimization

- First-order methods: gradient descent, conjugate gradient (see the sketch after this list)

- Separable functions, stochastic gradient, coordinate descent

- Second-order methods: Newton, quasi-Newton (BFGS, L-BFGS)

- Techniques for evaluating the gradient of a functional (finite differences, complex-step, adjoint, automatic differentiation)

- How to do without the Hessian
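As a minimal sketch of the first-order and gradient-evaluation items above (illustrative, not the course's own material), the following combines fixed-step gradient descent with a central finite-difference gradient; the test function, step size, and tolerances are assumptions.

    import numpy as np

    def fd_gradient(f, x, h=1e-6):
        """Central finite-difference approximation of the gradient of f at x."""
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
        return g

    def gradient_descent(f, x0, step=0.1, tol=1e-8, max_iter=1000):
        """Fixed-step gradient descent; stops when the gradient is small."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = fd_gradient(f, x)
            if np.linalg.norm(g) < tol:
                break
            x = x - step * g
        return x

    # Convex quadratic test function with minimizer (1, -2).
    f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 2.0) ** 2
    print(gradient_descent(f, np.array([5.0, 5.0])))  # approx. [1, -2]

For the quasi-Newton items, scipy.optimize.minimize with method="BFGS" or method="L-BFGS-B" provides reference implementations to compare against.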

Optimization under equality and inequality constraints

- The Lagrangian

- Interpretation of Lagrange multipliers

- Primal and dual problems

- Quadratic minimization under linear constraints

- Saddle point of the Lagrangian

- Uzawa algorithm (see the sketch after this list)

- Comparison of penalization, primal-dual, and Uzawa methods; augmented Lagrangian

- KKT conditions

- Complementarity conditions

- Projected gradient algorithm, projected Uzawa, exterior penalization

- Global optimization, momentum methods, Adam, RMSprop
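The Uzawa algorithm admits a compact sketch for quadratic minimization under linear inequality constraints: an exact primal solve alternates with projected gradient ascent on the multipliers. This is a hedged illustration rather than the course's implementation; the toy problem and step size rho are assumptions.

    import numpy as np

    def uzawa(A, b, C, d, rho=0.1, tol=1e-8, max_iter=5000):
        """Uzawa iteration for: min 1/2 x^T A x - b^T x  s.t.  C x <= d."""
        lam = np.zeros(C.shape[0])
        for _ in range(max_iter):
            # Primal step: x minimizes the Lagrangian
            # L(x, lam) = 1/2 x^T A x - b^T x + lam^T (C x - d) exactly.
            x = np.linalg.solve(A, b - C.T @ lam)
            # Dual step: gradient ascent on lam, projected onto lam >= 0.
            lam_new = np.maximum(0.0, lam + rho * (C @ x - d))
            if np.linalg.norm(lam_new - lam) < tol:
                lam = lam_new
                break
            lam = lam_new
        return x, lam

    # Toy problem: project the point (2, 2) onto the half-plane x1 + x2 <= 1.
    A = np.eye(2)
    b = np.array([2.0, 2.0])
    C = np.array([[1.0, 1.0]])
    d = np.array([1.0])
    x, lam = uzawa(A, b, C, d)
    print(x, lam)  # approx. [0.5, 0.5] with multiplier approx. [1.5]

The step rho must be small enough relative to the smallest eigenvalue of A for the dual iteration to contract; the augmented Lagrangian listed above reduces this sensitivity.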

Multi-criteria optimization, Pareto front

- Robust optimization under interval uncertainty

- Aggregation techniques (L1, L2, ...)

- Illustrations in Python (see the sketch after this list)
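One such illustration, sketched under assumed placeholder objectives: sweeping the weight of an L1 (weighted-sum) aggregation of two convex criteria traces points of the Pareto front.

    import numpy as np
    from scipy.optimize import minimize_scalar

    # Two competing convex criteria of one design variable (placeholders).
    f1 = lambda x: (x - 1.0) ** 2
    f2 = lambda x: (x + 1.0) ** 2

    # Weighted-sum (L1) aggregation: each weight w yields one scalar
    # problem whose minimizer is a point of the (convex) Pareto front.
    for w in np.linspace(0.0, 1.0, 11):
        res = minimize_scalar(lambda x: w * f1(x) + (1.0 - w) * f2(x))
        print("w = %.1f: f1 = %.3f, f2 = %.3f" % (w, f1(res.x), f2(res.x)))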

- Optimization in machine learning: linear models, logistic regression, large-margin separators (support vector machines), decision trees and random forests, neural networks

- Dimensionality reduction: principal component analysis, singular value decomposition

- Databases and imputation

- The scikit-learn Python library (see the sketch after this list)
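Tying these items together, a minimal sketch (with assumed, illustrative hyperparameters, and scikit-learn's digits dataset standing in for the course's own examples) chains PCA-based dimensionality reduction with a logistic regression, whose fit is itself a convex optimization problem.

    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    # Digits dataset: 64 pixel features, 10 classes.
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # PCA (computed via singular value decomposition) reduces the input
    # to 20 components before the classifier is fitted.
    clf = make_pipeline(PCA(n_components=20), LogisticRegression(max_iter=2000))
    clf.fit(X_train, y_train)
    print("Test accuracy: %.3f" % clf.score(X_test, y_test))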

Further information

Hourly volumes:

- Lectures (CM): 21

- Tutorials (TD): 21

- Practicals (TP): 0

- Fieldwork: 0
