• ECTS

    5 credits

  • Component

    Faculty of Science

Description

This course is a continuation of the optimization course of the second semester of L3.

After reviewing the main results and numerical methods for first- and second-order optimization problems, both unconstrained and constrained by equalities and inequalities, the course turns to topics of current interest in industrial optimization, in particular robust and multi-criteria optimization in the presence of uncertainties.

The course then illustrates the role of optimization in the main machine learning algorithms. These questions are illustrated with examples of classification and regression problems in supervised learning. These examples are used to discuss metrics and procedures for the evaluation, validation, and inference of learned models (cross-validation, overfitting, etc.).
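As a concrete illustration of these validation procedures, here is a minimal sketch of k-fold cross-validation in plain Python. The toy 1-D dataset and the threshold "classifier" are illustrative placeholders, not material from the course; in practice one would use a library such as scikit-learn.

```python
# Sketch of k-fold cross-validation: split the data into k folds,
# train on k-1 folds, test on the held-out fold, average the scores.

def k_fold_indices(n, k):
    """Split range(n) into k contiguous folds of (near-)equal size."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(xs, ys, k=5):
    """Average test accuracy of a toy threshold classifier over k splits."""
    folds = k_fold_indices(len(xs), k)
    scores = []
    for i, test_idx in enumerate(folds):
        train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
        # "Training": pick the threshold as the mean of the training inputs.
        threshold = sum(xs[j] for j in train_idx) / len(train_idx)
        # "Testing": classify x as positive when above the threshold.
        correct = sum((xs[j] > threshold) == ys[j] for j in test_idx)
        scores.append(correct / len(test_idx))
    return sum(scores) / k

# Toy data: the label is 1 exactly when x > 5.
xs = list(range(10))
ys = [x > 5 for x in xs]
print(cross_validate(xs, ys, k=5))
```

Averaging over held-out folds gives a less optimistic estimate of generalization than training accuracy, which is the point the course makes about overfitting.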

The course presents the different classes of learning: unsupervised, supervised, transfer, reinforcement, incremental, etc.

Issues surrounding database management are addressed: generation, imputation, visualization, and slicing.
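As an illustration of imputation, here is a minimal mean-imputation sketch in plain Python; the small table and the `impute_mean` helper are hypothetical examples, not course material (scikit-learn's imputation tools would be used in practice).

```python
# Mean imputation: each missing entry (None) is replaced by the
# column mean computed over the observed values of that column.

def impute_mean(rows):
    """Return a copy of `rows` with None replaced by the column mean."""
    n_cols = len(rows[0])
    means = []
    for c in range(n_cols):
        observed = [r[c] for r in rows if r[c] is not None]
        means.append(sum(observed) / len(observed))
    return [[means[c] if r[c] is None else r[c] for c in range(n_cols)]
            for r in rows]

data = [[1.0, 2.0],
        [None, 4.0],
        [3.0, None]]
print(impute_mean(data))  # column means 2.0 and 3.0 fill the gaps
```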

The course presents the links between transfer learning and numerical simulation to address issues of synthetic database generation, imputation, non-intrusive prediction, rapid inference, etc.

A substantial part of the course consists of computer projects carried out along the way. All sessions take place in a computer environment, allowing immediate implementation of the theoretical material.


Objectives

Establish the link between numerical optimization and machine learning. Discover machine learning through concrete examples.


Necessary pre-requisites

Basics of analysis, numerical solution of ordinary differential equations, numerical linear algebra, programming experience in an interpreted language.


Recommended prerequisites: the L3 semester 2 optimization course; programming in Python.


Syllabus

-Unconstrained optimization

-First-order methods: gradient descent, conjugate gradient

-Separable functions, stochastic gradient, coordinate descent

-Second-order methods: Newton, quasi-Newton (BFGS, L-BFGS).

-Techniques for evaluating the gradient of a functional (finite differences, complex-step, adjoint, automatic differentiation).

-How to do without the Hessian

-Optimization under equality and inequality constraints

-Lagrangian

-Interpretation of Lagrange multipliers

-Primal-Dual Problem

-Quadratic minimization under linear constraints

-Lagrangian Saddle Point

-Uzawa's algorithm

-Comparison of penalization/Primal-Dual/Uzawa methods, augmented Lagrangian

-KKT conditions

-Complementarity conditions

-Projected gradient algorithm, projected Uzawa, external penalization

-Global optimization, momentum methods, Adam, RMSProp

-Multi-criteria optimization, Pareto front

-Robust optimization: interval-based, in the presence of uncertainties

-Regularization techniques (L1, L2, ...)

-Illustrations in Python
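The unconstrained first-order methods and gradient-evaluation techniques listed above can be sketched in a few lines of Python; the quadratic objective, the fixed step size, and the comparison with a central finite difference are illustrative choices, not prescribed by the course.

```python
# Gradient descent on the smooth quadratic f(x, y) = (x-1)^2 + 4(y+2)^2,
# whose unique minimizer is (1, -2).

def f(x, y):
    return (x - 1.0) ** 2 + 4.0 * (y + 2.0) ** 2

def grad_f(x, y):
    """Analytic gradient of f."""
    return 2.0 * (x - 1.0), 8.0 * (y + 2.0)

def fd_grad(func, x, y, h=1e-6):
    """Central finite-difference gradient, one of the gradient-evaluation
    techniques listed above."""
    return ((func(x + h, y) - func(x - h, y)) / (2.0 * h),
            (func(x, y + h) - func(x, y - h)) / (2.0 * h))

def gradient_descent(x, y, step=0.1, iters=200):
    """Fixed-step gradient descent from the starting point (x, y)."""
    for _ in range(iters):
        gx, gy = grad_f(x, y)
        x -= step * gx
        y -= step * gy
    return x, y

x_star, y_star = gradient_descent(0.0, 0.0)
print(x_star, y_star)                           # converges to (1, -2)
print(fd_grad(f, 0.0, 0.0), grad_f(0.0, 0.0))   # FD matches the analytic gradient
```

With a fixed step, convergence here relies on the step size being small enough for the curvature of f; adaptive or second-order methods (Newton, BFGS) remove that tuning burden.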


-Optimization in machine learning: linear models, logistic regression, large-margin separators (SVM), decision trees and random forests, neural networks.

-Dimensionality reduction: principal component analysis, singular value decomposition

-Databases and imputation

-Scikit-Learn Python library.
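To connect the optimization part of the syllabus to the learning models above, here is a minimal sketch of logistic regression fitted by full-batch gradient descent on the cross-entropy loss. The toy 1-D dataset, step size, and iteration count are illustrative; the course would use Scikit-Learn for real problems.

```python
import math

# Logistic regression p(y=1|x) = sigmoid(w*x + b), fitted by gradient
# descent on the average cross-entropy loss over a toy 1-D dataset.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, step=0.5, iters=2000):
    """Full-batch gradient descent on the cross-entropy loss."""
    w = b = 0.0
    n = len(xs)
    for _ in range(iters):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y   # dLoss/dz for cross-entropy
            gw += err * x / n
            gb += err / n
        w -= step * gw
        b -= step * gb
    return w, b

# Linearly separable toy data: label 1 exactly when x > 0.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
predictions = [sigmoid(w * x + b) > 0.5 for x in xs]
print(predictions)  # all six points classified correctly
```

The training loop is exactly the fixed-step gradient descent from the first part of the course, applied to a loss from the machine learning part.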


Additional information

Hourly volumes:

            CM (lectures): 21

            TD (tutorials): 21

            TP (practical sessions): 0

            Fieldwork: 0
