• ECTS: 2 credits

• Component: Faculty of Science

Description

Statistical modeling rests on two fundamental notions: information (extracted from the data) and decision (made on the basis of the data). This course introduces the theoretical formalization of these two notions. It is therefore placed at the beginning of the curriculum, and many later courses build on its notions and results.

Objectives

Introduce the two concepts at the heart of mathematical statistics: information (its quantification and coding) and decision (risk quantification and management).

Necessary prerequisites

Probability theory course.

Recommended prerequisites: a good command of probability calculus, differentiation, and integration.

Syllabus

Introduction

Random environment: the problem of reducing uncertainty and associated risks.

I - Information theory

  1. Entropy of a distribution

     a) Locating an element of a set equipped with a probability distribution; optimal coding and entropy of a discrete variable.
     b) Entropy of a continuous variable.
     c) Entropy of a vector.
     d) General properties of entropy: affine change of variables, independence.
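
For orientation, these quantities are commonly written as follows (conventions may differ slightly from those used in the lectures):

    H(X) = -\sum_x p(x)\,\log p(x)                          (entropy of a discrete variable)
    h(X) = -\int f(x)\,\log f(x)\,dx                        (differential entropy of a continuous variable with density f)
    h(aX + b) = h(X) + \log|a|, \quad a \neq 0              (effect of an affine change of variables, continuous case)
    H(X, Y) = H(X) + H(Y) \quad \text{if } X \text{ and } Y \text{ are independent}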

  2. Mutual information

     a) Mutual information and conditional entropy of two events.
     b) Mutual information and conditional entropy of two variables.
     c) Kullback-Leibler contrast (divergence); chi-squared approximation; application to the mutual information of two variables.
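
In the discrete case, the quantities of this item can be written (up to the conventions chosen in the lectures) as:

    I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)                   (mutual information and conditional entropy)
    K(P, Q) = \sum_x p(x)\,\log\frac{p(x)}{q(x)}                        (Kullback-Leibler contrast of P with respect to Q)
    I(X; Y) = K\big(P_{(X,Y)},\, P_X \otimes P_Y\big)                   (mutual information as a Kullback-Leibler contrast)
    K(P, Q) \approx \tfrac{1}{2}\sum_x \frac{(p(x) - q(x))^2}{q(x)}     (chi-squared approximation, valid when P is close to Q)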

  3. Outline of statistical applications

    a) Predictive variable selection (sketched below).
    b) Classification and regression trees (CART).
    c) Classification by segmentation trees.
    d) Distribution selection in a parametric model; the pseudo-true distribution.
    e) Optimal recoding of a variable: unsupervised and supervised cases.
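
As a minimal sketch of predictive variable selection, assuming discrete predictors and a discrete target, candidate variables can be ranked by their empirical mutual information with the target. The function name, variable names, and toy data below are illustrative only, not course material:

    import numpy as np

    def empirical_mutual_information(x, y):
        """Estimate I(X;Y) in nats from two discrete samples of equal length."""
        x = np.asarray(x)
        y = np.asarray(y)
        mi = 0.0
        for xv in np.unique(x):
            p_x = np.mean(x == xv)
            for yv in np.unique(y):
                p_y = np.mean(y == yv)
                p_xy = np.mean((x == xv) & (y == yv))
                if p_xy > 0:
                    mi += p_xy * np.log(p_xy / (p_x * p_y))
        return mi

    # Toy data: the first predictor carries information about y, the second is pure noise.
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=1000)
    x_informative = y ^ (rng.random(1000) < 0.1).astype(int)  # equals y about 90% of the time
    x_noise = rng.integers(0, 2, size=1000)                   # independent of y

    for name, x in [("informative", x_informative), ("noise", x_noise)]:
        print(name, round(empirical_mutual_information(x, y), 3))

The informative predictor receives a markedly larger score than the noise predictor, which is the ranking a mutual-information-based selection rule would use.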

II - Decision theory

  1. Background and issues.

    a) Random experiment, state of nature, decision, loss, pure/mixed decision rule, risk.
    b) Pre-order on decision rules.
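
For a loss function L and a decision rule \delta, these notions are usually formalized as:

    R(\theta, \delta) = E_\theta\big[L(\theta, \delta(X))\big]                                       (risk of the rule \delta at the state of nature \theta)
    \delta \preceq \delta' \iff R(\theta, \delta) \le R(\theta, \delta') \text{ for every } \theta   (pre-order on decision rules)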

  2. Some classic problems:

    a) Point estimation.
    b) Set estimation (confidence sets).
    c) Hypothesis testing.
    d) Diagnosis (classification).
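
Typical loss functions for these problems (one common choice per problem; the lectures may use others) are:

    L(\theta, d) = (d - g(\theta))^2                                   (quadratic loss, point estimation of g(\theta))
    L(\theta, C) = \mathbf{1}\{g(\theta) \notin C\}                    (non-coverage loss, set estimation)
    L(\theta, d) = \mathbf{1}\{d \text{ is the wrong conclusion}\}     (0-1 loss, testing and classification)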

  3. Pre-ordering of decision rules.

    a) General absence of an optimal rule: absence of an optimal estimator, absence of an optimal test.

     b) Admissible rules.
     c) Essentially complete class of rules; convexity theorem.
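
These notions can be stated as follows (standard definitions; the lectures may phrase them differently):

    \delta \text{ dominates } \delta' \iff R(\theta, \delta) \le R(\theta, \delta') \ \forall \theta, \text{ with strict inequality for some } \theta
    \delta \text{ is admissible} \iff \text{no rule dominates } \delta

A class of rules is essentially complete when every rule outside the class is matched or improved upon, uniformly in \theta, by some rule inside it.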

  4. Statistical principles & rule selection

     a) Principles for transforming partial pre-orders into total pre-orders: the minimax principle and the Bayes principle (formalized below).

    b) Selection principles: unbiased rules, rules based on information theory.
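
Both principles summarize the risk function \theta \mapsto R(\theta, \delta) by a single number and order rules accordingly:

    \text{minimax:} \quad \text{choose } \delta \text{ minimizing } \sup_\theta R(\theta, \delta)
    \text{Bayes (prior } \pi\text{):} \quad \text{choose } \delta \text{ minimizing } r(\pi, \delta) = \int R(\theta, \delta)\,\pi(\theta)\,d\theta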

  5. Further development of the Bayesian framework

    a) Prior density of the parameter; joint density of the parameter and the observations; posterior density of the parameter.

    b) Which priors? Conjugate priors; non-informative priors: the uniform prior and the Jeffreys prior.

    c) Bayes risk; Bayes rules: the Bayes estimator and the Bayes test; credible intervals.
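
For a prior density \pi and a model with densities f(x \mid \theta), these objects take the form (notation may differ from the lectures):

    \pi(\theta \mid x) = \frac{f(x \mid \theta)\,\pi(\theta)}{\int f(x \mid t)\,\pi(t)\,dt}     (posterior density of the parameter)
    r(\pi, \delta) = \int R(\theta, \delta)\,\pi(\theta)\,d\theta                               (Bayes risk; a Bayes rule minimizes it)
    \delta_\pi(x) = E[\theta \mid x]                                                            (Bayes estimator under quadratic loss)

A standard conjugate example: for n Bernoulli observations with a Beta(a, b) prior on the success probability, the posterior is Beta(a + \sum x_i,\, b + n - \sum x_i), and the Bayes estimator under quadratic loss is (a + \sum x_i)/(a + b + n).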

Further information

Hourly volumes:

            CM (lectures): 9

            TD (tutorials): 9

            TP (practicals):

            Terrain (fieldwork):
