
Master's degree Mathematics and computer science

National Master's diploma accredited by the French State
Level: Bac+4 to Bac+5 (M1, M2)
Field(s)
Sciences and engineering
Degree
Master's degree  
Mention
Computer Science  
Program
Mathematics and computer science  
How to apply
Initial training, Recognition of prior learning  
Course venue
Campus Marne la Vallée - Champs sur Marne, Bâtiment Copernic
Capacity
10  

Entry requirements

M1 in Mathematics or Computer Science, plus L2-level skills in the other discipline.

Benefits of the program

This Master's is unique in France, covering both mathematics and computer science, with requirements in both disciplines. It builds on the teaching team's extensive joint experience, developed within the prestigious Bézout Labex.

Acquired skills

Master's level in areas at the interface of mathematics and computer science: optimisation, analysis, geometry, combinatorics and machine learning.

Development of research skills: autonomy, personal work on specialised themes, literature review.

Advanced programming skills oriented towards applications in mathematics and computer science.

International

Bézout grants are awarded to a number of international students. English-speaking students are welcome.


Your future career

Students can pursue further studies with a PhD in mathematics or computer science.

 

Jobs in R&D in areas at the interface of the two disciplines, typically optimisation and machine learning.

Classes in machine learning, in particular, develop professional skills that are highly sought after in the private sector, making graduates immediately operational.


Study objectives

Students are trained to continue with a PhD in mathematics or computer science, in one of the many areas at the interface of the two disciplines. Emphasis is also placed on cutting-edge skills that are highly sought after in the private sector, such as machine learning and optimisation, so that students master these areas and can find employment straight after graduating.

Major thematics of study

Mathematics and computer science: discrete and continuous optimisation, algorithms and combinatorics, geometry, data science, large random matrices, random graphs.

Calendar

Foundation classes for one month, then ten weeks of core modules, then eight weeks of specialisation, and finally three to six weeks of work placement, starting from April.

Options

The options on offer may change from year to year to reflect the various areas of research, with the exception of “Data science”, which is always offered.

 

Data science: AI techniques, machine learning, applications and use of programming tools.

 

Algebraic combinatorics and computer algebra: operads, Hopf algebras.

 

Large random matrices and applications: theory and applications in signal processing and statistical testing.

 

Random graphs and graphons: theory of large dense graphs and applications.

Semester 3

Courses / ECTS / CM (lectures) / TD (tutorials) / TP (practical work)
Basics in mathematics

Basic courses in analysis, algebra, probability and geometry.

ECTS: 6, CM: 16h, TD: 24h
Basics in computer science

Basic courses in complexity, algorithmics, programming and graphs.

ECTS: 6, CM: 16h, TD: 24h
Discrete and continuous optimization

Discrete optimization: Min-max results in combinatorial optimization provide elegant mathematical statements, are often related to the existence of efficient algorithms, and illustrate well the power of duality in optimization. The course will rely on concrete examples taken from industry. Plan of the course: discrete optimization in bipartite graphs; chains and antichains in posets; chordal graphs; perfect graphs; the Lovász theta function. Continuous optimization: The course covers theoretical and algorithmic aspects of convex optimization in a finite-dimensional setting. Plan of the course: linear programming, the simplex algorithm, totally unimodular matrices, convex functions, semi-definite programming, convex programming, Karush-Kuhn-Tucker conditions, weak and strong duality, Farkas' lemma, sparse solutions via L1 penalization, the LASSO method. A short illustrative sketch of the LASSO idea follows this entry.

ECTS: 6, CM: 20h, TD: 30h
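The "sparse solutions via L1 penalization" topic above can be illustrated in a few lines. The following is a minimal sketch, not course material, assuming numpy and scikit-learn are available; the data, dimensions and penalty weight alpha are illustrative choices.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.standard_normal((n, p))
true_coef = np.zeros(p)
true_coef[:3] = [2.0, -1.5, 1.0]           # only three informative features
y = X @ true_coef + 0.1 * rng.standard_normal(n)

model = Lasso(alpha=0.1).fit(X, y)         # alpha is the weight of the L1 penalty
print("non-zero coefficients:", np.flatnonzero(model.coef_))

The L1 penalty drives most coefficients exactly to zero, which is the sparsity phenomenon studied in the course.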
Probabilistic algorithms and combinatorics

Probabilistic algorithms. This course is about randomized algorithms, which rely on randomness to speed up their running time. The methods: Las Vegas and Monte Carlo algorithms, complexity classes for randomized algorithms, lower bounds (Yao's minimax principle), and some probabilistic settings useful in algorithms: coupon collector, birthday problem, … Applications: analysis of the Quicksort and Quickselect algorithms, the stable marriage problem, probabilistic data structures (hash tables, skip lists, treaps, …), probabilistic counting, graph algorithms. A short illustrative Las Vegas sketch follows this entry. Combinatorics. The lectures on enumerative combinatorics will consist of the study of classical objects (permutations, trees, partitions, parking functions, …), classical sequences (factorial, Catalan, Schröder, …) and classical methods (bijections, group actions, induction, generating series). The lectures will be heavily based on the study of various examples, some very easy and others trickier.

ECTS: 6, CM: 24h, TD: 24h
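As an informal illustration of the Las Vegas paradigm mentioned above (the answer is always correct, only the running time is random), here is a minimal textbook-style Quickselect sketch in Python; it is not the course's own material.

import random

def quickselect(items, k):
    """Return the k-th smallest element (0-indexed) of items."""
    pivot = random.choice(items)               # random pivot: expected linear time
    smaller = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    larger  = [x for x in items if x > pivot]
    if k < len(smaller):
        return quickselect(smaller, k)
    if k < len(smaller) + len(equal):
        return pivot
    return quickselect(larger, k - len(smaller) - len(equal))

print(quickselect([7, 2, 9, 4, 1, 8], 2))      # -> 4, the third-smallest element

The random pivot gives expected linear running time, while the returned value is always exactly the k-th smallest element.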
Combinatorics

Study of the classical objects of combinatorics: permutations, trees, partitions, parking functions. Study of the main counting sequences: factorial, Catalan, Schröder. Introduction to the classical methods: bijections, group actions, induction, generating series. The course is based on the study of numerous examples.

 

ECTS: 3, CM: 12h, TD: 12h
Geometry

Algorithmics and combinatorics of embedded graphs. See the English course description.

 

ECTS: 3, CM: 12h, TD: 12h
Discrete geometry

Discrete differential geometry. Extension of curvature to discrete objects such as polyhedra or graphs. Constrained geometric flows that preserve a texture mapping during deformation. Program: discrete surface theory, topology and Gaussian curvature, the Gauss-Bonnet theorem; discrete differential calculus; discrete mean curvature on triangulated surfaces; parametrisation by lines of curvature, quadrangulation and applications to architecture; intrinsic curvature and applications to graph theory. Discrete isoperimetry. We extend the classical isoperimetric problem (which sets have maximal area among sets of given perimeter?) to graphs. We present the Brunn-Minkowski and Loomis-Whitney theorems, entropy, the Sauer-Shelah and Harper theorems, Boolean analysis on the hypercube, and Cheeger's theorem linking the spectral gap to the expansion properties of a graph, and we give algebraic and probabilistic constructions of families of expander graphs. A short spectral-gap sketch follows this entry.

ECTS: 6, CM: 24h, TD: 24h
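To give a concrete flavour of the spectral-gap/expansion link in Cheeger's theorem mentioned above, here is a minimal sketch assuming only numpy; the two example graphs (a cycle and a complete graph) are illustrative choices.

import numpy as np

def laplacian_gap(adjacency):
    """Second-smallest eigenvalue of the graph Laplacian L = D - A."""
    degrees = adjacency.sum(axis=1)
    laplacian = np.diag(degrees) - adjacency
    eigenvalues = np.linalg.eigvalsh(laplacian)   # sorted ascending
    return eigenvalues[1]                         # lambda_2, the spectral gap

n = 20
cycle = np.zeros((n, n))
for i in range(n):
    cycle[i, (i + 1) % n] = cycle[(i + 1) % n, i] = 1
complete = np.ones((n, n)) - np.eye(n)

print("cycle gap   :", laplacian_gap(cycle))      # small: weak expansion
print("complete gap:", laplacian_gap(complete))   # large: strong expansion

A larger second Laplacian eigenvalue indicates stronger expansion, which is what Cheeger's inequality quantifies.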
Mathematical foundations of data science

Presentation of the theoretical tools needed for data science. See the English course description.

 

ECTS: 3, CM: 12h, TD: 12h
Computer science foundations of data science

This course presents the computing tools used for data science. See the English course description.

 

ECTS: 3, CM: 12h, TD: 12h

Semester 4

Courses / ECTS / CM (lectures) / TD (tutorials) / TP (practical work)
Work placement (Stage)

ECTS: 18
Data Sciences

Machine learning. OBJECTIVES: understanding of the principal artificial intelligence algorithms: machine and deep learning. Introduction to optimization and stochastic approximation algorithms for learning. Building predictive methods on unstructured datasets such as text data. PROGRAM: Introduction to statistical learning: theoretical and empirical risk, the bias-variance trade-off, overfitting; Aggregation methods: random forests, bagging and boosting; Kernel methods and support vector machine algorithms; Convexification, regularization and penalization techniques: Lasso, ridge, elastic net; Deep learning algorithms: feedforward, convolutional and recurrent networks, dropout regularization; Prediction with unstructured text data: bag-of-words, word2vec; Introduction to reinforcement learning. A short illustrative sketch follows this entry.

ECTS: 6, CM: 16h, TD: 16h
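As a hedged illustration of two ingredients named above (bag-of-words features and random forests), here is a minimal sketch assuming scikit-learn; the toy texts and labels are invented for the example.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

texts  = ["great movie", "awful plot", "loved it", "boring and bad",
          "wonderful acting", "terrible pacing", "really enjoyable", "dull film"]
labels = [1, 0, 1, 0, 1, 0, 1, 0]           # 1 = positive, 0 = negative

X = CountVectorizer().fit_transform(texts)   # bag-of-words matrix
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

The held-out split gives an estimate of the risk discussed in the statistical-learning part of the course.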
Random graphs and graphons

The objective is to present the theory of large dense graphs in its analytic, probabilistic and combinatorial aspects. Program: introduction and reminders on finite graphs; definition of a graphon, the graphon as a generator of dense random graphs; properties of the space of graphons, the cut distance; convergence of large graphs to a graphon; sampled graphs; concentration and convergence inequalities; application: classical combinatorial inequalities; application: epidemics on a graphon, size-biased graphs; application: the degree function, the exponential model. A short sampling sketch follows this entry.

ECTS: 6, CM: 16h, TD: 40h
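The phrase "graphon as a generator of dense random graphs" above can be made concrete in a few lines. This is a minimal sketch assuming numpy; the graphon W(x, y) = (x + y)/2 is an arbitrary illustrative choice.

import numpy as np

def sample_from_graphon(W, n, rng=np.random.default_rng(0)):
    u = rng.uniform(size=n)                        # latent vertex labels U_i
    probs = W(u[:, None], u[None, :])              # edge probabilities W(U_i, U_j)
    upper = rng.uniform(size=(n, n)) < probs
    adjacency = np.triu(upper, k=1)                # keep i < j, no self-loops
    return (adjacency | adjacency.T).astype(int)   # symmetrize

W = lambda x, y: 0.5 * (x + y)                     # example graphon on [0, 1]^2
A = sample_from_graphon(W, 200)
print("empirical edge density:", A.sum() / (200 * 199))

Each vertex i receives a latent uniform label U_i, and the edge {i, j} is present independently with probability W(U_i, U_j); the empirical edge density then concentrates around the integral of W.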
Algebraic combinatorics and computer algebra

Operads in combinatorics: informally, an operad is a space of operations having one output and several inputs that can be composed. We present some elementary objects of algebraic combinatorics: combinatorial classes and algebras. We introduce (non-symmetric) operads and study tools for establishing presentations of operads by generators and relations. Koszul duality and generalizations: colored operads, symmetric operads, and PROs. We shall also explain how the theory of operads offers a tool for obtaining enumerative results. Algebraic combinatorics: study of classical symmetric functions and a discussion of representation theory, noncommutative symmetric functions (NCSF), the definition of Hopf algebras, the dual algebra of NCSF, quasi-symmetric functions, the modern generalizations of these algebras, and the use of all these algebraic properties (transition matrices, expressions in various bases, morphisms of Hopf algebras) to solve (classical) combinatorial questions.

ECTS: 6, CM: 16h, TD: 16h
Large random matrices and applications

The purpose of large Random Matrix Theory (RMT) is to describe the eigenstructure (eigenvectors and eigenvalues) of matrices whose entries are random variables and whose dimensions go to infinity. The first results go back to Wigner (1948) for random symmetric matrices and Marchenko and Pastur (1967) for large covariance matrices. Both results were motivated by questions in theoretical physics, which still provides open problems. In the eighties, Voiculescu used RMT as a tool to address open problems in operator theory. This point of view turned out to be extremely successful, as many such open problems were solved with the help of RMT; a whole theory known as free probability has been developed which relies tightly on RMT. In the nineties, RMT proved very successful in addressing problems in wireless communications, analysing the performance of multi-antenna telecommunication networks, and providing useful results in statistical signal processing. For the past 20 years the theory of large random matrices has remained very active, as can be seen from the publication of five major monographs on the subject. The goal of the course is to present the most classical and prominent results in the field together with some statistical applications: basic techniques in RMT and the Stieltjes transform; the Marchenko-Pastur theorem, which describes the limiting spectral measure of large covariance matrices; other models of interest: large covariance matrices with a general population matrix, signal-plus-noise matrices, etc.; small perturbations and spiked models. In terms of applications, we will describe statistical testing in high dimension and other statistical problems. A short numerical sketch follows this entry.

ECTS: 6, CM: 16h, TD: 16h
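As a small numerical companion to the Marchenko-Pastur theorem cited above, here is a minimal sketch assuming numpy; the dimensions n and p are illustrative.

import numpy as np

n, p = 2000, 500                                 # aspect ratio c = p / n = 0.25
rng = np.random.default_rng(0)
X = rng.standard_normal((n, p))
sample_cov = X.T @ X / n                         # p x p sample covariance matrix
eigenvalues = np.linalg.eigvalsh(sample_cov)     # sorted ascending

c = p / n
print("Marchenko-Pastur support:", ((1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2))
print("observed min/max eigenvalue:", eigenvalues[0], eigenvalues[-1])

With c = p/n fixed, the empirical eigenvalues spread over an interval close to [(1 - sqrt(c))^2, (1 + sqrt(c))^2] rather than concentrating at 1, which is the starting point for the high-dimensional statistical tests discussed in the course.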

Academic coordinators

CARAYOL Arnaud (M1-M2)
NICAUD Cyril (M2)

Academic secretary

VANTIEGHEM Nicolas (M2)
Phone number: 01 60 95 77 83
Building: Copernic
Office: 2B179

Continuing education administrator

LARANCE Charlène

Administrator for recognition of prior learning (VAE)

SOLTANI Amel
Partners

CERMICS