Participating Laboratories

  • LIP6 - UPMC / INRIA - Rocquencourt: Michèle Soria (Coordinator)
  • LIAFA - Paris Diderot / LIX - Ecole Polytechnique: Dominique Rossin
  • IGM - Paris Est / LIPN - Paris Nord: Frédérique Bassino


Project Members

LIAFA - Paris Diderot / LIX - Ecole Polytechnique
  • Dominique Rossin
  • Enrica Duchi
  • Jeremy Lovejoy
  • Roberto Mantaci
  • Anne Micheli
  • Thi Ha Duong Phan
  • Dominique Poulalhon
  • Olivier Mallet
  • Mathilde Bouvel
LIP6 - UPMC / INRIA - Rocquencourt
  • Michèle Soria
  • Olivier Bodini
  • Maryse Pelletier
  • Carine Pivoteau
  • Alexis Darrasse
  • Olivier Roussel

Post-doctoral researchers
  • Lucas Gerin
  • Yann Ponty
IGM - Paris-Est / LIPN - Paris-Nord / LRI - Paris-Sud
  • Sylvie Corteel
  • Francesca Fiorenzi
  • Philippe Nadeau
  • Matthieu Josuat-Verges

The central theme of the MAGNUM project is the elaboration of complex discrete models that are of broad applicability in several areas of computer science. A major motivation for the development of such models is the design and analysis of efficient algorithms dedicated to the simulation of large discrete systems and the random generation of large combinatorial structures. Another important motivation is to revisit average-case complexity theory in the light of realistic data models. The project proposes to develop the general theory of complex discrete models, devise new algorithms for random generation and simulation, and bridge the gap between theoretical analyses and practically meaningful data models.

The sophisticated methods developed over the past decades make it possible to enumerate and quantify parameters of a large variety of combinatorial models, including trees, graphs, words and languages, permutations, etc. However, these methods are mostly targeted at the analysis of uniform models, where, typically, all words (or graphs, or trees) are taken with equal likelihood. Accordingly, the existing average-case and probabilistic analyses of algorithms are essentially confined to a uniformity assumption that is often far from representative of the structure of the data present in real applications.

The MAGNUM project proposes to depart from this uniformity assumption and to develop new classes of models that bear a fair relevance to real-life data while remaining mathematically tractable. Such models are the ones most likely to lead to efficient algorithms and data structures.
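
As a concrete illustration of what "realistic yet tractable" can mean, consider non-uniform permutation models. The sketch below (a Python illustration of ours, not necessarily one of the models targeted by MAGNUM) contrasts the uniform model, sampled by Fisher-Yates shuffling, with the Ewens distribution, which weights a permutation by theta raised to its number of cycles and is sampled exactly by the Chinese-restaurant construction; theta = 1 recovers the uniform case.

    import random

    def uniform_permutation(n):
        # Uniform model: every permutation of {0, ..., n-1} is equally likely.
        p = list(range(n))
        random.shuffle(p)          # Fisher-Yates shuffle
        return p

    def ewens_permutation(n, theta):
        # Non-uniform model: a permutation sigma has weight theta**cycles(sigma).
        # Exact sampling via the Chinese-restaurant construction: element k joins
        # an existing cycle just after a uniformly chosen element (probability
        # k / (k + theta)) or opens a new cycle (probability theta / (k + theta)).
        succ = []                  # succ[i] = image of i under the permutation
        for k in range(n):
            r = random.random() * (k + theta)
            if r < k:
                j = int(r)         # insert k right after j in j's cycle
                succ.append(succ[j])
                succ[j] = k
            else:
                succ.append(k)     # k starts a new cycle (a fixed point for now)
        return succ

    def cycle_count(p):
        seen, count = [False] * len(p), 0
        for i in range(len(p)):
            if not seen[i]:
                count += 1
                while not seen[i]:
                    seen[i] = True
                    i = p[i]
        return count

    n = 1000
    print(cycle_count(uniform_permutation(n)))    # typically close to log(n)
    print(cycle_count(ewens_permutation(n, 20)))  # markedly more cycles when theta > 1

Both samplers run in linear time; the point is that the biased model remains as easy to simulate, and as amenable to analysis, as the uniform one.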

The MAGNUM project is decomposed into four tasks, all of which rely on sound existing mathematical methodologies that we intend to develop further and to interconnect. The project aims at modelling complex phenomena that appear in real applications. Its kernel is the random generation of complex combinatorial structures and the simulation of random discrete dynamical systems. A prerequisite for these tasks is the study of complex combinatorial models, and a consequence is the possibility of analysing algorithms on complex input data. The four tasks of the project deal with both models and applications.

  • Task 1 concentrates on examining complex models: the objective here is to elucidate the structure of objects (such as permutations or graphs) obeying various forms of local or global constraints, and to quantify their properties. This serves as the common "knowledge base" for the next three tasks, namely random generation, simulation, and the analysis of algorithms on realistic input data.
  • Task 2 is devoted to random generation with Boltzmann models; one objective is to enrich the existing Boltzmann framework and adapt it to non-uniform settings, another important goal being to design efficient and certified random generation algorithms (a minimal Boltzmann sampler is sketched at the end of this section).
  • Task 3 concerns discrete dynamical systems (cellular automata and sandpile models) in which the dynamics themselves are random; such models can be described as special Markov processes, and the development of high-performance simulation algorithms for them requires a deep investigation of the dynamics involved (a toy sandpile simulation is sketched at the end of this section).
  • The goal of Task 4 is to revisit the analysis of some of the most fundamental algorithms, including divide-and-conquer methods, sorting and searching, and the determinisation and minimisation of automata. So far, existing analyses have been confined to simple models that are not realistic; our purpose in the MAGNUM project is precisely to extend these analyses to realistic data models.

Centered on complex discrete models, relying on analytical and probabilistic methods, and focusing on random generation and simulation, MAGNUM is both a merging and a natural follow-up of the two former ANR projects GAMMA and MASED.
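
To give a flavour of Task 2, here is a minimal free Boltzmann sampler (an illustrative sketch of ours, not one of the project's actual generators) for binary trees specified by T = Z + Z*T*T. A leaf is drawn with probability x/T(x), and otherwise an internal node with two independently sampled subtrees; taking the parameter x close to the singularity 1/2 of T(x) favours large outputs, and practical samplers add rejection to target a prescribed size range.

    import math
    import random
    import sys

    def boltzmann_binary_tree(x):
        # Free Boltzmann sampler for the class T = Z + Z*T*T (binary trees,
        # size = number of nodes), with 0 < x < 1/2.
        # Generating function: T(x) = (1 - sqrt(1 - 4*x**2)) / (2*x).
        T = (1.0 - math.sqrt(1.0 - 4.0 * x * x)) / (2.0 * x)
        def gen():
            if random.random() < x / T:   # draw a leaf with probability x / T(x)
                return "leaf"
            return (gen(), gen())         # otherwise an internal node with two subtrees
        return gen()

    def size(tree):
        return 1 if tree == "leaf" else 1 + size(tree[0]) + size(tree[1])

    sys.setrecursionlimit(100000)         # occasional large samples recurse deeply
    random.seed(0)
    print([size(boltzmann_binary_tree(0.49)) for _ in range(10)])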
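
In the same spirit, Task 3 can be illustrated by a toy simulation of the Abelian sandpile model on a finite grid (again a sketch of ours, far from the high-performance algorithms the project aims at): grains are dropped on random sites, and any site holding four or more grains topples, sending one grain to each of its neighbours; grains falling off the boundary are lost.

    import random

    def add_grain_and_relax(grid, n):
        # Drop one grain on a uniformly random site of the n x n grid, then
        # topple every unstable site (height >= 4) until the grid is stable.
        i, j = random.randrange(n), random.randrange(n)
        grid[i][j] += 1
        stack = [(i, j)]
        while stack:
            a, b = stack.pop()
            while grid[a][b] >= 4:
                grid[a][b] -= 4
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    na, nb = a + da, b + db
                    if 0 <= na < n and 0 <= nb < n:
                        grid[na][nb] += 1
                        if grid[na][nb] == 4:   # site just became unstable
                            stack.append((na, nb))

    random.seed(0)
    n = 30
    grid = [[0] * n for _ in range(n)]
    for _ in range(20000):                      # drive the pile towards its stationary regime
        add_grain_and_relax(grid, n)
    print(sum(sum(row) for row in grid))        # grains still on the grid

The relaxation order does not matter (the Abelian property), which is precisely the kind of structural fact that such simulations exploit.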