
Pôle 2 seminars: "Combinatorial optimization, algorithmics"



Fabio Furini and Florian Sikora

Mailing list.

You can subscribe to the mailing list by sending an email to:
florian . sikora @ dauphine . fr

Upcoming seminars.

Seminars usually take place on Mondays at 2pm.

Thursday, May 11, 2017 (note the unusual day), 2pm, A 305: Giorgio Lucarelli (INRIA and LIG):
A primal-dual approach for online scheduling with resource augmentation

A well-identified issue in algorithms and, in particular, in online computation is the weakness of the worst-case paradigm. Several more accurate models have been proposed to eliminate this weakness. Resource augmentation is one such model; it has been used successfully to provide theoretical evidence for several scheduling heuristics with good performance in practice. Under this model, the algorithm runs in a more powerful environment than that of the adversary. Several types of resource augmentation for scheduling problems have been proposed so far, including speed augmentation, machine augmentation and, more recently, rejection. In this context, we propose algorithms, based on mathematical programming and a primal-dual scheme, for online scheduling problems minimizing variations of the total weighted flow time objective.


Previous seminars.

Friday, April 28, 2017 (note the unusual day), 2:15pm, D120: Edouard Bonnet (Middlesex University, London):
Subexponential algorithms in non-sparse classes of graphs

Abstract:
Many optimisation problems, while they remain NP-hard on planar graphs, admit subexponential algorithms running in time $2^{O(\sqrt n)}$. The latter fact is due to a classical result of Lipton and Tarjan that planar graphs have balanced vertex-separators of order $\sqrt n$. This often allows a divide-and-conquer approach based on exhaustively guessing what an optimum solution does on this small separator. Actually, an idea of Demaine, Fomin, Hajiaghayi, and Thilikos called bidimensionality even yields algorithms running in time $2^{O(\sqrt k)} n^{O(1)}$, where $k$ is the size of the solution, for most problems, not only on planar graphs but also on other sparse classes of graphs such as graphs excluding a fixed minor.

In this talk, we move away from planar graphs and consider the next best thing in terms of structure: geometric graphs, such as intersection graphs or visibility graphs, and ask the same question of the existence of subexponential algorithms. We survey recent algorithmic results and ETH-based lower bounds (where ETH, the Exponential Time Hypothesis, is the assumption that 3-SAT has no subexponential algorithm, i.e., none running in time $2^{o(n)}$). Our focus will be on whether algorithms running in time $2^{O(\sqrt n)}$ or $n^{O(\sqrt k)}$ are possible or if, on the contrary, no $2^{o(n)}$-time or $n^{o(k)}$-time algorithm is likely to be obtained.
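The $2^{O(\sqrt n)}$ running time obtained from balanced separators follows from a standard recurrence; here is a back-of-the-envelope sketch of it (a textbook computation, not taken from the talk):

```latex
% Guess the trace of an optimum solution on a balanced separator of size
% O(\sqrt{n}) -- at most 2^{O(\sqrt{n})} choices -- then recurse on the
% two sides, each of size at most 2n/3:
T(n) \;\le\; 2^{O(\sqrt{n})} \cdot 2\, T(2n/3)
     \;\le\; 2^{O\left(\sum_{i \ge 0} \sqrt{(2/3)^i\, n}\right)}
     \;=\; 2^{O(\sqrt{n})},
% since \sum_{i \ge 0} (2/3)^{i/2} is a convergent geometric series.
```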

Thursday, April 20, 2017 (note the unusual day), 1:30pm, C110: Aurélie Lagoutte (G-SCOP):
From extended formulations of polytopes to the Clique-Stable Set Separation problem.

The concept of extended formulations of polytopes was introduced by Yannakakis. A polytope P has a small extended formulation if it is the projection of a polytope Q, lying in a higher-dimensional space, that has a small number of facets. Yannakakis studied in particular the stable set polytope, focusing on the case of perfect graphs, where it can be fully described by the clique constraints. He highlighted a lower bound stated as a communication complexity problem: Alice is given a clique K, Bob is given a stable set S, and they have to decide via a non-deterministic protocol whether K intersects S or not. A certificate for non-intersection is a bipartition of the vertices such that K is included in one side and S is included in the other side. The goal is to find a Clique-Stable set separator, i.e. a set F of certificates that contains every useful bipartition, with |F| = poly(|V(G)|). This is not possible in general [Goos15], but Yannakakis' question on perfect graphs is still open.

We use different techniques to prove that such a polynomial Clique-Stable set separator exists in restricted classes of graphs: probabilistic arguments for random graphs, VC-dimension for graphs where a fixed split graph H is forbidden, connections with the Erdos-Hajnal property for graphs with no induced path on k vertices and its complement, and a decomposition theorem for perfect graphs with no balanced skew partition, as well as other classes admitting a decomposition theorem.
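To make the object concrete, here is a small brute-force Python sketch (the example graph and the trivial separator family are my own illustration, not from the talk) that checks whether a family of bipartitions, each encoded by its left side A, is a Clique-Stable set separator:

```python
from itertools import combinations

def is_clique(g, vs):
    return all(v in g[u] for u, v in combinations(vs, 2))

def is_stable(g, vs):
    return not any(v in g[u] for u, v in combinations(vs, 2))

def is_cs_separator(g, cuts):
    """Check (by brute force) that for every clique K and stable set S with
    K and S disjoint, some cut A in `cuts` has K inside A and S outside A."""
    subsets = [set(c) for r in range(len(g) + 1) for c in combinations(g, r)]
    cliques = [s for s in subsets if is_clique(g, s)]
    stables = [s for s in subsets if is_stable(g, s)]
    return all(any(K <= A and not (S & A) for A in cuts)
               for K in cliques for S in stables if not K & S)

# 4-cycle; the family of all cliques is always a (possibly large) separator
g = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
all_cliques = [set(c) for r in range(5) for c in combinations(g, r)
               if is_clique(g, set(c))]
print(is_cs_separator(g, all_cliques))  # True
```

The whole point of the problem is of course to find families much smaller than the set of all cliques; this checker only illustrates the definition.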

Monday, April 3, 2017, 2pm, P 301: Valentin Garnero (INRIA COATI):
On augmenting matchings via bounded-length augmentations

In a graph, it is well known that a matching can be turned into a bigger matching by applying a so-called augmentation operation. By a classical result of Berge, repeatedly applying this operation necessarily ends with a maximum matching.
Nisse, Salch and Weber considered how this fact is affected when the length k of the augmented paths is restricted. In particular, they proved that, for k = 3, reaching a maximum matching from an initial matching via augmentations of (≤ k)-paths can be done in polynomial time for any graph, while the same problem is NP-complete for any k ≥ 5. The latter result remains true for bipartite graphs, while the complexity on trees remains open. Nisse, Salch and Weber nevertheless exhibited a polynomial-time algorithm solving this problem for any path.
In a recent work with Bensmail and Nisse, we made some progress towards understanding this problem for general trees. In particular, we exhibited polynomial-time algorithms for a few more tree classes, including bounded-degree trees, caterpillars, subdivided stars, and trees where the vertices of degree at least 3 are sufficiently far apart. We also obtained some negative results for modified versions of the problem.

During this talk, we will present the proofs of some of these results, and mention some remaining open questions.
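The augmentation operation itself is simple to state in code. The Python sketch below (my own illustration; the graphs and names are not from the talk) applies a single augmentation along a path with at most 3 edges, and shows on the path P6 how short augmentations can get stuck before a maximum matching is reached:

```python
def try_augment(adj, matching, k=3):
    """Apply one augmentation along a path with at most k edges (k in {1, 3}).
    `adj` maps each vertex to its set of neighbours; `matching` is a set of
    frozenset edges, modified in place.  Returns True iff a path was flipped."""
    exposed = {v for v in adj if not any(v in e for e in matching)}
    # length-1 augmentation: an edge joining two exposed vertices
    for u in exposed:
        for v in adj[u]:
            if v in exposed:
                matching.add(frozenset((u, v)))
                return True
    if k < 3:
        return False
    # length-3 augmentation: exposed u - v, matching edge {v, w}, w - exposed x
    for e in list(matching):
        for v, w in (tuple(e), tuple(e)[::-1]):
            for u in adj[v] & exposed:
                for x in adj[w] & exposed:
                    if u != x:
                        matching.remove(e)
                        matching.add(frozenset((u, v)))
                        matching.add(frozenset((w, x)))
                        return True
    return False

# P6 with matching {1,2}, {3,4}: not maximum (size 3 is possible),
# yet no augmentation of length at most 3 applies
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {4}}
m = {frozenset((1, 2)), frozenset((3, 4))}
print(try_augment(adj, m))  # False: only a length-5 augmentation would help
```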

Monday, March 27, 2017, 2pm, B203: Valia Mitsou (LIRIS, Lyon 1):
On the complexity of defective coloring

In defective coloring, we are given a graph G and two integers c, ∆*, and are asked whether we can c-color G so that the maximum degree of the subgraph induced by any color class is at most ∆*. We will present some algorithmic and complexity results on this problem in classes of graphs where proper coloring is easy: on the one hand, we will consider (sub)classes of perfect graphs, namely cographs and chordal graphs; on the other hand, we will talk about structural parameterizations, such as treewidth and feedback vertex set.
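Checking the definition is straightforward; a minimal Python sketch (the triangle example is my own, not from the talk):

```python
def is_defective_coloring(adj, color, delta_star):
    """True iff in the subgraph induced by each color class every vertex has
    degree at most delta_star (`adj` maps vertex -> set of neighbours)."""
    return all(sum(1 for u in adj[v] if color[u] == color[v]) <= delta_star
               for v in adj)

# Triangle: 2 colors suffice when delta_star = 1, whereas proper coloring
# (the case delta_star = 0) would need 3 colors
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
print(is_defective_coloring(triangle, {0: 0, 1: 0, 2: 1}, 1))  # True
print(is_defective_coloring(triangle, {0: 0, 1: 0, 2: 1}, 0))  # False
```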

Monday, March 20, 2017, 2pm, A707: Dimitri Watel (ENSIEE):
Taxi sharing with cost sharing: modeling, complexity, parameterized complexity, and practical resolution of the problem

(Joint work with Alain Faye)

In this talk we study a variant of the Dial-A-Ride problem (DARP). In the original problem, we seek to optimize the routes of vehicles that transport people from their respective origins to their respective destinations, while respecting time-window constraints and capacity constraints (number of seats in the vehicle). This model is commonly used to optimize routes for taxis. We focus on a variant of this problem in which customers share the cost of the trips (or parts of trips) they make with other customers. The objective is to reduce the cost of each customer by a factor of at least alpha.

The presentation is split into two parts: a theoretical study in which we determine the parameterized complexity with respect to the various natural parameters of the problem, and a second part in which an enumeration heuristic is presented, together with test results on real-world data.

Monday, March 13, 2017, 2pm, A707: Nicolas Gastineau (LAMSADE):
Packing coloring and subcubic graphs

An i-packing is a subset X_i of vertices such that any two vertices u and v of X_i satisfy d(u,v) > i (where d(u,v) is the distance between u and v). The packing chromatic number is the smallest integer k such that there exist k sets of vertices X_1, ..., X_k forming a partition of the vertex set of the graph, each X_i being an i-packing. In this talk, I will first present decision problems about the existence of i-packings and about the packing chromatic number, together with their computational complexity. Afterwards, I will present conjectures about graphs with bounded packing chromatic number and about subcubic graphs. These conjectures will be motivated by properties of the packing chromatic number.
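The i-packing condition can be verified directly from BFS distances; a short Python sketch of the definition (the path example is mine, not from the talk):

```python
from collections import deque

def bfs_dist(adj, src):
    """Unweighted shortest-path distances from src."""
    dist, q = {src: 0}, deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def is_i_packing(adj, xs, i):
    """True iff every two distinct vertices of xs are at distance > i."""
    xs = list(xs)
    for a, src in enumerate(xs):
        d = bfs_dist(adj, src)
        if any(d.get(t, float('inf')) <= i for t in xs[a + 1:]):
            return False
    return True

# Path 0-1-2-3: {0, 2} is a 1-packing (an independent set), not a 2-packing
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(is_i_packing(path, {0, 2}, 1))  # True
print(is_i_packing(path, {0, 2}, 2))  # False
```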

Monday, February 27, 2017, 2:15pm, D 102: Maximilien Danisch (Telecom ParisTech):
Towards real-world graph algorithmics

Real-world graphs (a.k.a. complex networks) are ubiquitous: the web, Facebook, Twitter, brain networks, protein interaction networks, etc. Studying these graphs and solving computational problems on them (say, maximum clique, partitioning or dense subgraph) has applications in many fields. I will first show that the structure of some real-world graphs is special. In particular, they are not adversarial, and some difficult problems (say, NP-complete or NP-hard problems) can be solved on some huge real-world graphs (say, 2G edges or more). I will then present two works along the lines of “understanding and leveraging the structure of real-world graphs in order to build better algorithms”: (i) enumerating all k-cliques and (ii) computing the density-friendly graph decomposition.

Monday, January 9, 2017, 2pm, A707: Zsolt Tuza (Hungarian Academy of Sciences, Budapest, and University of Pannonia, Veszprem): Scheduling, graph coloring, and approximation

Joint work with Gyorgy Dosa (University of Pannonia) and Hans Kellerer (University of Graz)

We consider problems of scheduling jobs on a given number of machines, with some further restrictions on the assignments. The objective is to minimize the makespan, i.e., to finish all jobs as soon as possible. Some variants are solvable in polynomial time via matching and network flow methods, while others are NP-hard. A natural subproblem leads to the chromatic number of graphs whose independence number is equal to 3. We prove that this problem is APX-complete, i.e. there exists a constant c > 0 such that no polynomial-time (1+c)-approximation algorithm exists unless P = NP (while the number of vertices is a trivial 3-approximation for the optimum).

Monday, December 12, 2016, 2pm, P 301: Hang Zhou (Max-Planck-Institut für Informatik): Correlation Clustering and Two-edge-connected Augmentation for Planar Graphs

In correlation clustering, the input is a graph with edge-weights, where every edge is labelled either + or − according to the similarity of its endpoints. The goal is to produce a partition of the vertices that disagrees with the edge labels as little as possible.

In two-edge-connected augmentation, the input is a graph with edge-weights and a subset R of edges of the graph. The goal is to produce a minimum weight subset S of edges of the graph, such that for every edge in R, its endpoints are two-edge-connected in R union S.

For planar graphs, we prove that correlation clustering reduces to two-edge-connected augmentation, and that both problems have a polynomial-time approximation scheme.

This is joint work with Philip Klein and Claire Mathieu.
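The correlation clustering objective is easy to state in code; here is a small unweighted Python sketch (the talk deals with the weighted version, and the triangle example is mine):

```python
def disagreements(edges, part):
    """Count label disagreements of a clustering: a '+' edge whose endpoints
    are split across clusters, or a '-' edge whose endpoints share a cluster.
    `edges` maps (u, v) pairs to '+'/'-'; `part` maps vertex -> cluster id."""
    return sum(1 for (u, v), sign in edges.items()
               if (sign == '+') != (part[u] == part[v]))

# Triangle with labels +, +, -: every clustering disagrees on >= 1 edge
edges = {(0, 1): '+', (1, 2): '+', (0, 2): '-'}
print(disagreements(edges, {0: 0, 1: 0, 2: 0}))  # 1: the '-' edge (0, 2)
print(disagreements(edges, {0: 0, 1: 0, 2: 1}))  # 1: the split '+' edge (1, 2)
```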

Monday, November 14, 2016, 2pm, P 303: Mathias Weller (LIRMM, Montpellier): An example of FPT in P from bioinformatics, and other gimmickry.

Parameterized complexity theory is not only for NP-hard problems: it makes a difference in designing practical algorithms even for problems that have polynomial-time algorithms. I will introduce some of the research on the subject, along with a current research project of mine motivated by bioinformatics. If time and audience motivation permit, I have one or two additional algorithmic curiosities that might be interesting.

Monday, November 7, 2016, 2pm, A707: Zacharie ALES (LIA, Avignon): Extraction and partitioning for the search for regularities: application to dialogue analysis.

A corpus of dialogues can be represented by a set of annotation arrays encoding the various utterances of the dialogues. We define a two-step methodology to identify the dialogue patterns frequently used in the corpus: extraction of recurrent motifs, then partitioning of these motifs into homogeneous classes constituting the regularities.

Two algorithms are developed to extract recurrent motifs: LPCA-DC and SABRE. The first is an adaptation of an existing dynamic programming algorithm, while the second stems from a formal modeling of the problem of extracting local alignments in a pair of annotation arrays.

The partitioning of recurrent motifs is carried out by various heuristics from the literature, as well as by two original formulations of the K-partitioning problem as integer linear programs. In a polyhedral study, we characterize facets associated with these formulations (in particular the 2-partition inequalities, the 2-chorded cycle inequalities and the generalized clique inequalities). These theoretical results allow the implementation of a branch-and-cut algorithm that solves the problem efficiently.

We develop the decision-support software VIESA, which implements these different methods and allows their evaluation in two experiments carried out by an expert psychologist. Regularities corresponding to dialogue strategies that manual extractions had not revealed are thus identified.

Friday, October 21, 2016, 2pm, A711: Pablo San Segundo (Centre for Automation and Robotics, Madrid):
Recent improvements in exact Maximum Clique search

We consider the well-known problem from graph theory of finding a maximum clique (a maximum set of pairwise adjacent vertices) in a given graph. The problem is NP-hard and also hard to approximate. In the last decade, the majority of efficient exact algorithms have been branch-and-bound solvers that use greedy coloring as a bounding function. In this talk we describe the main heuristics used by leading exact solvers. The last part of the talk will be dedicated to applications of maximum clique in practice.
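The bounding function mentioned above rests on a simple fact: a greedy colouring partitions the vertices into independent sets, and a clique can take at most one vertex from each, so the number of colours used upper-bounds the clique number. A toy Python sketch of this bound (real solvers combine it with careful vertex orderings; the example graph is mine):

```python
def greedy_coloring_bound(adj):
    """First-fit colouring in dictionary order; the number of colours used
    is an upper bound on the maximum clique size, usable for pruning."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return len(set(color.values()))

# Triangle with a pendant vertex: bound 3, which here equals the clique number
g = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(greedy_coloring_bound(g))  # 3
```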

Talks from 2015-2016.
Talks from 2014-2015.

For earlier talks, see this page.

