2019-05-24  Structural tractability of Max-CSPs: beyond tree decompositions (Clément Carbonnel) 
15:30 (E 323)  Max-CSP is a well-studied variant of the constraint satisfaction problem (CSP) where the goal is to find an assignment that maximises the number of satisfied constraints. Max-CSP is NP-hard, but many polynomial-time restrictions have been identified over the years; these are generally obtained by restricting either the constraint language (which constraints can be used) or the constraint structure (on which subsets of variables the constraints must be placed).
The complexity of Max-CSP under structural restrictions is poorly understood. The two most general structural properties known to ensure tractability of Max-CSPs, beta-acyclicity and bounded (incidence) MIM-width, are incomparable and lead to very different algorithms. In this talk we will use the classical framework of tree decompositions as a starting point to explain how these two properties are closely related to a particular hypergraph measure called "point-width". We will then present a unified dynamic-programming algorithm that solves Max-CSP in polynomial time whenever the constraint hypergraph has bounded point-width (modulo some additional technical considerations). Joint work with Miguel Romero and Stanislav Zivny.

2019-03-22  KRIMP: mining itemsets that compress [Vreeken et al. 11] (Bachir Belaid) 
15:30 (E 323)  One of the major problems in pattern mining is the explosion of the number of results. Tight constraints reveal only common knowledge, while loose constraints lead to an explosion in the number of returned patterns. This is caused by large groups of patterns essentially describing the same set of transactions. In this paper we approach this problem using the MDL principle: the best set of patterns is the set that compresses the database best. For this task we introduce the Krimp algorithm. Experimental evaluation shows that typically only hundreds of itemsets are returned; a dramatic reduction, up to seven orders of magnitude, in the number of frequent itemsets. These selections, called code tables, are of high quality. This is shown with compression ratios, swap-randomisation, and the accuracies of the code table-based Krimp classifier, all obtained on a wide range of datasets. Further, we extensively evaluate the heuristic choices made in the design of the algorithm. [slides] 
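The MDL idea in the abstract — "the best set of patterns is the set that compresses the database best" — can be sketched in a few lines. This is a simplified illustration, not Krimp's actual implementation: the function name, the greedy cover order, and the use of only the Shannon-optimal code lengths (ignoring the cost of encoding the code table itself) are assumptions made for brevity.

```python
import math
from collections import Counter

def encoded_size(database, code_table):
    """Total encoded size (in bits) of a transaction database under a
    code table: cover each transaction greedily with the code-table
    itemsets (in the given order), then charge each used itemset a
    Shannon-optimal code length derived from its usage count."""
    usage = Counter()
    for transaction in database:
        remaining = set(transaction)
        for itemset in code_table:          # greedy cover, table order
            if itemset <= remaining:
                usage[itemset] += 1
                remaining -= itemset
    total_usage = sum(usage.values())
    bits = 0.0
    for count in usage.values():
        bits += count * -math.log2(count / total_usage)
    return bits
```

On a toy database, a code table that adds the frequent itemset {a,b} ahead of the singletons compresses better than the singletons alone, which is exactly the signal Krimp uses to keep a candidate itemset.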
2019-03-15  Constraint Programming for Association Rules (Bachir Belaid) 
15:30 (E 323)  Discovering association rules among items in a dataset is one of the fundamental problems in data mining. It has recently been shown that constraint programming is a flexible way to tackle data mining tasks. In this paper we propose a declarative model based on constraint programming to capture association rules. Our model also allows us to specify any additional property and/or user's constraints on the kind of rules the user is looking for. To implement our model, we introduce a new global constraint, Confident, for ensuring the confidence of rules. We prove that completely propagating Confident is NP-hard. We thus provide a decomposition of Confident. In addition to user's constraints on the items composing the body and head of the rules, we show that we can capture the popular minimal non-redundant property of association rules. An experimental analysis shows the practical effectiveness of our approach compared to existing approaches. [slides] 
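For readers unfamiliar with the quantities the Confident constraint enforces, here is a minimal sketch of support and confidence for a rule body ⇒ head; the function names are illustrative, and this is plain checking code, not the propagation algorithm discussed in the talk.

```python
def support(itemset, transactions):
    """Number of transactions containing every item of `itemset`."""
    return sum(1 for t in transactions if itemset <= t)

def confidence(body, head, transactions):
    """Confidence of the rule body => head: among transactions that
    contain the body, the fraction that also contain the head."""
    cover = support(body, transactions)
    return support(body | head, transactions) / cover if cover else 0.0
```

A Confident-style constraint would prune candidate rules whose confidence cannot reach a user-given threshold.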
2019-02-01  Structural tractability of CSPs: an overview (Clément Carbonnel) 
15:30 (E 323)  The most natural way to define a subproblem of the constraint satisfaction problem (CSP) is by restricting either the constraint language (which constraints can be used) or the constraint structure (on which subsets of variables the constraints must be placed). Such restrictions may affect the complexity of the problem; for instance the CSP restricted to the language of binary Boolean clauses (2-SAT) is tractable, and so are tree-structured CSP instances. The ultimate research goal is to obtain a full complexity classification of all possible restrictions. The complexity of language-based restrictions is by now well understood (Bulatov FOCS'17, Zhuk FOCS'17) but there still exist structure-based restrictions of unknown complexity, and research on this topic has stalled for the last ten years. In this talk we will focus on the complexity of structure-based restrictions of CSP and review the existing tools, results and open problems. If time permits we will also discuss the case of (structurally-restricted) Max-CSP, for which very little is known.

2018-12-07  Autonomous Constraint Solvers (Hugues Wattez) 
15:30 (E 323)  Starting from constraint programming and CSP solvers, we seek to make them autonomous by means of reinforcement learning. Indeed, depending on the problem, CSP solvers require the intervention of an expert who carefully chooses the configuration to adopt (variable/value ordering heuristic, propagation function, etc.) so as to explore the search space as fast as possible. This is where the multi-armed bandit principle comes in, to discover the optimal configuration of the solver during its execution. We will see how to combine these two fields and the difficulties we may encounter. 
2018-11-23  Unveiling Source Code Latent Knowledge. Discovering Program Topoi (Carlo Ieva) 
9:30 (Séminaire)  During the development of long lifespan software systems, specification documents can become outdated or can even disappear due to the turnover of software developers. Implementing new software releases or checking whether some user requirements are still valid thus becomes challenging. The only reliable development artifact in this context is source code, but understanding the source code of large projects is a time- and effort-consuming activity. This challenging problem can be addressed by extracting high-level (observable) capabilities of software systems. By automatically mining the source code and the available source-level documentation, it becomes possible to provide significant help to the software developer in his/her program comprehension task.
This thesis proposes a new method and a tool, called FEAT (FEature As Topoi), to address this problem. Our approach automatically extracts program topoi from source code analysis by using a three-step process: First, FEAT creates a model of a software system capturing both structural and semantic elements of the source code, augmented with code-level comments; Second, it creates groups of closely related functions through hierarchical agglomerative clustering; Third, within the context of every cluster, functions are ranked and selected, according to some structural properties, in order to form program topoi.
The contribution of the thesis is threefold:
1) The notion of program topoi is introduced and discussed from a theoretical standpoint with respect to other notions used in program comprehension;
2) At the core of the clustering method used in FEAT, we propose a new hybrid distance combining both semantic and structural elements automatically extracted from source code and comments. This distance is parametrized and the impact of the parameter is thoroughly assessed through a deep experimental evaluation;
3) Our tool FEAT has been assessed in collaboration with Software Heritage (SH), a large-scale 
2018-06-25  Unifying Reserve Design Strategies with Graph Theory and Constraint Programming (Dimitri Justeau) 
17:30 (E 323)  The delineation of areas of high ecological or biodiversity value is a priority of any conservation program. However, the selection of optimal areas to be preserved necessarily results from a compromise between the complexity of ecological processes and managers' constraints. Current reserve design models usually focus on few criteria, which often leads to an oversimplification of the underlying conservation issues. This paper shows that Constraint Programming (CP) can be the basis of a more unified, flexible and extensible framework. First, the reserve design problem is formalized. Secondly, the problem is modeled from two different angles by using two graph-based models. Then CP is used to aggregate those models through a unique Constraint Satisfaction Problem. Our model is finally evaluated on a real use case addressing the problem of rainforest fragmentation in New Caledonia, a biodiversity hotspot. Results are promising and highlight challenging perspectives to address in future work. 
2018-04-27  Matching under preference (Mohamed Siala) 
15:30 (E 323)  I present in this seminar some recent work on matching under preferences. In two-sided stable matching problems, the aim is to find an assignment between two disjoint sets of agents that respects an optimality criterion called stability. In this context, each agent has an ordinal preference ranking over agents of the other set.
In the first part of the seminar, I will first talk about a novel notion of robustness that we called (a,b)-supermatches. An (a,b)-supermatch is a stable matching in which if 'a' pairs break up it is possible to find another stable matching by changing the partners of those 'a' pairs and at most 'b' other pairs.
In the second part, I will present a few constraint programming reformulations of two-sided stable matching problems. I show in particular that a novel reformulation based on the notion of 'rotation' outperforms the other approaches in balanced and sex-equal stable matching.
[slides] 
2018-03-23  On tractable classes for constraint satisfaction problems (Achref El Mouelhi) 
15:30 (E 323)  The constraint satisfaction problem (CSP) is known to be NP-complete, even in the binary case. However, by imposing certain restrictions, on domain sizes or on the constraints, solving can be accomplished in polynomial time. In that case we speak of tractable properties. The set of instances satisfying a tractable property constitutes a tractable (or polynomial) class. For example, it is known that the sets of instances satisfying the broken-triangle property (BTP) or the Zero/One/All property (ZOA) define tractable classes of binary CSPs. In practice, tractable classes are rarely encountered. Moreover, they generally require ad-hoc algorithms for recognition and for solving. Consequently, state-of-the-art algorithms such as FC (Forward Checking), RFL (Real Full Lookahead) or MAC (Maintaining Arc-Consistency) cannot exploit them.
Recently, considerable effort has been devoted to identifying tractable classes that are easily exploited by the state-of-the-art algorithms mentioned above. For example, it has been proved that the set of CSP instances having a polynomial number of maximal cliques in their microstructure graph can be solved in polynomial time by classical solving algorithms. Subsequently, a new framework aiming to extend the scope of tractable classes was proposed. It is based on the notion of CSP transformation and introduces the notion of hidden tractable class. It starts from the observation that real-world problems do not in general belong to tractable classes but can sometimes, after simplification (for instance by applying certain local-consistency filtering algorithms), become members of one. [slides] 
2018-02-16  Multiagent epistemic planning (Andreas Herzig) 
15:30 (E 323)  The generation of plans for multiple agents goes beyond classical planning in that it typically involves reasoning about the agents' knowledge, including higher-order knowledge (such as knowledge about other agents' knowledge). This requires a re-examination of the relevant concepts and models. While the concepts of initial situation and goal can be adequately described in epistemic logic, the concept of an action law deserves a closer look. I point out some conceptual and computational difficulties with the dynamic epistemic logic account in terms of event models. I finally argue for a simple epistemic logic that is grounded in what the agents observe. 
2017-12-22  A Reactive Strategy for Enforcing Higher-Level Consistency During Search (Berthe Choueiry) 
15:30 (E 323)  The stronger the consistency property enforced as lookahead during search for solving a Constraint Satisfaction Problem, the smaller the tree explored by backtrack search but, unfortunately, the bigger the overhead on the cost of search. I will present a new reactive strategy for automatically increasing the level of consistency only where it is likely to be most effective, thus best benefiting the search process.
Joint work with Robert J. Woodward and Christian Bessiere 
2017-12-15  Deep Learning demystified (Valentin Leveau & Titouan Lorieul) 
15:30 (bat51.124)  We have long dreamed of building so-called "intelligent" systems able to learn to perform certain tasks when shown, through a limited number of examples, how to do them. These learning mechanisms are the subject of study of machine learning, of which deep learning is a subfield. Deep approaches have enabled considerable progress on certain problems and have become, in recent years, the state of the art for many image recognition tasks and in many other domains. Taking computer vision as an example, we will see how these methods work and, in particular, the specifics of their structure, their architecture and their learning algorithm, which is based on an optimization problem. We will also explain why they only prevailed recently even though their origins are older. Finally, we will discuss their current limitations, which can make their use problematic in some cases, and see how these limitations constitute active research directions. [slides] 
2017-12-01  The complexity of general-valued CSPs seen from the other side (Clément Carbonnel) 
15:30 (E 323)  The general-valued CSP (VCSP) is a generalisation of CSP in which soft constraints are permitted and the goal is to find the minimum possible cost over all assignments. VCSP has an equivalent formulation in terms of homomorphisms between two valued structures: one that captures the structure of the constraint network (the "left-hand side valued structure") and one that captures the cost functions associated with each constraint (the "right-hand side valued structure"). In this talk we present a precise characterisation of all classes of bounded-arity left-hand side structures that give rise to a tractable restriction of VCSP, and we outline a proof based on a novel equivalence notion between valued structures. We also present a necessary and sufficient condition for a left-hand side valued structure to be solvable by the k-th level of the Sherali-Adams LP hierarchy and discuss search VCSPs, in which one is additionally required to find an assignment of minimum cost. 
2017-09-29  Discovering Program Topoi through Clustering (Carlo Ieva) 
15:30 (E 323)  Understanding source code of large open-source software projects is very challenging when there is only little documentation. New developers face the task of classifying a huge number of files and functions without any help. This paper documents a novel approach to this problem, called FEAT, which automatically extracts topoi from source code by using hierarchical agglomerative clustering. Program topoi summarize the main capabilities of a software system by presenting to developers clustered lists of functions together with an index of their relevant words. The clustering method used in FEAT exploits a new hybrid distance which combines both textual and structural elements automatically extracted from source code and comments. The experimental evaluation of FEAT shows that this approach is suitable to understand open-source software projects of size approaching 2,000 functions and 150 files, which opens the door for its deployment in the open-source community.

2017-09-22  On Neighborhood Singleton Consistencies (Anastasia Paparrizou) 
15:30 (E 323)  CP solvers predominantly use arc consistency (AC) as the default propagation method. Many stronger consistencies, such as triangle consistencies (e.g. RPC and maxRPC) exist, but their use is limited despite results showing that they outperform AC on many problems. This is due to the intricacies involved in incorporating them into solvers. On the other hand, singleton consistencies such as SAC can be easily crafted into solvers but they are too expensive. We seek a balance between the efficiency of triangle consistencies and the ease of implementation of singleton ones. Using the recently proposed variant of SAC called Neighborhood SAC as basis, we propose a family of weaker singleton consistencies. We study them theoretically, comparing their pruning power to existing consistencies. We make a detailed experimental study using a very simple algorithm for their implementation. Results demonstrate that they outperform the existing propagation techniques, often by orders of magnitude, on a wide range of problems.

2017-09-15  Mining Optimized Association Rules for Numeric Attributes [Fukuda et al.] (Bachir Belaid) 
15:30 (E 323)  Given a huge database, we address the problem of finding association rules for numeric attributes, such as (Balance ∈ I) ⇒ (CardLoan = yes), which implies that bank customers whose balances fall in a range I are likely to use card loan with a probability greater than p. The above rule is interesting only if the range I has some special feature with respect to the interrelation between Balance and CardLoan. It is required that the number of customers whose balances are contained in I (called the support of I) is sufficient and also that the probability p of the condition CardLoan = yes being met (called the confidence ratio) be much higher than the average probability of the condition over all the data. Our goal is to realize a system that finds such appropriate ranges automatically. We mainly focus on computing two optimized ranges: one that maximizes the support on the condition that the confidence ratio is at least a given threshold value, and another that maximizes the confidence ratio on the condition that the support is at least a given threshold number. Using techniques from computational geometry, we present novel algorithms that compute the optimized ranges in linear time if the data are sorted. Since sorting data with respect to each numeric attribute is expensive in the case of huge databases that occupy much more space than the main memory, we instead apply randomized bucketing as the preprocessing method and thus obtain an efficient rule-finding system. Tests show that our implementation is fast not only in theory but also in practice. The efficiency of our algorithm enables us to compute optimized rules for all combinations of hundreds of numeric and Boolean attributes in a reasonable time. 
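The "optimized confidence range" problem above can be stated very concretely. The sketch below is a brute-force O(n²) illustration of the objective, not the paper's linear-time geometric algorithm; the function name and the (value, condition) input format are assumptions.

```python
def optimized_confidence_range(points, min_support):
    """Among all ranges [lo, hi] over a numeric attribute, return the
    one maximizing confidence (fraction of covered points whose
    condition holds), subject to covering at least `min_support`
    points.  `points` is a list of (value, condition) pairs."""
    pts = sorted(points)
    best_range, best_conf = None, -1.0
    n = len(pts)
    for i in range(n):                      # range starts at pts[i]
        hits = 0
        for j in range(i, n):               # range ends at pts[j]
            hits += pts[j][1]
            cover = j - i + 1
            if cover >= min_support and hits / cover > best_conf:
                best_range, best_conf = (pts[i][0], pts[j][0]), hits / cover
    return best_range, best_conf
```

On a toy "balance vs. card loan" dataset, the middle range is found: the three customers with balances 200-400 all satisfy the condition, so that range has confidence 1 with support 3.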
2017-06-23  An Efficient Algorithm for Mining Frequent Sequence with Constraint Programming [Aoga et al.] (Mohamed-Bachir Belaid) 
15:30 (E 323)  The main advantage of Constraint Programming (CP) approaches for sequential pattern mining (SPM) is their modularity, which includes the ability to add new constraints (regular expressions, length restrictions, etc.). The current best CP approach for SPM uses a global constraint (module) that computes the projected database and enforces the minimum frequency; it does this with a filtering algorithm similar to the PrefixSpan method. However, the resulting system is not as scalable as some of the most advanced mining systems like Zaki's cSPADE. We show how, using techniques from both data mining and CP, one can use a generic constraint solver and yet outperform existing specialized systems. This is mainly due to two improvements in the module that computes the projected frequencies: first, computing the projected database can be sped up by precomputing the positions at which a symbol can become unsupported by a sequence, thereby avoiding to scan the full sequence each time; and second by taking inspiration from the trailing used in CP solvers to devise a backtracking-aware data structure that allows fast incremental storing and restoring of the projected database. Detailed experiments show how this approach outperforms existing CP as well as specialized systems for SPM, and that the gain in efficiency translates directly into increased efficiency for other settings such as mining with regular expressions. 
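The "projected database" at the heart of the PrefixSpan-style filtering mentioned above is a simple operation; here is a naive sketch (no precomputed positions, no trailing) just to fix the idea. Function names are illustrative.

```python
def project(sequences, symbol):
    """Project a sequence database on `symbol`: for each sequence
    containing it, keep the suffix after its first occurrence."""
    projected = []
    for seq in sequences:
        if symbol in seq:
            projected.append(seq[seq.index(symbol) + 1:])
    return projected

def projected_frequencies(sequences):
    """Support of each symbol in the (projected) database: the number
    of sequences in which the symbol occurs at least once."""
    freq = {}
    for seq in sequences:
        for s in set(seq):
            freq[s] = freq.get(s, 0) + 1
    return freq
```

The paper's contributions speed up exactly these two steps: computing the suffixes without rescanning, and restoring them cheaply on backtrack.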
2017-06-07  Constrained sampling in precision viticulture (Baptiste Oger) 
15:30 (E 323)  The development of precision agriculture requires collecting data by sampling in a more rational way. This sampling relies on statistical tools that do not always take operational constraints into account. The purpose of this study is to use constraint reasoning to optimize sampling within a vineyard plot. The goal is both to choose, within the plot, the points that satisfy the sampling constraints while optimizing the technician's route. It is therefore a routing problem, for which Constraint Programming has developed many filtering algorithms in recent years, notably the WeightedSubCircuit constraint [Briot et al. 2017], well suited to tours visiting only a subset of points. We propose here a model for a simplified version of the problem and the first results obtained with the Choco solver. 
2017-06-02  A circuit constraint for multiple tours (Nicolas Briot) 
15:30 (E 323)  Many real-world applications contain a vehicle routing problem. Constraint programming makes it possible to tackle these problems effectively. Circuit constraints have been defined to handle the travelling salesman problem (TSP) and vehicle routing problems (VRP). These constraints are based on finding a Hamiltonian circuit in a graph. In this paper, we consider the more general multiple-tour problem, in which we seek to cover part of the graph by a set of circuits of minimum cost. We propose a new global constraint based on finding disjoint elementary circuits in a graph. Unlike existing constraints, we do not look for a Hamiltonian circuit but for a set of Steiner circuits. After defining this constraint, we show that bound filtering is NP-hard. We therefore propose an incomplete filtering method based on computing a lower bound of a Steiner circuit. 
2017-05-22  Pattern mining based on constraint programming, applied to software validation (Mehdi Maamar) 
16:00 (E 323)  
2017-05-19  Optimization and applications in bioinformatics (Thomas Schiex) 
15:30 (E 323)  A tour of optimization over probabilistic graphical models, cost function networks, and applications in bioinformatics. 
2017-05-05  Applications of weighted Voronoi diagrams and randomization to variance-based k-clustering [Inaba..] (Joel Quinqueton) 
15:30 (E 323)  In this paper we consider the k-clustering problem for a set S of n points p_i = (x_i) in the d-dimensional space with variance-based errors as clustering criteria, motivated from the color quantization problem of computing a color lookup table for frame buffer display. As the inter-cluster criterion to minimize, the sum of intra-cluster errors over every cluster is used, and as the intra-cluster criterion of a cluster S_j, |S_j|^(a-1) Σ_{p_i ∈ S_j} ||x_i − x̄(S_j)||² is considered, where ||·|| is the L2 norm and x̄(S_j) is the centroid of points in S_j, i.e., (1/|S_j|) Σ_{p_i ∈ S_j} x_i. The cases a = 1, 2 correspond to the sum of squared errors and the all-pairs sum of squared errors, respectively. The k-clustering problems under the criteria with a = 1, 2 are treated in a unified manner by characterizing the optimum solution to the k-clustering problem by the ordinary Euclidean Voronoi diagram and the weighted Voronoi diagram with both multiplicative and additive weights. With this framework, the problem is related to the generalized primary shutter function for the Voronoi diagrams. The primary shutter function is shown to be O(n^(O(kd))), which implies that, for fixed k, this clustering problem can be solved in polynomial time. For the problem with the most typical intra-cluster criterion of the sum of squared errors, we also present an efficient randomized algorithm which, roughly speaking, finds an ε-approximate 2-clustering in O(n(1/ε)^d) time, which is quite practical and may be used for real large-scale problems such as the color quantization problem.
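The intra-cluster criterion |S_j|^(a-1) Σ ||x_i − x̄(S_j)||² can be checked directly on a small example. A minimal sketch (the function name is illustrative):

```python
def intra_cluster_error(cluster, a=1):
    """Variance-based intra-cluster error |S|^(a-1) * sum of squared
    distances to the centroid.  a=1 gives the sum of squared errors
    (SSE); a=2 gives |S| * SSE, proportional to the all-pairs sum of
    squared errors."""
    n = len(cluster)
    d = len(cluster[0])
    centroid = [sum(p[k] for p in cluster) / n for k in range(d)]
    sse = sum(sum((p[k] - centroid[k]) ** 2 for k in range(d))
              for p in cluster)
    return n ** (a - 1) * sse
```

For two points (0,0) and (2,0), the centroid is (1,0): the SSE is 2 (a=1), and the a=2 criterion is 4, matching the single pair's squared distance.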

2017-03-24  An Exploratory Knowledge Discovery Process based on Formal Concept Analysis (Amedeo Napoli) 
14:00 (Séminaire)  Knowledge discovery (KD) in complex datasets can be considered from several points of view, e.g. data, knowledge, and problem solving. KD is applied to datasets and has a direct impact on the design of knowledge bases and problem solving. Dually, principles supporting knowledge representation can be reused in knowledge discovery, as in declarative data mining, for making KD more exploratory, iterative and interactive. Accordingly, one of our main objectives in KD is to design an exploratory KD approach supporting interactions between data and domain knowledge, discovery and representation, and between the analyst and the KD process. In this presentation, we discuss the process of Exploratory Knowledge Discovery based on Formal Concept Analysis (FCA). FCA starts with a binary context and outputs a concept lattice, which can be visualized, navigated and interpreted by human agents, and, as well, which can be processed by software agents. FCA can be extended to pattern structures for dealing with complex data such as Linked Data (RDF). We will present a methodology based on FCA and pattern structures for mining definitions in RDF data. Such data are classified into a concept lattice allowing data exploration and the discovery of implications, used to automatically detect missing information and to complete RDF data. Contrasting the preceding data-directed KD process, we will present a goal-directed KD process, where the search for patterns of high interest is guided by the analyst's preferences and domain constraints. Finally, we will conclude by discussing the benefits of FCA and pattern structures for exploratory knowledge discovery. 
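The FCA machinery mentioned above rests on two derivation operators and their composition (the closure). A minimal sketch over a binary context given as a dict from objects to attribute sets; function names are illustrative, not from any FCA library.

```python
def derive_objects(attrs, context):
    """Objects possessing every attribute in `attrs` (the ' operator
    applied to an attribute set)."""
    return {g for g, a in context.items() if attrs <= a}

def derive_attrs(objs, context):
    """Attributes shared by every object in `objs`."""
    sets = [context[g] for g in objs]
    return set.intersection(*sets) if sets else set()

def closure(attrs, context):
    """Closure attrs'': the intent of the formal concept generated by
    `attrs`.  Closed attribute sets are exactly the intents that
    appear in the concept lattice."""
    return derive_attrs(derive_objects(attrs, context), context)
```

For instance, in a context where every object has attribute a, the closure of {b} pulls in a as well, which is the implication b → a that an exploratory KD process would surface.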
2017-03-17  Modeling the information flow required by the microbiological diagnostic loop (David Morquin) 
15:30 (E 323)  Modeling the information flow required by the microbiological diagnostic loop – appropriate antibiotic therapy for sepsis in the hospital setting and analysis of the system's inertia

2017-02-24  About Knowledge Representation and Computational Social Choice (Jérôme Lang) 
15:30 (Séminaire)  Social choice theory studies the aggregation of individual preferences towards a collective choice. Computational social choice emerged in the late 1980s, and mostly uses computational paradigms and techniques to provide a better analysis of social choice mechanisms (especially in the fields of voting and of fair division of resources), and to construct new ones. Among the subfields of artificial intelligence that are concerned by this interaction, knowledge representation plays an important role (other subfields being machine learning, reasoning with uncertainty, search, and constraint programming). The reasons for which KR plays an important role include: representing preferences and reasoning about them; computing collective decisions with incomplete knowledge of agents' preferences; the role of knowledge in strategic behavior; and using logic for automated theorem proving in social choice. The talk will give an overview of part of these topics, and will try to identify challenges for KR researchers interested in computational social choice. 
2017-02-10  Reducing electricity consumption peaks and the induced optimization problems for consumers (Chloé Desdouits (PhD thesis)) 
15:30 (E 323)  Reducing electricity consumption peaks and the induced optimization problems for consumers.
As concerns about global warming become ever more serious, one of its primary causes, electricity consumption, keeps growing. One way to curb the phenomenon could be to better balance consumption and production, in order to switch on fewer large generation units and to allow the integration of more renewable sources. The new electricity market paradigm encourages consumers to reduce their consumption peaks and to shift their consumption to times when demand is lower, by means of price incentives. New optimization algorithms and new methodologies are therefore needed to optimize the electrical power used by consumers. Through the European Arrowhead project, Schneider Electric proposes to study three application cases: an elevator with several energy sources, a manufacturing plant and a drinking-water network. For each of these three applications, a methodology for optimizing consumption peaks (sometimes using an electricity cost function) is given, together with optimization algorithms. In the case of the multi-source elevator, two coupled controllers are proposed: one at the strategic level solving a linear problem, and a rule-based one at the tactical level. In the case of the plant, the methodology we used to take measurements, build energy models, and finally optimize is explained. In addition, three linear formulations, a local search procedure and a naive constraint programming formulation are given to solve the scheduling problem. 
2017-01-20  The circuit constraint (Nicolas Briot) 
15:30 (E 323)  Many industrial problems can be reduced to routing problems. The most famous is the travelling salesman problem (TSP) or, more generally, the vehicle routing problem (VRP). These "pure" problems are widely studied in Operations Research. Unfortunately, adding constraints such as time windows, precedences, tour sizes, etc., can make modelling (and solving) even harder. Conversely, constraint programming manages to express some of these combinatorial structures.
Since 1994, with the CHIP solver, there has been a global circuit constraint whose satisfaction consists in finding a sequence of variables forming a Hamiltonian circuit in a given graph. In 22 years, only about ten papers have been published on this constraint: new filtering algorithms, new strategies and evolutions of the constraint.
I first propose to focus on the circuit constraint and its main filtering algorithm. I will then talk about the optimization version of the constraint and its various filtering algorithms, the most interesting of which is based on Held and Karp's 1-tree relaxation. 
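The Held-Karp 1-tree relaxation mentioned at the end gives a lower bound on any Hamiltonian circuit: a minimum spanning tree on all vertices but one, plus the two cheapest edges of the excluded vertex. Below is a minimal sketch (without the Lagrangian vertex penalties of the full Held-Karp method) for a complete graph with n ≥ 3 given as a symmetric distance matrix; the function name is illustrative.

```python
def one_tree_bound(dist):
    """Held-Karp 1-tree lower bound on the TSP tour length: a minimum
    spanning tree over vertices 1..n-1 (Prim's algorithm) plus the two
    cheapest edges incident to vertex 0."""
    n = len(dist)
    in_tree = {1}
    cost = 0.0
    best = {v: dist[1][v] for v in range(2, n)}  # cheapest link to tree
    while len(in_tree) < n - 1:
        v = min(best, key=best.get)              # closest outside vertex
        cost += best.pop(v)
        in_tree.add(v)
        for u in best:
            best[u] = min(best[u], dist[v][u])
    two_cheapest = sorted(dist[0][v] for v in range(1, n))[:2]
    return cost + sum(two_cheapest)
```

On a unit square (sides 1, diagonals 1.5 here) the bound equals the optimal tour length 4, showing why it is a useful filtering ingredient for the optimization version of the constraint.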
2016-12-06  "Almost-stable" matchings in the Hospitals / Residents problem with Couples. D.F. Manlove et al. (Gaelle Hisler) 
15:30 (E 323)  The Hospitals / Residents problem with Couples (hrc) models the allocation of intending junior doctors to hospitals where couples are allowed to submit joint preference lists over pairs of (typically geographically close) hospitals. It is known that a stable matching need not exist, so we consider min bp hrc, the problem of finding a matching that admits the minimum number of blocking pairs (i.e., is “as stable as possible”). We show that this problem is NPhard and difficult to approximate even in the highly restricted case that each couple finds only one hospital pair acceptable. However if we further assume that the preference list of each single resident and hospital is of length at most 2, we give a polynomialtime algorithm for this case. We then present the first Integer Programming (IP) and Constraint Programming (CP) models for min bp hrc. Finally, we discuss an empirical evaluation of these models applied to randomlygenerated instances of min bp hrc. We find that on average, the CP model is about 1.15 times faster than the IP model, and when presolving is applied to the CP model, it is on average 8.14 times faster. We further observe that the number of blocking pairs admitted by a solution is very small, i.e., usually at most 1, and never more than 2, for the (28,000) instances considered. 
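The objective minimised in min bp hrc is the number of blocking pairs; here is a minimal sketch of counting them, for the simplified one-to-one case without couples (hospitals of capacity one). Function names and the dict-based encoding are assumptions for illustration, not the paper's IP/CP models.

```python
def blocking_pairs(matching, pref_r, pref_h):
    """Blocking pairs of a resident->hospital matching (capacity one):
    (r, h) blocks if r strictly prefers h to r's current hospital AND
    h strictly prefers r to h's current resident; being unmatched
    ranks worst."""
    def rank(prefs, x):
        return prefs.index(x) if x in prefs else len(prefs)
    matched_h = {h: r for r, h in matching.items()}
    blocks = []
    for r, rp in pref_r.items():
        for h in rp:
            if rank(rp, h) < rank(rp, matching.get(r)) and \
               rank(pref_h[h], r) < rank(pref_h[h], matched_h.get(h)):
                blocks.append((r, h))
    return blocks
```

A matching is stable exactly when this list is empty; min bp hrc asks for a matching making it as short as possible.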
20161202  Guaranteed computation of attractive ellipsoids for dynamical systems (Gilles Chabert) 
11:00 (E 324)  This talk proposes a numerical (interval-based) approach to a control-theory problem, namely characterizing the basin of attraction of an asymptotically stable fixed point of a nonlinear dynamical system. More precisely, we propose to compute ellipsoids whose inclusion in the basin of attraction is guaranteed, the method being also robust to uncertainties on the system and on the fixed point. The method relies on an interval extension of the Lyapunov equation and on global optimization. Work in progress. 
20161118  Learning Parameters for the Sequence Constraint from Solutions [Picard-Cantin et al.] (Robin Arcangioli) 
15:30 (E 323)  This paper studies the problem of learning parameters for global constraints such as Sequence from a small set of positive examples. The proposed technique computes the probability of observing a given constraint in a random solution. This probability is used to select the most likely constraint in a list of candidates. The learning method can be applied to both soft and hard constraints. 
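The core idea, the probability that a random solution satisfies a candidate constraint, can be approximated by Monte-Carlo sampling. The sketch below is an illustrative simplification, not the paper's exact computation:

```python
import random

def constraint_probability(domains, constraint, samples=10000, seed=0):
    """Estimate the probability that `constraint` holds for an assignment
    drawn uniformly from the Cartesian product of the variable domains."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        assignment = [rng.choice(d) for d in domains]
        hits += constraint(assignment)   # True counts as 1
    return hits / samples

# Among-style candidate: "at most one variable equals 1" over three 0/1 variables.
# The exact probability is 4/8 = 0.5.
p = constraint_probability([[0, 1]] * 3, lambda a: a.count(1) <= 1)
```

A candidate constraint with a low probability of holding by chance, yet satisfied by all positive examples, is the more likely one to be part of the target model.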
20161102  Complexity Results in Optimistic/Pessimistic Preference Reasoning (Gaelle Hisler) 
15:00 (E 323)  Preference reasoning is a central problem in decision support. There exist various ways to interpret a set of qualitative preferences. Conditional preference logics make it possible to deal with semantics such as optimistic, pessimistic, and strong. In this paper, we study the complexity of the main problems in optimistic/pessimistic preference logic: undominated, consistency, and dominance. We show that they are all NP-hard in general, with some becoming polynomial under specific semantics. Our second contribution is to show that the dominance problem, which has an online component in its definition, is compilable to polynomial time. 
20161102  An Interval Filtering Operator for Upper and Lower Bounding in Constrained Global Optimization (Olivier Sans) 
14:00 (E 323)  This paper presents a new interval-based operator for continuous constrained global optimization. It is built upon a new filtering operator, named TEC, which constructs a bounded subtree using a Branch and Contract process and returns the parallel-to-axes hull of the leaf domains/boxes. Two extensions of TEC use the information contained in the leaf boxes of the TEC subtree to improve the two bounds of the objective function value: (i) for the lower bounding, a polyhedral hull of the (projected) leaf boxes replaces the parallel-to-axes hull, (ii) for the upper bounding, a good feasible point is searched for inside a leaf box of the TEC subtree, following a look-ahead principle. The algorithm proposed is an auto-adaptive version of TEC that plays with both extensions according to the outcomes of the upper and lower bounding phases. Experimental results show a significant speedup on several state-of-the-art instances. 
20161027  A ‘Reduce and Solve’ Algorithm for the Multidimensional Knapsack Problem (Mohamed Bachir Belaid) 
15:30 (E 323)  In this talk, we present a ‘reduce and solve’ heuristic for approximately solving the multidimensional knapsack problem (MKP). The proposed heuristic combines solution space reduction with linear programming techniques. The solution space reduction is achieved by fast and simple algorithms and then refined by solving some relaxed MKP subproblems, leading in the end to optimality or near-optimality. The performance of the proposed heuristic has been evaluated on several problem instances. Encouraging results have been obtained. 
20160930  Adaptive learning algorithm for constraint acquisition (Redouane Ezzahir) 
15:30 (E 323)  Constraint acquisition assists a non-expert user in modeling her problem as a constraint network. Several learning algorithms have been developed for acquiring a constraint network. All these algorithms attempt to reduce the dialogue length between the system and the user, and they are only evaluated using the number and size of queries. However, for hard problems, the time to generate a query can be significantly long and can thus break the interaction with the user. To address this problem, we propose ALCA, a new adaptive learning algorithm for constraint acquisition based on the QUACQ system. It uses a parameter that is automatically adjusted to perform an adaptive local constraint acquisition. It also includes two techniques: example re-examination and premature convergence. Experiments on some benchmarks illustrate the practical effectiveness of our approach and its ability to cope with both the dialogue length and the waiting time of a query.

20160916  IJCAI16 and CP16 overview (Robin Arcangioli) 
15:30 (E 323)  Discussion around a selection of what was presented at IJCAI and CP this year (2016). 
20160902  A Global Constraint for Closed Frequent Pattern Mining (Mehdi Maamar) 
15:30 (E 324)  Discovering the set of closed frequent patterns is one of the fundamental problems in Data Mining. Recent Constraint Programming (CP) approaches for declarative itemset mining have proven their usefulness and flexibility. But the wide use of reified constraints in current CP approaches leads to difficulties in coping with high-dimensional datasets. In this paper, we propose the ClosedPattern global constraint to capture the closed frequent pattern mining problem without requiring reified constraints or extra variables. We present an algorithm to enforce domain consistency on ClosedPattern in polynomial time. The computational properties of this algorithm are analyzed and its practical effectiveness is experimentally evaluated. 
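For intuition about the mined objects (not about the global constraint itself), a brute-force sketch of closed frequent itemset mining: an itemset is closed if no strict superset has the same support.

```python
from itertools import combinations

def closed_frequent_itemsets(transactions, min_support):
    """Enumerate closed frequent itemsets by brute force.
    transactions: list of frozensets of items."""
    items = sorted({i for t in transactions for i in t})

    def support(s):
        return sum(1 for t in transactions if s <= t)

    # collect all frequent itemsets with their supports
    frequent = {}
    for k in range(1, len(items) + 1):
        for combo in combinations(items, k):
            s = frozenset(combo)
            sup = support(s)
            if sup >= min_support:
                frequent[s] = sup

    # keep only the closed ones: no strict superset with equal support
    return {s: sup for s, sup in frequent.items()
            if not any(s < t and frequent[t] == sup for t in frequent)}
```

This exponential enumeration is exactly what dedicated miners and the ClosedPattern propagator avoid, but it pins down the definition.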
20160629  Tight upper bound on splitting by linear combinations for pigeonhole principle 
14:30 (E 324)  SAT is the well-known problem of deciding whether a given Boolean formula has a satisfying assignment. The main technique for solving SAT is the family of DPLL algorithms. A usual DPLL algorithm chooses a single variable and an order of values to substitute, and runs recursively on the simplified formulas, looking for a satisfying assignment. We consider an extension of the algorithm which chooses not a single variable but a linear form modulo 2. It is known that formulas encoding the pigeonhole principle on $n$ pigeons and $m$ holes ($PHP_n^m$) are still hard even for linear splitting and require $2^{\Omega(n)}$ time. The known bound for a usual DPLL algorithm without linear splitting is $2^{\Theta(n \log n)}$. In this talk we show that $PHP_n^m$ can be solved in $2^{O(n)}$ time with linear splitting and, as a corollary, that formulas encoding the perfect matching principle on graphs with $n$ vertices can also be solved in $2^{O(n)}$ time. 
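For reference, a plain DPLL procedure (splitting on single variables, without the linear-form extension studied in the talk) can be sketched as:

```python
def dpll(clauses, assignment=None):
    """Minimal DPLL. clauses: lists of int literals (negative = negated).
    Returns a satisfying assignment (dict var -> bool) or None."""
    if assignment is None:
        assignment = {}
    # simplify clauses under the current partial assignment
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue                       # clause already satisfied
        rest = [l for l in clause if abs(l) not in assignment]
        if not rest:
            return None                    # empty clause: conflict
        simplified.append(rest)
    if not simplified:
        return assignment                  # all clauses satisfied
    var = abs(simplified[0][0])            # branch on a variable of the first clause
    for value in (True, False):
        result = dpll(simplified, {**assignment, var: value})
        if result is not None:
            return result
    return None
```

The extension in the talk replaces the branch `var = True / False` by a split on a linear form $\ell(x) \equiv 0 / 1 \pmod 2$, which is what makes $PHP_n^m$ solvable in $2^{O(n)}$.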
20160623  Multiple Constraint Acquisition (Robin Arcangioli) 
15:30 (E 323)  QUACQ is a constraint acquisition system that assists a non-expert user to model her problem as a constraint network by classifying (partial) examples as positive or negative. For each negative example, QUACQ focuses on a constraint of the target network. The drawback is that the user may need to answer a great number of such examples to learn all the constraints. In this paper, we provide a new approach that is able to learn a maximum number of constraints violated by a given negative example. Finally, we give an experimental evaluation that shows that our approach improves on QUACQ. 
20160623  Constraint Acquisition with Recommendation Queries (Abderrazak Daoudi) 
16:00 (E 323)  Constraint acquisition systems assist the non-expert user in modeling her problem as a constraint network. Most existing constraint acquisition systems interact with the user by asking her to classify an example as positive or negative. Such queries do not use the structure of the problem and can thus lead the user to answer a large number of queries. In this paper, we propose PREDICT&ASK, an algorithm based on the prediction of missing constraints in the partial network learned so far. Such missing constraints are directly asked to the user through recommendation queries, a new, more informative kind of query. PREDICT&ASK can be plugged into any constraint acquisition system. We experimentally compare the QUACQ system to an extended version boosted by the use of our recommendation queries. The results show that the extended version improves on the basic QUACQ. 
20160610  Argumentation-based defeasible reasoning (Leila Amgoud (DR CNRS at IRIT, Toulouse)) 
15:30 (bat5124/3)  Argumentation is an alternative approach for defeasible reasoning. It is based on the idea of justifying plausible conclusions by “strong” arguments. Starting from a knowledge base encoded in a logical language, an argumentation system defines arguments and attacks between them using the consequence operator associated with the language. Finally, it uses a semantics for evaluating the arguments. The plausible conclusions to be drawn from the knowledge base are those supported by “good” arguments.
In this talk, we present two families of such systems: the family using extension semantics and the one using ranking semantics. We discuss the outcomes of both families and compare them. We then compare the argumentation approach with other well-known approaches for defeasible reasoning, namely default logic. 
20160610  Argumentation based constraint acquisition (Robin Arcangioli) 
15:30 (E 324)  Efficient acquisition of constraint networks is a key factor for the applicability of constraint problem solving methods. Current techniques learn constraint networks from sets of training examples, where each example is classified as either a solution or a non-solution of a target network. However, in addition to this classification, an expert can usually provide arguments as to why examples should be rejected or accepted. Generally speaking, domain specialists have partial knowledge about the theory to be acquired, which can be exploited for knowledge acquisition. Based on this observation, we discuss the various types of arguments an expert can formulate and develop a knowledge acquisition algorithm for processing these types of arguments, which gives the expert the possibility to input arguments in addition to the learning examples. The result of this approach is a significant reduction in the number of examples which must be provided to the learner in order to learn the target constraint network. 
20160603  LCM (Linear-time Closed pattern Miner) vs ClosedPattern Global Constraint (Mehdi Maamar) 
15:30 (E 324)  Discovering the set of closed frequent patterns is one of the fundamental problems in Data Mining. Recent Constraint Programming (CP) approaches for declarative itemset mining have proven their usefulness and flexibility. But the wide use of reified constraints in current CP approaches makes it difficult to cope with high-dimensional datasets. This talk compares LCM, the well-known specialized algorithm, with the ClosedPattern global constraint. 
20160520  An Interval Branch and Bound Algorithm for Parameter Estimation (Gilles Trombettoni) 
15:30 (E 324)  The parameter estimation problem is a widespread and challenging problem in engineering sciences consisting in computing the parameters of a parametric model that fit observed data. The computer vision community has proposed the RANSAC algorithm to deal with outliers in the observed data. This randomized algorithm is efficient but nondeterministic and therefore incomplete. Jaulin et al. propose a branch-and-contract algorithm that returns all the model instances fitting at least $q$ observations. Assuming that at least $q$ observed data are inliers, this algorithm achieves on the observations a relaxed intersection operator called $q$-intersection.
First, this paper presents several improvements to Jaulin et al.'s algorithm.
Second, an interval branch and bound algorithm is designed to produce a model that can explain the maximum number of observations within a given tolerance.
Experiments are carried out on computer vision and image processing problems. They highlight a significant speedup w.r.t. Jaulin et al.'s interval method in 2D and 3D shape recognition problems. We have also investigated how the approach scales up in dimensions up to 7 for stereovision (estimation of essential and fundamental matrices).
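For intuition, the $q$-intersection operator can be illustrated in one dimension: the hull of the points covered by at least $q$ of the given intervals. This is a toy sketch, not Jaulin et al.'s algorithm or the paper's branch and bound:

```python
def q_intersection(intervals, q):
    """Hull of the set of points covered by at least q of the closed
    1-D intervals [(lo, hi), ...]; None if no point is covered q times."""
    events = []
    for a, b in intervals:
        events.append((a, 1))    # interval opens
        events.append((b, -1))   # interval closes
    events.sort(key=lambda e: (e[0], -e[1]))  # at ties, opens before closes
    depth, lo, hi = 0, None, None
    for x, delta in events:
        if delta == 1:
            depth += 1
            if depth >= q:
                lo = x if lo is None else min(lo, x)
                hi = x if hi is None else max(hi, x)
        else:
            if depth >= q:       # x itself is still covered by >= q intervals
                hi = x if hi is None else max(hi, x)
            depth -= 1
    return None if lo is None else (lo, hi)
```

With at most $q - 1$ outliers among the observations, the true parameter value is guaranteed to lie in this hull, which is what makes the relaxed intersection robust.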

20160513  Simple CSP models for clustering (Joël Quinqueton) 
15:30 (E 324)  We are interested in clustering problems under the minimum-diameter and maximum-separation criteria. We present some approximate and exact algorithms that have been proposed for these problems. We present a reification-free CSP model of these problems. We experiment with these models on the Minion and Gecode platforms. 
20160311  What is a Decision Problem? (Alexis Tsoukias) 
10:00 (sem)  The talk introduces a general framework through which any type of decision problem should be characterisable. The aim of the framework is to show that only a finite number of primitive characteristics describe a decision problem, thus reducing the number of archetypes to a finite number of combinations. In the talk we present some of these characteristics in some detail. We also provide a number of simple examples in order to show how this framework helps in categorising current decision problems, in reformulating decision problems, and in constructing formal arguments for the decision aiding process. 
20160219  Just-In-Time Compilation of Knowledge Bases [Audemard et al., IJCAI'13] (Gaelle Hisler) 
15:30 (E 324)  Since the first principles of Knowledge Compilation (KC), most of the work has been focused on finding a good compilation target language in terms of compromises between compactness and expressiveness. The central idea has remained unchanged over the last fifteen years: an offline, very hard stage makes it possible to “compile” the initial theory in order to guarantee (theoretically) an efficient online stage on a set of predefined queries and operations. We propose a new “Just-in-Time” approach for KC. Here, any Knowledge Base (KB) will be immediately available for queries, and the effort spent on past queries will be partly amortized for future ones. To guarantee efficient answers, we rely on the tremendous progress made in the practical solving of SAT and incremental SAT applicative problems. Even if each query may be theoretically hard, we show that our approach outperforms previous KC approaches on the set of classical problems used in the field, and makes it possible to handle problems that are out of the scope of current approaches. 
20160212  Constraint acquisition (Robin Arcangioli) 
15:30 (E 324)  
20160129  Probingbased variable selection heuristics for NCSPs [Reyes V, Araya I, ICTAI14] (Olivier Sans) 
16:30 (E 323)  Interval branch & bound solvers are commonly used for solving numerical constraint satisfaction problems. They alternate filtering/contraction and branching steps in order to find small boxes containing all the solutions of the problem. The branching basically consists in generating two subproblems by dividing the domain of one variable into two. The selection of this variable is the topic of this work. Several heuristics have been proposed so far, most of them using local information from the current node (e.g., domain sizes, partial derivative images over the current box, etc). We propose instead an approach based on past information. This information is provided by a preprocessing phase of the algorithm (probing) and is used during the search. In simple words, our algorithm attempts to identify the most important variables in a series of cheap test runs. As a result of probing, the variables are weighted. These weights are then considered by the selection heuristic during the search. Experiments stress the interest of using techniques based on past information in interval branch & bound solvers. 
20160122  Extrapolating from Limited Uncertain Information to Obtain Robust Solutions for Large-Scale Optimization (Nicolas Briot) 
15:30 (E 324)  Data uncertainty in real-life problems is a current challenge in many areas, including Operations Research (OR) and Constraint Programming (CP). This is especially true given the continual and accelerating increase in the amount of data associated with real-life problems, to which Large Scale Combinatorial Optimization (LSCO) techniques may be applied. Although data uncertainty has been studied extensively in the literature, many approaches do not take into account the partial or complete lack of information about uncertainty in real-life settings. To meet this challenge, in this paper we present a strategy for extrapolating data from limited uncertain information to ensure a certain level of robustness in the solutions obtained. Our approach is motivated by real-world applications of supply of timber from forests to sawmills. 
20160108  Stochastic Constraint Programming (Nicolas Briot) 
15:30 (E 324)  
20151218  Data analytics and optimisation for assessing a ride sharing system (Vincent Armant) 
15:30 (E 324)  Ridesharing schemes attempt to reduce road traffic by matching prospective passengers to drivers with spare seats in their cars.
To be successful, such schemes require a critical mass of drivers and passengers.
In this talk we will see how deployed ridesharing applications can be improved by combining data analytics and constraint programming.
We use data analytics to model flexible user behaviours, and infer feasible matches between users.
We then use the discovered feasible match graph and a constraint programming model to assess the potential of different ridesharing schemes.
We demonstrate that flexible schemes that persuade existing drivers to switch to being passengers have the potential to reduce the number of unmatched participants by up to 80%.
Finally, we abstract the ridesharing problem to a more general problem, the problem of multi-resource allocation with time dependencies, and discuss the solutions and the new challenges it raises. 
20151211  Incremental weight elicitation for multicriteria or collective decision making (Patrice Perny) 
14:00 (3/124 (B5))  In this talk, we present new methods for incremental preference elicitation in multicriteria or collective decision making. More precisely, the talk will focus on active learning of weights in aggregation functions. We are interested in elicitation protocols that aim, at each step of the decision process, to identify the most informative possible question so as to reduce the uncertainty on the value of the potential solutions (regret minimization) and thus speed up the determination of a necessarily optimal or near-optimal solution.
After recalling the basics of incremental weight elicitation in the simple case of a linear aggregator and a set of alternatives given in extension, the talk will focus on two independent difficulties that can be considered separately and then in combination: eliciting the weights of a Choquet integral (problem 1), and incremental weight elicitation over a set of alternatives defined implicitly (problem 2). For problem 1, we will show how to work around the difficulty caused by the exponential number of parameters of the Choquet integral so as to allow efficient active learning of the capacity. For problem 2, we will propose a new approach interleaving preference elicitation and implicit enumeration of solutions for the fast determination of a robust solution with a quality guarantee. This approach, which allows incremental weight elicitation during search, can be instantiated in various solution algorithms such as dynamic programming, greedy algorithms, and branch and bound. As an illustration, we present an interactive, incremental refinement of MOA* for multiobjective optimization in state spaces. 
20151204  Detecting Types of Variables for Generalization in Constraint Acquisition (Abderrazak Daoudi) 
15:30 (E 324)  During the last decade, several constraint acquisition systems have been proposed for assisting non-expert users in building constraint programming models. GENACQ is an algorithm based on generalization queries that can be plugged into many constraint acquisition systems. However, generalization queries require the aggregation of variables into types, which is not always a simple task for non-expert users. In this paper, we propose a new algorithm that is able to learn types during the constraint acquisition process. The idea is to infer potential types by analyzing the structure of the current constraint network and to use the extracted types to ask generalization queries. Our approach gives good results although no knowledge of the types is provided. 
20151127  A General Framework for Reordering Agents Asynchronously in Distributed CSP (Christian Bessiere) 
15:30 (E 324)  Reordering agents during search is an essential component of the efficiency of solving a distributed constraint satisfaction problem. Termination values have recently been proposed as a way to simulate the min-domain dynamic variable ordering heuristic. The use of termination values allows the greatest flexibility in reordering agents dynamically while keeping polynomial space. In this paper, we propose a general framework based on termination values for reordering agents asynchronously. The termination values are generalized to represent various heuristics other than min-domain. Our general framework is sound, complete, terminates, and has a polynomial space complexity. We implemented several variable ordering heuristics that are well known in centralized CSPs but could not until now be applied to the distributed setting. Our empirical study shows the significance of our framework compared to state-of-the-art asynchronous dynamic ordering algorithms for solving distributed CSPs. 
20151120  Large-scale machine learning: application to the case of churn prediction at (Remi Coletta) 
15:30 (E 324)  In this talk, I will present, through a use case serving as a guiding thread, the various scientific building blocks (data integration, Machine Learning and Sequence Mining, ...) and technical ones (in-memory Spark architecture) implemented and/or developed at TellMePlus, which explain why the TellMePlus solution often outperforms the industrial state of the art (including several large software vendors) on industrial use cases.
Since some of these building blocks are very technology-oriented, or rather come from databases, I will conclude by zooming in on a few research perspectives closer to machine learning.

20151113  ModelSeeker: Constraint Acquisition System (Robin Arcangioli) 
15:30 (E 323)  Model Seeker is a system that generates, from positive examples, a constraint model of a highly structured problem. The system is based on the global constraint catalogue, from which it tries to extract the model. It uses a tool called Constraint Seeker which, for a set of (positive or negative) examples, provides a list of candidate constraints ranked from most to least relevant. We will analyze the method used by Model Seeker, in particular how it captures regular structures to which a constraint potentially applies and how it generates a list of relevant candidate constraints. 
20151106  Towards Declarative Scripting Combining CP and Analytics (Hassan AïtKaci) 
14:30 (Bat53/124)  We take a rapid look at the current trends of evolution of application software tools for combining the formal capacities of (mostly statistical) Analytics along with Constraint Programming (CP). We note that application orchestration is what makes this task difficult. As a result, a new area of software development is rapidly becoming of great importance in relation with CP and Analytics: viz., scripting application orchestration. This has become a clear need in particular for Data Analytics, which are typically multiplexed applications interacting between data and specific processing tools acting thereon, such as statistical analysis and simulation, optimization, graphing, tabling and DB interface, HTML/LaTeX/Doc report generation, etc. We overview some systems used for scripting application orchestration involving CP and Analytics, looking at pros and cons. Then, we discuss the following point: since scripting is crucial both to CP and Analytics, shouldn't we leverage CP to make Analytics programming as declarative as possible? If so, what are the challenges?
This was an invited presentation as part of the CP2015 Panel Workshop on CP and Analytics, Aug. 31, 2015, Cork, Ireland. 
20151023  Fault localization using itemset mining under constraints (Mehdi Maamar) 
15:30 (E 324)  We introduce in this paper an itemset mining approach to tackle the fault localization problem, which is one of the most difficult processes in software debugging. We formalize the problem of fault localization as finding the k best patterns satisfying a set of constraints modelling the most suspicious statements. We use a Constraint Programming (CP) approach to model and solve our itemset-based fault localization problem. Our approach consists of two steps: i) mining the top-k suspicious suites of statements; ii) fault localization by processing the top-k patterns. Experiments performed on standard benchmark programs show that our approach makes it possible to propose a more precise localization than a standard approach. 
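For comparison, the standard baseline in spectrum-based fault localization ranks statements by a suspiciousness score such as Ochiai. The sketch below illustrates that baseline, not the paper's itemset-mining approach:

```python
import math

def ochiai_scores(coverage, outcomes):
    """coverage: list of sets of statements executed by each test run;
    outcomes: parallel list, True if the run failed.
    Returns statement -> Ochiai suspiciousness in [0, 1]."""
    total_fail = sum(outcomes)
    scores = {}
    for s in set().union(*coverage):
        # ef / ep: failing / passing runs that executed statement s
        ef = sum(1 for cov, fail in zip(coverage, outcomes) if fail and s in cov)
        ep = sum(1 for cov, fail in zip(coverage, outcomes) if not fail and s in cov)
        denom = math.sqrt(total_fail * (ef + ep))
        scores[s] = ef / denom if denom else 0.0
    return scores
```

A statement executed by all failing runs and no passing run scores 1.0; the mining approach of the paper instead looks for sets of statements (patterns) that co-occur in failing executions.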
20151016  Discussion on AI (Christian Bessiere) 
14:30 (bat5/3124)  A meeting of the AI group in the form of an informal discussion about the stakes and risks of AI for society. 
20151009  The TEC operator (Tree for Enforcing Consistency) on continuous domains (Olivier Sans) 
15:30 (E 324)  
20151002  PREFIX-PROJECTION Global Constraint for Sequential Pattern Mining (CP15) (Yahia Lebbah) 
14:30 (bat51/124)  Many efficient ad hoc methods have been developed for mining sequential patterns, but they all suffer from a lack of genericity. Recent works have investigated Constraint Programming (CP) methods, but they are still not effective because of their encoding. In this talk we present a global constraint based on the projected-databases principle which remedies this drawback. Experiments showed that our approach clearly outperforms CP approaches and competes well with ad hoc methods on large datasets. 
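The projected-databases principle underlying the constraint can be sketched as follows (an illustrative simplification of PrefixSpan-style projection, not the global constraint's propagator):

```python
def project(database, prefix_item):
    """Prefix-project a sequence database: for each sequence containing
    prefix_item, keep the suffix strictly after its first occurrence."""
    projected = []
    for seq in database:
        if prefix_item in seq:
            projected.append(seq[seq.index(prefix_item) + 1:])
    return projected

def frequent_items(database, min_support):
    """Items appearing in at least min_support sequences of the database."""
    counts = {}
    for seq in database:
        for item in set(seq):
            counts[item] = counts.get(item, 0) + 1
    return {i for i, c in counts.items() if c >= min_support}
```

Frequent sequential patterns are grown one item at a time: each frequent item of the projected database extends the current prefix, and the projection shrinks as the prefix grows, which is what makes the principle efficient.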
20150925  Preference representations (Gaelle Hisler) 
15:30 (E 324)  
20150918  Best Application paper at CP15 (Nicolas Briot) 
15:30 (E 324)  Title: Long-Haul Fleet Mix and Routing Optimisation with Constraint Programming and Large Neighbourhood Search, by Philip Kilby and Tommaso Urli.
Abstract. We present an original approach to compute efficient mid-term fleet configurations, at the request of a Queensland-based long-haul trucking carrier. Our approach considers one year's worth of demand data, and employs a constraint programming (CP) model and an adaptive large neighbourhood search (LNS) scheme to solve the underlying multi-day multi-commodity split delivery capacitated vehicle routing problem. A Pareto-based preprocessing step makes it possible to tackle the large amount of data by extracting a set of representative days from the full horizon. Our solver is able to provide the decision maker with a set of Pareto-equivalent fleet setups trading off fleet efficiency against the likelihood of requiring on-hire vehicles and drivers. Moreover, the same solver can be used to solve the daily loading and routing problem. We carry out an extensive experimental analysis based on real-world data, and we discuss the properties of the generated fleets. We show that our approach is a sound methodology to provide decision support for the mid- and short-term decisions of a long-haul carrier. 
20150911  Teaching CP (Christian Bessiere) 
15:30 (E 324)  Teaching constraint programming (CP) is important: it allows us to increase the uptake of the technology and to train the next generation of researchers and practitioners. We are holding a cocomeet on teaching constraint programming. 
20150702  Global Constraints in Software Testing Applications (Arnaud Gotlieb) 
16:00 (E 323)  Global constraints are considered as a major contribution of the Constraint Programming community to the combinatorial optimisation area. In this talk, I will present two examples of software testing applications where global constraints are the key enabling technology to solve large or difficult instances of the underlying combinatorial problems. These two applications have led to the development of CP models embedded into industrial products. 
20150619  The Balance Constraint Family (Christian Bessiere) 
15:30 (E 323)  The Spread, Balance, AllBalance, and Focus global constraints. 
20150616  The difficult art of evaluating strategies in a multi-agent context (Pole IA) (Philippe Mathieu) 
10:30 (bat52/124)  Simulation or problem solving in the context of complex systems requires, for each constituent element, the design of a behavioural strategy. A strategy is nothing more than an action-selection mechanism, generally based on the past, used to deduce the next action to perform. Proposing a new strategy requires being able to evaluate it by comparing it with its peers. Unfortunately, since its performance depends on the context, it is very difficult to set up objective comparisons establishing that one strategy is better than another. Through two (very) different contexts, that of iterated games and that of financial markets, we show the difficulties and pitfalls that arise, and propose some directions for improving these comparisons. 
20150612  Phase Transition (Christian Bessiere) 
15:30 (E 323)  A talk on the CP/SAT phase transition, exceptionally hard problems (EHPs), and the notion of restarts. 
20150612  Constraint Acquisition as a Reinforcement Learning task (Kostandina Veljanovska) 
16:00 (E 323)  
20150529  Constraint programming for selective grape harvesting (Nicolas Briot) 
15:30 (E 324)  In precision viticulture, selective harvesting consists in harvesting two qualities of grapes separately within the same vineyard. For this, a grape harvester with two hoppers is available, and the location of the quality zones as well as an estimate of the quantities to harvest are known. The problem consists in optimizing the route of the harvester while respecting numerous constraints, such as the processing direction of the rows, the storage capacity of the machine, its turning radius, the number of unloadings, etc. After formalizing the selective harvesting problem, this article presents a model in the form of a constraint satisfaction problem. The last section gives preliminary results obtained with the AbsCon solver.

20150507  Activity-Based Search for Black-Box Constraint-Programming Solvers [L. Michel and P. Van Hentenryck] (Olivier Sans) 
16:00 (E 324)  Robust search procedures are a central component in the design of black-box constraint-programming solvers. This paper proposes activity-based search, the idea of using the activity of variables during propagation to guide the search. Activity-based search was compared experimentally to impact-based search and the wdeg heuristic. Experimental results on a variety of benchmarks show that activity-based search is more robust than other heuristics and may produce significant improvements in performance. 
20150429  Interval arithmetic (Nathalie Revol) 
16:00 (E 324)  This talk will begin with a brief introduction to interval arithmetic;
Newton's algorithm will serve as an example to illustrate its specificities, namely
the precautions to take when writing a mathematical formula involving
intervals, and the guarantees one can obtain on the computed result:
an enclosure of the result, but also proofs of existence and uniqueness.
In order to share the benefits of computations using interval arithmetic,
a standardization effort, launched in 2008 and expected to culminate in
2015 with the IEEE 1788 standard, will be presented.
However, the computed intervals are sometimes enclosures of the sought
values that are far too wide. To remedy in particular the problem of
decorrelation of variables (known as "variable dependency"), variants
such as Taylor models or affine arithmetic exist and will be discussed.
Finally, a more algorithmic approach will be presented on
linear algebra problems: matrix products, solving linear systems. 
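The variable dependency problem mentioned above is easy to demonstrate with a minimal interval type. The sketch below is illustrative only (it ignores outward rounding and everything else IEEE 1788 specifies); it shows that the naive subtraction rule cannot see that two occurrences of x denote the same quantity.

```python
# Minimal interval type -- an illustrative sketch only, NOT IEEE 1788
# compliant (no outward rounding, no empty or unbounded intervals).

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Naive rule: [a, b] - [c, d] = [a - d, b - c].
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.0, 2.0)
# The enclosure is guaranteed, but far from tight: x - x yields [-1, 1]
# instead of [0, 0], because the two occurrences of x are treated as
# independent quantities -- the dependency problem.
d = x - x
print(d)  # [-1.0, 1.0]
```

Taylor models and affine arithmetic, mentioned in the abstract, are precisely ways of tracking such correlations between occurrences of the same variable.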
20150424  Multi-Armed Bandits for Adaptive Constraint Propagation (Anastasia Paparrizou) 
15:30 (E 324)  Adaptive constraint propagation has recently received
great attention. It allows a constraint solver to exploit various levels
of propagation during search, and in many cases it shows better
performance than static/predefined propagation. The crucial point is to make
adaptive constraint propagation automatic, so that no expert
knowledge or parameter specification is required. In this work,
we propose a simple learning technique, based on multi-armed
bandits, that allows the solver to automatically select among several levels
of propagation during search. Our technique enables the combination
of any number of levels of propagation, whereas existing techniques
are only defined for pairs. An experimental evaluation demonstrates
that the proposed technique results in a more efficient and stable solver. 
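As a rough sketch of the idea (not the authors' algorithm), the classical UCB1 bandit rule can arbitrate among propagation levels during search. The level names, the reward model and the simulated success rates below are all invented for illustration; a real solver would reward an arm for strong pruning achieved at low propagation cost.

```python
import math
import random

LEVELS = ["forward-checking", "AC", "maxRPC"]  # hypothetical arms

counts = {lvl: 0 for lvl in LEVELS}
total_reward = {lvl: 0.0 for lvl in LEVELS}
t = 0  # number of selections made so far

def select_level():
    # Play each arm once, then pick the arm maximizing the UCB1 bound:
    # mean reward + sqrt(2 ln t / n_arm).
    for lvl in LEVELS:
        if counts[lvl] == 0:
            return lvl
    return max(LEVELS, key=lambda lvl: total_reward[lvl] / counts[lvl]
               + math.sqrt(2 * math.log(t) / counts[lvl]))

def update(lvl, reward):
    global t
    t += 1
    counts[lvl] += 1
    total_reward[lvl] += reward

random.seed(0)
# Simulated search: pretend "AC" tends to give the best pruning/cost
# trade-off, with a Bernoulli reward at each node.
for _ in range(200):
    lvl = select_level()
    mean = {"forward-checking": 0.3, "AC": 0.7, "maxRPC": 0.5}[lvl]
    update(lvl, 1.0 if random.random() < mean else 0.0)
```

Over time the bandit concentrates its selections on the level with the best observed reward, while still occasionally exploring the others.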
20150421  Packing Curved Objects (Gilles Chabert) 
11:00 (E 324)  We will present a generic approach based on continuous constraint programming for packing two-dimensional objects of quite arbitrary shapes, including non-convex curved shapes and assemblies of them. There has been considerable work on packing objects but, most of the time, with specific shapes (rectangles or "bins", circles, polygons, ...). Nothing has really been done for the general case where different arbitrary shapes can be mixed together. More precisely, we address the wide class of objects described by nonlinear inequalities.
Our approach can be seen as a three-layered framework. The top layer uses a stochastic optimization algorithm with a fitness function that gives a global violation cost and equals zero when objects are all packed. The second layer decomposes the global function as a sum of elementary functions that measure the overlapping between each pair of different objects. The third layer calculates the set of all positions and orientations that can be assigned to a given shape so that the intersection with a second one is empty. This non-overlapping constraint is processed by a branch & prune algorithm that resorts to two original algorithms that we will present. We will show some experimental results and give the ideas we are currently exploring for extending our approach to objects described by parametric curves. 
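The decomposition used by the second layer can be sketched with discs standing in for the arbitrary curved shapes of the talk. The penetration-depth overlap measure below is an illustrative choice, not the paper's: the point is only that the global violation cost is a sum of pairwise terms that vanishes exactly when no two objects overlap.

```python
import math

def pair_overlap(c1, c2):
    """Penetration depth between two discs (x, y, r); > 0 iff they overlap."""
    (x1, y1, r1), (x2, y2, r2) = c1, c2
    dist = math.hypot(x2 - x1, y2 - y1)
    return max(0.0, r1 + r2 - dist)

def global_violation(circles):
    # Second layer: the global cost is the sum of elementary pairwise terms.
    return sum(pair_overlap(circles[i], circles[j])
               for i in range(len(circles))
               for j in range(i + 1, len(circles)))

packed    = [(0, 0, 1), (2, 0, 1), (1, 2, 1)]   # pairwise tangent or apart
violating = [(0, 0, 1), (1, 0, 1)]              # overlap of depth 1
print(global_violation(packed))     # 0.0: all objects packed
print(global_violation(violating))  # 1.0
```

A top-layer stochastic optimizer would then move and rotate the objects so as to drive this cost to zero.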
20150410  Fault localization using data mining under constraints (Nadjib Lazaar) 
15:30 (E 324)  
20150327  Learning How to Learn Constraints with Generalization Queries (Abderrazak Daoudi) 
15:30 (E 324)  Generalization queries require an aggregation of variables into types.
This aggregation of variables is provided by the user. However,
in some cases the user is not able to provide the types. In this work,
we propose several techniques that take advantage of generalization
even when no knowledge of variable types is provided.
The main idea is to analyse the partial graph of learned constraints.
The inference of potential types is based on the detection of quasi-cliques
and quasi-bicliques. We evaluate our techniques on some benchmarks. 
20150320  Contribution to the parameter elicitation in multicriteria optimization (Noureddine Aribi) 
15:30 (E 324)  Many methods exist for solving multicriteria optimization problems,
and it is not easy to choose the right method, well adapted to a given multicriteria problem.
Even after choosing a multicriteria method, various parameters (e.g. weights, utility functions, etc.)
must be carefully determined, either to find the optimal solution (best compromise) or to rank
all feasible solutions (the set of alternatives).
To overcome this difficulty, elicitation methods
are used to help the decision maker fix the parameters safely.
We assume that we have a set of feasible solutions, and we also
assume that we have prior information about the preferences of the decision maker;
we focus on how to use this information rather than how to obtain it.
In this work, we take advantage of a simple and quickly computable statistical measure, namely
the Spearman rank correlation coefficient, to develop a greedy approach and two exact approaches
based on constraint programming (CP) and mixed integer programming (MIP).
These methods are then used to automatically elicit the appropriate parameters
of the lexicographic ordering method.
We also propose elicitation models for other multicriteria methods,
such as OWA operators, the weighted sum method, and Leximin.
These elicitation approaches are based on solving max-CSP models.
The parameters we compute ensure that the chosen method best captures
the preferences expressed by the decision maker.
Finally, we propose a variant of a CP-based model of Leximin, including a symmetry-breaking algorithm that yields better experimental results. 
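The statistical workhorse of this approach, Spearman's rank correlation coefficient, is easy to illustrate. For rankings without ties it reduces to the closed formula rho = 1 - 6·Σd_i² / (n(n² - 1)), where d_i is the difference between the ranks of item i in the two orderings; this sketch only illustrates the measure itself, not the elicitation models built on it.

```python
def spearman_rho(ranking_a, ranking_b):
    """Spearman's rho between two tie-free rankings.

    ranking_a, ranking_b: lists giving the rank of each item in each
    ordering (same items, same positions, no ties).
    """
    n = len(ranking_a)
    d2 = sum((a - b) ** 2 for a, b in zip(ranking_a, ranking_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Identical rankings agree perfectly ...
print(spearman_rho([1, 2, 3, 4], [1, 2, 3, 4]))   # 1.0
# ... and reversed rankings disagree maximally.
print(spearman_rho([1, 2, 3, 4], [4, 3, 2, 1]))   # -1.0
```

In the elicitation setting, rho can score how well a candidate parameterization reproduces the ranking implied by the decision maker's stated preferences.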
20150313  On the complexity of Clustering problems (Joel Quinqueton) 
15:30 (E 324)  We present results about the combinatorial aspects of attempting to deal with clustering problems using a CSP solver. 
20150304  XCSP3: An Integrated Format for Benchmarking Combinatorial Constrained Problems (Christophe Lecoutre) 
15:00 (E 323)  We propose a major revision of the format XCSP 2.1, called XCSP3, to build integrated representations
of combinatorial constrained problems. This new format is able to deal with
mono/multi-objective optimization, many types of variables, cost functions, reification, views, annotations,
variable quantification, and distributed, probabilistic and qualitative reasoning. The new
format is made compact, highly readable, and rather easy to parse. Interestingly, it partly
captures the structure of the problem models, through the possibility of declaring arrays of
variables and identifying syntactic and semantic groups of constraints. The number of constraints
is kept under control by introducing a set of approximately 50 basic constraint forms,
and producing almost automatically some of their variations through lifting, restriction, sliding,
logical combination and relaxation mechanisms. As a result, XCSP3 encompasses practically
all constraints that can be found in major constraint solvers developed by the CP community.
A website, developed conjointly with the format, is expected to soon contain
many models and series of instances. The user will be able to make sophisticated
queries, selecting instances from very precise criteria. The objective of XCSP3 is to ease
the effort required to test and compare different algorithms by providing a common testbed
of combinatorial constrained instances. 
20150220  Node Selection Strategies in Interval Branch and Bound Algorithms: zoom on Feasible Diving (Gilles Trombettoni) 
15:30 (E 324)  The talk focuses on a strategy achieving a trade-off between intensification and diversification for node selection in interval Branch and Bound algorithms. A description of Feasible Diving is given for intensifying the search for a feasible solution with a good cost at each node of a standard best-first search. A similar technique is used, though not really detailed in any publication, in the CPLEX MIP solver. To clearly explain the strategy, I will describe the current and (probably) future scheme of our IbexOpt constrained global (continuous) optimizer. 
20150123  Preference Elicitation in Recommender Systems (Paolo Viappiani (LIP6, France)) 
14:00 (séminaires)  Many applications, such as decision support systems and recommender systems, need to reason about the user's preferences. However, acquiring the preferences of the user is a challenging task for a number of reasons. First, elicitation of user preferences is usually expensive (w.r.t. time, cognitive effort, etc.). Second, many decision problems have large outcome or decision spaces. Third, users are inherently "noisy" and inconsistent. Since classical elicitation paradigms from decision theory are impractical for modern software applications, a number of approaches for effectively eliciting preferences have recently been proposed in the artificial intelligence and machine learning communities. This talk, after an introduction to recommender systems, will review the most prominent techniques for preference elicitation (also called preference learning in the machine learning community), including learning from ranked data. We will put a particular emphasis on interactive methods that aim at asking the most informative questions in order to make good (or even optimal) recommendations with sparse knowledge of the user's utility function. 
20150116  Constraint programs: from acquisition to validation (Nadjib Lazaar) 
14:00 (bat52/124)  Constraint programs are declarative programs written in high-level constraint languages such as OPL (Optimization Programming Language), CHOCO, GECODE or ZINC. These programs are more and more used in the design of safety-critical or business-critical systems, with applications to, for example, high-frequency trading, air-traffic control, car manufacturing and hardware verification. However, i) fair expertise is needed to write a constraint program and, ii) like any other computer program, constraint programs must be thoroughly tested and corrected before being used in operational mode. My research interests are mainly related to these two questions. In my presentation, I will give an overview of previous, ongoing and future work on constraint acquisition and constraint validation. 
20150116  Modularization of deductive RDF triplestores (Federico Ulliana) 
14:45 (bat52/124)  Reusing modules extracted from well-established knowledge bases allows us to capitalize on the efforts made in recent years by many institutions, organizations, and domain experts to publish quality information and make it available online. Today, this practice is crucial to ensure a coherent development of the Semantic Web.
I will present a novel approach to the extraction of modules of data and knowledge from RDF Triplestores augmented with safe inference rules, à la Datalog. Dealing with a recursive rule language poses challenging issues for defining the module semantics, and also makes module extraction algorithmically unsolvable in some cases. I will present a set of module extraction algorithms compliant with the novel semantics, and experimental results showing the succinctness of the modules extracted with this approach. 
20150116  Development of Optimal and Adaptive Strategy Using Qlearning (Kostandina Veljanovska) 
16:00 (E 324)  Reinforcement learning is a machine learning technique which can work without supervision. It is goal-directed learning from interaction with an environment: the agent learns what to do, i.e., how to map situations to actions, in order to maximize a numerical reward signal. Reinforcement learning agents are implemented as controllers in order to provide optimal control. In my talk I will present the technique used to implement the control strategy. 
20150115  Towards a global constraint for the itemset mining problem (Nadjib Lazaar) 
15:30 (E 324)  
20150109  Constraintbased CPnets (Gaelle Hisler) 
15:30 (E 324)  Gaelle will give us a talk based on two papers [Prestwich et al., AAAI05] and [Domshlak et al., Heuristics06] 
20141219  Impact-Based Search Strategies for Constraint Programming [P. Refalo, CP'04] (Olivier Sans) 
15:30 (E 324)  A key feature of constraint programming is the ability to design
specific search strategies to solve problems. On the contrary, integer
programming solvers have used efficient general-purpose strategies since
their earliest implementations. We present a new general-purpose search
strategy for constraint programming inspired by integer programming
techniques and based on the concept of the impact of a variable. The impact
measures the importance of a variable for the reduction of the search
space. Impacts are learned from the observation of domain reduction during
search, and we show how restarting search can dramatically improve
performance. Using impacts for solving multiknapsack, magic square,
and Latin square completion problems shows that this new criterion for
choosing variables and values can outperform classical general-purpose
strategies. 
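The impact measure is concrete enough to sketch. In the paper the impact of an assignment derives from the reduction of the search-space estimate (the product of domain sizes) observed after propagation; the toy domain store below (variable names and domains invented) only illustrates that definition, not the full strategy with learning and restarts.

```python
from functools import reduce

def search_space(domains):
    """Estimate of the search space: product of the domain sizes."""
    return reduce(lambda acc, dom: acc * len(dom), domains.values(), 1)

def impact(domains_before, domains_after):
    # 1 - P_after / P_before: 0 means propagation reduced nothing,
    # values near 1 mean the assignment almost wiped out the search space.
    return 1.0 - search_space(domains_after) / search_space(domains_before)

# Toy example: assigning x also prunes y and z during propagation.
before = {"x": {1, 2, 3}, "y": {1, 2, 3}, "z": {1, 2, 3}}   # 27 nodes
after  = {"x": {1},       "y": {1, 2},    "z": {2, 3}}      # 4 nodes
print(impact(before, after))  # 1 - 4/27, roughly 0.85
```

A variable-ordering heuristic can then branch first on the variable whose past assignments had the largest average impact.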
20141212  Node Edge Arc Routing Problem (Nicolas Briot) 
15:30 (E 324)  The aim is to present the Capacitated General Windy Routing Problem and its variants. This new problem subsumes many important and well-known arc and node routing problems. Arc and node routing problems consist of finding a set of routes covering certain arcs, edges and/or vertices of a graph, which meet certain conditions. These problems allow us to model and to solve many real-life problems, especially in the context of transportation, distribution management and scheduling.

20141211  Handling uncertain data dependency constraints (Carmen Gervet) 
11:00 (séminaire)  Uncertain data due to imprecise measurements is commonly specified as bounded intervals in a constraint decision or optimization problem. Dependencies do exist among such data, e.g. an upper bound on the sum of uncertain production rates per machine, or the sum of traffic distribution ratios from a router over several links. For tractability reasons, most approaches in optimization and constraint-based reasoning assume independence of the data. Assuming independence is safe, but leads to large solution spaces and a loss of problem structure; thus the issue cannot be overlooked. In this presentation we present two approaches we developed to embed and tackle such dependencies effectively.
The first one combines the strengths of two frameworks, namely constraint programming and regression analysis. We will show how we can exploit the strengths of both paradigms, and provide valuable insights to the decision maker by relating solutions to uncertain data satisfying dependency constraints. We also show the insights we learnt from running a regression: the solving of dependency constraints needs to take into account the possible values of the relevant decision variables.
Our second approach addresses this particular issue efficiently. We consider dependency constraints where the dependencies are drawn from a matrix model. Matrix models are linear models whereby the data imprecision applies to the cells of the matrix and the output vector. We combine consistency techniques from constraint programming and modeling techniques from mathematical programming to derive a new model for uncertain data dependency constraints. We will develop and illustrate the strengths of this approach: 1) a novel perspective on data dependency constraints, 2) an efficient modeling approach that suits solvers from multiple paradigms. 
20141127  Adaptive constraint propagation (Anastasia Paparrizou) 
15:00 (1.4)  Constraint propagation algorithms vary in the strength of propagation they enforce. Selecting the appropriate propagator is an essential problem in CP and has therefore attracted a lot of attention. The adaptive techniques that have been proposed to tackle this problem are mainly based on either ML or heuristic approaches. In my talk, I will present an overview of existing work in both directions. 
20141031  Node Selection Heuristics Using the Upper Bound in Interval Branch and Bound (Gilles Trombettoni) 
15:30 (E 324)  We present a new strategy for selecting the current
node in an interval Branch and Bound algorithm for constrained
global optimization. The standard best-first strategy selects the
node with the lowest lower bound of the objective
estimate.
We propose new node selection policies where an
upper bound of each node/box is also taken into
account. The good accuracy of this upper bound, achieved by
several operators, leads to a good performance of the criterion.
These new strategies obtain better experimental results than
classical best-first search on difficult instances. 
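A minimal sketch of the node-selection machinery discussed above, assuming a priority queue keyed on the lower bound. The `alpha` weighting that folds an upper-bound estimate into the key is a made-up placeholder to show where such a refinement would plug in, not the paper's criterion.

```python
import heapq

def make_queue():
    return []

def push(queue, lower_bound, upper_bound, box, alpha=0.0):
    # Classical best-first: key = lower bound (alpha = 0.0).
    # A refinement could mix in an upper-bound estimate of the box,
    # e.g. key = lower_bound + alpha * upper_bound (placeholder formula).
    heapq.heappush(queue, (lower_bound + alpha * upper_bound, box))

def pop_best(queue):
    """Remove and return the box with the smallest selection key."""
    return heapq.heappop(queue)[1]

q = make_queue()
push(q, 3.0, 9.0, "box A")
push(q, 1.0, 7.0, "box B")
push(q, 2.0, 4.0, "box C")
print(pop_best(q))  # box B: the lowest lower bound is explored first
```

With `alpha > 0`, boxes whose upper bound is also small (promising for improving the incumbent) would be favored over boxes that merely have a small lower bound.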
20141017  a CP framework for Constrained Clustering (Joël Quinqueton) 
15:30 (E 324)  Joel will talk to us about the papers by Christel Vrain et al. on a CP framework for Constrained Clustering [ICTAI14, PKDD14]. 
20141010  Buffered Resource Constraint: Algorithms and Complexity. CPAIOR14 (Christian Bessiere) 
15:30 (E 323)  The notion of buffered resource is useful in many problems.
A buffer contains a finite set of items required by some activities, and changing the content of the buffer is costly. For instance, in instruction scheduling, the registers are a buffered resource and any switch of registers has a significant impact on the total runtime of the compiled code.
We first show that sequencing activities to minimize the number of switches in the buffer is NP-hard. We then introduce an algorithm which, given a set of already sequenced activities, computes a buffer assignment that minimizes the number of switches in linear time, i.e., O(nd), where n is the length of the sequence and d the number of buffered items.
Next, we introduce an algorithm to achieve bound consistency on the constraint Switch, which bounds the number of changes in the buffer, in O(n^2 d + n^1.5 d^1.5) time. Finally, we report the results of experimental evaluations that demonstrate the efficiency of this propagator. 
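For intuition on the assignment step, the closely related offline paging problem (a fixed request sequence and a buffer of capacity k, minimizing the number of loads) is solved optimally by the classical furthest-in-future eviction rule. The sketch below implements that rule; it illustrates the flavor of the problem, not the authors' O(nd) algorithm.

```python
def count_switches(sequence, capacity):
    """Buffer switches (loads) under furthest-in-future eviction."""
    buffer, switches = set(), 0
    for i, item in enumerate(sequence):
        if item in buffer:
            continue  # already buffered: no switch needed
        if len(buffer) == capacity:
            # Evict the buffered item whose next use is furthest away
            # (or that is never used again).
            def next_use(x):
                rest = sequence[i + 1:]
                return rest.index(x) if x in rest else len(sequence)
            buffer.discard(max(buffer, key=next_use))
        buffer.add(item)
        switches += 1
    return switches

print(count_switches(["a", "b", "c", "a", "b", "d", "a"], 2))  # 5
```

Here the initial fills of the empty buffer are counted as switches; five loads is optimal for this sequence with a two-slot buffer.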
20140929  Introduction to computational social choice (Jérôme Lang  Université Paris Dauphine) 
14:00 (E 324)  Social choice theory aims at the construction and analysis of methods
for collective decision making; it is an important branch of mathematical
economics. Here are a few typical examples of collective decisions:
elections of political representatives; informal votes (for example, a
group of friends deciding on the choice of a restaurant); fair division of
resources (for example, dividing goods between ex-spouses in a divorce
settlement, or assigning classes and time slots in a high school); reaching
consensus on a verdict in a jury trial. Until now, social choice theorists
have paid little attention to algorithmic questions. This is where computer
science, and more specifically artificial intelligence, comes into play.
Over the last decade or so, a research field has thus developed at the
crossroads of social choice and computer science, called "computational
social choice". Two research directions can be distinguished. The first
aims at importing concepts and procedures from social choice theory to
solve problems arising in computer science applications, notably
aggregation procedures for ranking web pages. The second (and the more
important one) aims at using notions and methods from computer science to
solve, or rethink, complex collective decision problems. The talk will
review some of the key topics of computational social choice, notably the
exact or approximate computation of complex voting rules, voting over
combinatorial domains, computational barriers to strategic behavior, and
the design of interaction protocols for collective decision making. 
20140620  A word about contractors in and beyond numerical CSPs (Gilles Trombettoni) 
15:00 (E 324)  I will introduce the notion of contractor proposed by Chabert and Jaulin in 2009. We will discuss the differences and similarities between contractors, filtering algorithms and global constraints. 
20140606  Reformulating CSPs for Scalability with Application to Geospatial Reasoning [Ken M. Bayer et al.] (Berthe Choueiry) 
15:30 (E 324)  While many real-world combinatorial problems can be advantageously modeled
and solved using Constraint Programming, scalability remains a major issue
in practice. Constraint models that accurately reflect the inherent structure
of a problem, solvers that exploit the properties of this structure, and
reformulation techniques that modify the problem encoding to reduce the cost
of problem solving are typically used to overcome the complexity barrier.
In this paper, we investigate such approaches in a geospatial reasoning task,
the building-identification problem (BID), introduced and modeled as a
Constraint Satisfaction Problem by Michalowski and Knoblock [1].
We introduce an improved constraint model, a custom solver for this problem,
and a number of reformulation techniques that modify various aspects of the
problem encoding to improve scalability. We show how interleaving these
reformulations with the various stages of the solver allows us to solve
much larger BID problems than was previously possible. Importantly, we
describe the usefulness of our reformulation techniques for general
Constraint Satisfaction Problems, beyond the BID application. 
20140523  Q-intersection Algorithms for Constraint-Based Robust Parameter Estimation (Philippe Vismara) 
15:30 (E 324)  Given a set of axis-parallel n-dimensional boxes, the
q-intersection is defined as the smallest box encompassing all
the points that belong to at least
q boxes. Computing the q-intersection is a combinatorial problem that allows us to
handle robust parameter estimation with a numerical constraint
programming approach. The q-intersection can be viewed as
a filtering operator for soft constraints that model measurements subject to outliers.
This paper highlights the equivalence of this operator with the search for q-cliques in a graph
whose boxicity is bounded by the number of variables in the
constraint network. We present a computational study of the
q-intersection. We also propose a fast heuristic and a sophisticated exact q-intersection algorithm. First experiments show
that our exact algorithm outperforms the existing one, while
our heuristic performs efficient filtering on hard problems. 
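To illustrate the definition in one dimension (the paper handles n-dimensional boxes via q-cliques), here is a hedged sketch computing the q-intersection of closed intervals with a sweep over endpoint events: the smallest interval enclosing every point covered by at least q of the inputs.

```python
def q_intersection_1d(intervals, q):
    """Smallest interval enclosing all points covered by >= q intervals,
    or None if no point reaches coverage q. Intervals are closed [lo, hi]."""
    events = []
    for lo, hi in intervals:
        events.append((lo, 1))    # interval opens
        events.append((hi, -1))   # interval closes
    # At equal coordinates, process opens before closes (closed intervals).
    events.sort(key=lambda e: (e[0], -e[1]))
    depth, best_lo, best_hi = 0, None, None
    for x, delta in events:
        if delta == 1:
            depth += 1
        # 'depth' now equals the number of intervals covering point x.
        if depth >= q:
            best_lo = x if best_lo is None else min(best_lo, x)
            best_hi = x if best_hi is None else max(best_hi, x)
        if delta == -1:
            depth -= 1
    return None if best_lo is None else (best_lo, best_hi)

boxes = [(0, 4), (2, 6), (3, 8), (10, 12)]
print(q_intersection_1d(boxes, 2))  # (2, 6)
```

In the parameter-estimation setting each interval is a measurement; the q-intersection keeps only the region consistent with at least q of them, tolerating up to n - q outliers.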
20140430  Boosting Constraint Acquisition via Generalization (Abderrazak Daoudi) 
15:30 (E 324)  Constraint acquisition assists a non-expert user in modeling
her problem as a constraint network. In existing constraint acquisition
systems the user is only asked to answer very basic questions.
The drawback is that when no background knowledge is provided,
the user may need to answer a great number of such questions to
learn all the constraints. In this paper, we introduce the concept of
generalization query based on an aggregation of variables into types.
We present a constraint generalization algorithm that can be plugged
into any constraint acquisition system. We propose several strategies
to make our approach more efficient in terms of number of queries.
Finally we experimentally compare the recent QUACQ system to an
extended version boosted by the use of our generalization functionality.
The results show that the extended version dramatically improves
on the basic QUACQ. 
20140425  Learning How to Propagate Using Random Probing (Amine Balafrej) 
15:30 (E 323)  Amine will talk to us about the paper by E. Stamatatos and K. Stergiou on "Learning How to Propagate Using Random Probing" (CPAIOR'09). 
20140411  A Computational Method for (MSS,CoMSS) Partitioning (JeanMarie Lagniez) 
14:30 (E 323)  The concepts of MSS (Maximal Satisfiable Subset) and CoMSS (also called Minimal Correction Subset) play a key role in many A.I. approaches and techniques. In my presentation, a novel algorithm for partitioning a Boolean CNF formula into one MSS and the corresponding CoMSS will be introduced. Extensive empirical evaluation shows that it is more robust and more efficient on most instances than currently available techniques. 
20140404  Wireless Sensor Networks: a hybrid LP+CP scheme for optimal solving (Eric Bourreau) 
  In this presentation we consider the design of an energy-efficient operation scheme
in wireless sensor networks (WSNs) in order to maximize network lifetime. We address
target coverage with sensors used for sensing and sending data to a
base station through multi-hop communication. For this purpose, a column
generation algorithm (LP) exploiting a Constraint Programming (CP) approach to tackle the
pricing subproblem is used to obtain optimal solutions. Global constraints on both integer
variables (‘gcc’) and graph variables (‘trees’) are used. 
20140404  Strong Bounds Consistencies (Anastasia Paparrizou) 
 Arc consistency (AC) and bounds consistency (BC) are the
standard local consistencies used for propagation in CP
solvers. Although many strong local consistencies that extend
AC have been proposed, similar consistencies based
on BC have been overlooked. We study relational Path Inverse
Bounds Consistency (rPIBC) and Pairwise Bounds
Consistency (PWBC), two new local consistencies that extend
BC by simultaneously considering combinations of
constraints as opposed to single constraints. We show that
rPIBC and PWBC are stronger than BC, but they are NP-hard
to enforce even when constraints are linear. We propose
two polynomialtime techniques to enforce approximations
of these two consistencies on linear constraints. One
is a reformulation of the constraints on which we enforce
BC whereas the other is a polynomial time algorithm. Both
achieve stronger pruning than BC. Our experiments on randomly
generated problems and on problems used for the evaluation
of matchmakers in Web Services show large differences
in favor of our approaches. 
20140328  An interval-based approach for graph mining (Yahia Lebbah) 
 Canonical encoding is one of the key operations required by frequent subgraph
mining algorithms for the candidate generation. Existing approaches make
use of canonical encodings with an exponential time complexity. As a consequence,
mining all frequent patterns for large and dense graphs is computationally
expensive. In this talk, we propose a relaxation of the canonicity
property, leading to two encodings, lower and upper encodings, with a
polynomial time complexity, allowing to tightly enclose the exact set of frequent
subgraphs. These two encodings have been integrated in the state of the art
Gaston miner. Experiments performed on large dense synthetic and real graph
datasets show that, our two encodings are very effective compared to state of
the art solvers, while on large real sparse graph datasets they remain very competitive. 
20140327  Our Recent/Current Works about Sliced Table Constraints and Subgraph Isomorphism (Christophe Lecoutre) 
14:00 (E 324)  In this talk, we shall describe our recent developments concerning the compression of table constraints (using a data-mining algorithm) and the combination of (known) techniques for solving the subgraph isomorphism problem. 
20140321  How to boost AI and constraint programming to tackle real-world combinatorial problems: set objects (Carmen Gervet) 
  Since the beginning of Constraint Programming (CP), its extension to tackle large-scale and real-world problems has been an active and successful field of research. In this presentation, we first present our work to extend CP to reason with sets, as subsets of a larger known set. Sets are natural and fundamental combinatorial objects to model a wide range of configuration problems like network design, set packing and covering, as well as allocation problems. Set variables raise fundamentally novel representation issues since their domains often contain an exponential number of sets. How to represent this domain and how to use this representation for pruning the search space effectively is the issue we tackle.
Secondly, we address the problem of optimization in the presence of imprecise/incomplete data, ubiquitous in real-world decision and optimization applications (e.g. network flow analysis, and the economics of renewable energies). Interval parameters are effectively used in Operations Research and CP to specify such uncertain data, in order to provide reliable solutions to convex models. A reliable output is a set of solutions guaranteed to contain all solutions possible under any realization of the data. It is safe but often too large to be meaningful to the decision-maker. This raises the issue of embedding more problem structure in the representation of imprecise data while retaining tractability of the model. We consider two main aspects: how to represent a possible degree of knowledge of the data, and how to account for dependencies among data in a tractable manner.

20140321  Incremental SAT solving (Gilles Audemard) 
14:00 (séminaire)  Recent advances in the practical solving of
the SAT problem have had repercussions well beyond its borders.
Thus, at present, many problems whose class d 
20140221  Declarative approaches for data mining (Lakhdar Sais) 
  This talk surveys our contributions to data mining and, more generally, to the cross-fertilization between data mining, constraint programming and propositional satisfiability (http://www.cril.univartois.fr/decMining/). This work was supported by the ANR DAG project "Declarative approaches for the enumeration of interesting patterns" (http://liris.cnrs.fr/dag/). 
20140214  Solve a Constraint Problem Without Modeling It (Nadjib Lazaar) 

We study how to find a solution to a constraint problem
without modeling it. Constraint acquisition systems
such as Conacq or ModelSeeker are not able to solve a
single instance of a problem because they require positive
examples to learn. The recent QuAcq algorithm
for constraint acquisition does not require positive examples
to learn a constraint network. It is thus able to
solve a constraint problem without modeling it: we simply
exit from QuAcq as soon as a complete example is
classified as positive by the user. In this paper, we propose
ASK&SOLVE, an elicitation-based solver that tries
to find the best tradeoff between learning and solving to
converge as soon as possible on a solution. We propose
several strategies to speed up ASK&SOLVE. Finally we
give an experimental evaluation that shows that our approach
improves the state of the art. 
20140207  Adaptive (i,j)consistency via hypertree decomposition (Younes Mechqrane) 
  State-of-the-art constraint solvers can fail to solve some instances
that are surprisingly hard for them. One way to improve these
algorithms would be to prevent them from searching deep into the search
tree before realizing they made bad decisions. If there is no way to
analyse failures, revising the wrong decisions can be extremely
time-consuming. Hypertree decomposition methods decompose a CSP into a tree
of clusters (subsets of variables). When the order of the variables is
compatible with a depth first traversal of the cluster tree, we know
that once the variables appearing in the same separator are all
instantiated, the subproblem defined by the subtree below the
separator is independent from the rest of the problem. To minimize bad
decisions, we propose that each time a separator has been completely
instantiated, we check the extensibility of the current partial
solution on the clusters that are immediately below. If it turns out
that the current partial solution cannot be extended, then we know
that the projection of the partial solution on the separator is a
nogood. These learned nogoods are a relaxation of (i,j)-consistency,
where i is the size of the separator. A learned nogood immediately
allows some form of backjumping and later helps to decrease
thrashing. To concentrate our efforts on clusters that are more likely
to produce nogoods, we use machine learning techniques to
automatically learn the clusters that are most likely to be
unsatisfiable. 
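As a toy illustration (not the authors' implementation), the separator check described above can be sketched as follows; the function name, the dictionary-based encoding and the brute-force extensibility test are all illustrative assumptions:

```python
from itertools import product

def separator_nogood(assignment, separator, cluster_vars, domains, constraints):
    """If the partial assignment (which instantiates the separator) cannot be
    extended to the cluster below, return its projection on the separator as
    a nogood; otherwise return None. Brute force, for illustration only."""
    free = [v for v in cluster_vars if v not in assignment]
    for values in product(*(domains[v] for v in free)):
        candidate = dict(assignment)
        candidate.update(zip(free, values))
        if all(check(candidate) for check in constraints):
            return None  # extensible: no nogood to learn
    # inextensible: the separator projection is a nogood
    return {v: assignment[v] for v in separator}

# Toy cluster: variables x, y with the constraint x != y
domains = {"x": [0, 1], "y": [0, 1]}
constraints = [lambda a: a["x"] != a["y"]]
print(separator_nogood({"x": 0}, ["x"], ["x", "y"], domains, constraints))  # None (y=1 extends it)
```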
20140117  Tractable Combinations of Global Constraints (Robert Woodward) 
 Robert will tell us about the paper "Tractable Combinations of Global Constraints" by David A. Cohen, Peter G. Jeavons, Evgenij Thorstensen, and Stanislav Živný, from CP 2013. 
20140110  Various kinds of co-clustering as CSP (Joel Quinqueton) 
 We present several CSP models of co-clustering and report some experiments with these models. 
20131220  ICTAI13 overview (Gilles Trombettoni) 
 
20131213  Solving Distributed Constraint Optimization Problem (Mohamed Wahbi) 
 The Distributed Constraint Optimization Problem (DCOP) is a powerful framework for modeling and solving applications in multi-agent coordination. The presentation is about AFB-BJ+, a revisited version of the Asynchronous Forward Bounding algorithm (one of the best algorithms for solving DCOPs). In AFB-BJ+, we refine the lower bound computations. We also propose to compute lower bounds for the whole domain of the last assigned agent instead of only doing so for its current assignment. In addition, these lower bounds can be used as a value ordering heuristic in AFB-BJ+. 
20131206  Learning propagation policies (Amine Balafrej) 
 Amine will tell us about the work done in [S. Epstein, E. Freuder, R. Wallace, and X. Li, 2005] on "Learning propagation policies". 
20131121  Constraint programming applied to mobile robotics (Luc Jaulin, ENSTA-Bretagne/Lab-STICC, Brest) 
 Many problems in mobile robotics (trajectory planning, localization, mapping) can be modeled as constraints, i.e., as a CSP (Constraint Satisfaction Problem). In robotics, the variables of these CSPs are of an extremely varied nature compared to what is classically considered in the CP (Constraint Programming) community. 
20131108  Localizing & Bolstering Constraint Propagation in a Tree Decomposition (Berthe Choueiry) 

The tractability of a Constraint Satisfaction Problem (CSP) is guaranteed by a direct relationship between its consistency level and a structural parameter of its constraint network, such as the treewidth. This result is not widely exploited in practice because enforcing higher-level consistencies can be costly and can change the structure of the constraint network and increase its width. Recently, R(∗,m)C was proposed as a relational consistency property that does not modify the structure of the graph and, thus, does not affect its width. In this paper, we explore two main strategies, based on a tree decomposition of the CSP, for improving the performance of enforcing R(∗,m)C and getting closer to the above tractability condition. Those strategies are: a) localizing the application of the consistency algorithm to the clusters of the tree decomposition, and b) bolstering constraint propagation between clusters by adding redundant constraints at their separators, for which we propose three new schemes. We characterize the resulting consistency properties by comparing them, theoretically and empirically, to the original R(∗,m)C and the popular GAC and maxRPWC, and establish the benefits of our approach for solving difficult problems.
Joint work with Shant Karakashian and Robert Woodward. The presentation is based on an AAAI 2013 paper and material from the doctoral dissertation of Shant Karakashian.

20131025  ICTAI13: Adaptive Constructive Interval Disjunction (Gilles Trombettoni) 
 An operator called CID and an efficient variant 3BCID were proposed in 2007. For numerical CSPs handled by interval methods, these operators compute a partial consistency equivalent to Partition-1-AC for discrete CSPs. The two main parameters of CID are the number of times the main CID procedure is called and the maximum number of sub-intervals treated by the procedure. The 3BCID operator is state-of-the-art in numerical CSP solving, but not in constrained global optimization. This paper presents an adaptive variant of 3BCID. The number of variables handled is auto-adapted during the search; the other parameters are fixed and robust to modifications. On a representative sample of instances, ACID appears to be the best approach in solving and optimization, and has been added to the default strategies of the Ibex interval solver. 
20131025  ICTAI13: A CSP Approach for Metamodel Instantiation (Adel Fedjoukh) 
 This work is a contribution of Artificial Intelligence to Software Engineering. We present a comprehensive approach to metamodel instantiation using CSP. The generation of models which conform to a given metamodel is a crucial issue in Software Engineering, especially when it comes to producing a varied and large dataset of relevant models to test model transformations or to properly design new metamodels. We define an original constraint modeling of the problem of generating a model conforming to a metamodel, also taking into account its additional OCL constraints. The generation process we describe appears to be quicker, more efficient and more flexible than any other state-of-the-art approach.

20131018  Recent Advances in High-Level Relational Consistency (Robert J. Woodward) 
 Consistency properties and algorithms for achieving them are at the heart of the success of Constraint Programming. Most research has focused on properties to be verified by and enforced on the domain values of a Constraint Satisfaction Problem (CSP). We propose new consistency properties and algorithms that apply to the relations of the constraints in a problem. An important aspect of our research is that increasing the level of consistency does not need to modify the topology of the network and may enable backtrack-free search, the holy grail of Constraint Programming.
In this talk, I will discuss two relational consistency properties, R(*,m)C and RNIC, which I will theoretically characterize and empirically evaluate on difficult benchmark problems.
This work is done in collaboration with S. Karakashian, C. Reeson, B.Y. Choueiry, and C. Bessiere. 
20131011  CP13 overview (Christian Bessiere) 
 Christian will summarize and discuss the CP'13 papers. 
20131004  Rostering/planning application (Rémi Coletta) 
 Rémi presents a rostering/planning application for hospitals, proposed by Berger-Levrault (www.bergerlevrault.fr/). 
20130927  Inner Regions and Interval Linearizations for (Continuous) Global Optimization (Gilles Trombettoni) 
 Researchers from the interval analysis and constraint (logic) programming communities have studied intervals for their ability to manage infinite solution sets of numerical constraint systems. In particular, inner regions represent subsets of the search space in which all points are solutions. Our main contribution is the use of recent and new inner region extraction algorithms in the upper bounding phase of constrained global optimization. Convexification is a major key for efficiently lower bounding the objective function. We have adapted the convex interval Taylor form proposed by Lin & Stadtherr to produce a reliable outer and inner polyhedral approximation of the solution set and a linearization of the objective function. Other original ingredients are part of our optimizer, including an efficient interval constraint propagation algorithm exploiting the monotonicity of functions. We end up with a new framework for reliable continuous constrained global optimization. Our interval B&B is implemented in the interval-based explorer Ibex and extends this free C++ library. Our strategy significantly outperforms the best reliable global optimizers. 
20130924  Duality: A Fundamental Feature of Combinatorial Op (John Slaney (14h)) 
 Over the years, there have been many applications of the technique of finding solutions, or optimal solutions, to combinatorial problems by generating hitting sets (or minimal-cost hitting sets) for the solutions to some sort of dual problem. In this talk, I clarify this notion of duality at an abstract level and set out some of its properties. Then I consider an algorithm for applying it in a generic way to a large family of combinatorial optimisation problems. 
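As a hedged sketch of the hitting-set side of this duality (not Slaney's generic algorithm), one can look for a minimum hitting set of a family of dual solutions by brute force; the function name and the toy data are illustrative:

```python
from itertools import chain, combinations

def minimum_hitting_set(families):
    """Smallest set of elements that intersects every set in `families`.
    Exhaustive search by increasing size, for illustration only."""
    universe = sorted(set(chain.from_iterable(families)))
    for size in range(len(universe) + 1):
        for cand in combinations(universe, size):
            if all(set(cand) & s for s in families):
                return set(cand)
    return None  # only reachable if some family member is empty

# Solutions of some toy dual problem; a minimum hitting set of them
# plays the role of a (candidate) solution of the primal problem.
duals = [{1, 2}, {2, 3}, {3, 4}]
print(minimum_hitting_set(duals))  # {1, 3}
```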
20130913  Adaptive Parameterized Consistency (Amine Balafrej) 
 State-of-the-art constraint solvers uniformly maintain the same level of local consistency (usually arc consistency) on all instances. We propose parameterized local consistency, an original approach to adjust the level of consistency depending on the instance and on the part of the instance on which we propagate. We do not use as parameter one of the features of the instance, as is done, for instance, in portfolios of solvers. We use as parameter the stability of values, a feature based on the state of the arc consistency algorithm during its execution. Parameterized local consistencies choose to enforce arc consistency or a higher level of local consistency on a value depending on whether the stability of the value is above or below a given threshold. We also propose a way to dynamically adapt the parameter, and thus the level of local consistency, during search. This approach allows us to get a good tradeoff between the number of values pruned and the computational cost. 
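A minimal sketch of the threshold idea, assuming a toy "stability" measure (a support ratio) that merely stands in for the paper's definition based on the state of the AC algorithm; all names are illustrative:

```python
def choose_levels(dom_x, dom_y, allowed, threshold=0.5):
    """For each value of x, count its supports in y under the binary
    relation `allowed` and pick a consistency level: stable values
    (many supports) get plain AC, fragile ones get a stronger level."""
    levels = {}
    for a in dom_x:
        supports = sum(1 for b in dom_y if (a, b) in allowed)
        stability = supports / len(dom_y)   # toy stability measure
        levels[a] = "AC" if stability >= threshold else "maxRPC"
    return levels

# x, y in {0, 1, 2} with the relation x <= y
allowed = {(a, b) for a in range(3) for b in range(3) if a <= b}
print(choose_levels(range(3), range(3), allowed))  # {0: 'AC', 1: 'AC', 2: 'maxRPC'}
```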
20130913  Constrained Wine Blending (Philippe Vismara) 
 Assemblage consists of blending base wines in order to create a target wine. Recent developments in aroma analysis allow us to measure chemical compounds impacting the taste of wines. This chemical analysis makes it possible to design a decision tool for the following problem: given a set of target wines, determine which volumes must be extracted from each base wine to produce wines that satisfy constraints on aroma concentrations, volumes, alcohol content and price. This talk describes the modeling of wine assemblage as a nonlinear constrained min-max problem (minimizing the gap to the desired concentrations for every aromatic criterion) efficiently handled by the Ibex interval branch and bound. 
20130627  Q-intersection via graphs (Clément Carbonnel) 
 In parameter estimation, one sometimes has to deal with outlier measurements. When the number of outliers can be bounded a priori, a robust resolution strategy has been proposed, but it requires efficiently solving a problem called "q-intersection", which was the subject of my internship. After a quick tour of the state of the art, I will present the results obtained for this problem: complexity, new algorithms and an experimental evaluation. I will also present a small graph-theoretic result related to q-intersection.

20130621  CPTEST: CP for testing CP (Nadjib Lazaar) 
 Constraint programs, such as those written in high-level constraint modeling languages, e.g., OPL (Optimization Programming Language), COMET, GECODE, ZINC or CHOCO, are more and more used in the design of safety-critical or business-critical systems. Applications as diverse as high-frequency trading systems, air-traffic control, car manufacturing systems or hardware verification now use constraint models in their daily operations and predictions. However, like any other conventional programs, constraint programs need to be thoroughly tested and corrected before being used in operation. Indeed, incorrect predictions in an auction system for e-commerce, or an inappropriate deviation order in air-traffic control, may eventually result in catastrophic consequences such as human losses or huge financial loss. This paper presents CPTEST, an automated testing framework for constraint programs, and extends it with two major functionalities. CPTEST is organized as a comprehensive toolbox for automatically detecting and localizing functional faults, and also correcting these faults. The framework takes into account the specificity of the refinement process of constraint programs as well as their typical faults. Our experimental results on real-world constraint programs written in OPL and CHOCO show that CPTEST can actually relieve validation engineers of heavy validation tasks. Adoption of the tool by industrial players involved in the development and maintenance of constraint programs is in progress. 
20130610  Efficient algorithms and heuristics for Strong Loc (Anastasia Paparrizou) 

Many Constraint Satisfaction Problems (CSPs), consisting of non-binary constraints, include table constraints (i.e., lists of allowed or disallowed tuples). Such constraints are very important in constraint programming, as they are present in many real problems from areas such as configuration and databases. As a result, numerous specialized algorithms that achieve Generalized Arc Consistency (GAC) on table constraints have been proposed in the literature. However, since these algorithms process one constraint at a time, they cannot exploit possible intersections that may exist between different constraints. On the other hand, existing algorithms for consistencies stronger than GAC that can exploit constraint intersections are generic and thus very expensive.
One objective of this research is to propose efficient algorithms for strong local consistencies that can be applied to table constraints and easily adopted by standard CP solvers. Towards this, we propose an extension 
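As a hedged illustration of what a specialized GAC algorithm for (positive) table constraints does, here is a toy filtering pass in the spirit of simple tabular reduction; it is not one of the algorithms proposed in the talk, and all names are illustrative:

```python
def gac_table(domains, scope, table):
    """Enforce GAC on a single positive table constraint: keep only the
    values that appear in some tuple compatible with the current domains."""
    live = [t for t in table
            if all(v in domains[x] for x, v in zip(scope, t))]
    for i, x in enumerate(scope):
        domains[x] = [v for v in domains[x] if any(t[i] == v for t in live)]
    return domains

doms = {"x": [0, 1, 2], "y": [0, 1, 2]}
table = [(0, 1), (1, 2), (2, 0)]
# After removing 0 from y's domain, the tuple (2, 0) dies and 2 leaves x.
doms["y"] = [1, 2]
print(gac_table(doms, ("x", "y"), table))  # {'x': [0, 1], 'y': [1, 2]}
```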
20130607  (15h30) Symmetry breaking (Philippe Vismara) 
 Constraint programming provides effective models to solve subgraph isomorphism problems. Many realworld applications require the enumeration of all solutions that are not equivalent to a symmetry of the processed graphs. In this talk we focus on the conditions under which it is possible to eliminate these symmetries using lexicographic constraints. When the number of symmetries becomes too large, we propose a method based on set constraints. 
20130607  (16h) Parameterized Local Consistencies (Amine Balafrej) 
 
20130531  (14h30) De la réduction polynomiale à la décom (Olivier Bailleux  Université de Bourgogne) 
 One solver to solve them all. The idea is appealing. It builds on the theoretical notion of polynomial reduction, taken from a practical perspective: reuse a solver dedicated to a target problem to solve other problems without having to develop specific solvers. This approach raises many questions: which source problems can be solved via a given target problem? Is there a "universal" target problem? How do we choose the best translation model from a given source problem to a given target problem? How do we characterize a "good" translation model? Does the constraint programming framework allow us to address these issues without loss of generality? I will address all these questions in light of the state of knowledge in the relevant fields, and I will suggest some research directions. 
20130517  RNIC (Christian Bessiere) 
 
20130503  SAT-based Sequence mining (Remi Coletta) 
 
20130412  How to review a paper? (Christian Bessiere) 
 
20130405  Efficient algorithms for CSP solving (Christophe Lecoutre, CRIL) 
 In this talk, I will present the broad lines of a few (recognized) efficient algorithms for inference (filtering of the search space) and for the search for solutions. The principles (structures) underlying these algorithms will be presented: residues/watched literals, tabular reduction, sparse sets. I will finish with a few selected recent developments (for the CSP and WCSP frameworks).
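Since sparse sets are mentioned, here is a minimal sketch of the classic sparse-set domain representation (illustrative, not the speaker's code): membership, removal and backtracking restoration are all O(1).

```python
class SparseSet:
    """Sparse-set domain representation: live values occupy dense[:size],
    removed values stay behind them, so restoring on backtrack is just
    resetting `size`."""
    def __init__(self, n):
        self.dense = list(range(n))   # live values first, removed after
        self.sparse = list(range(n))  # position of each value in dense
        self.size = n

    def __contains__(self, v):
        return self.sparse[v] < self.size

    def remove(self, v):
        if v in self:
            self.size -= 1
            w = self.dense[self.size]  # swap v with the last live value
            i = self.sparse[v]
            self.dense[i], self.dense[self.size] = w, v
            self.sparse[w], self.sparse[v] = i, self.size

    def restore(self, old_size):
        self.size = old_size           # removed values are still in dense

s = SparseSet(5)
s.remove(2)
print(2 in s, sorted(s.dense[:s.size]))  # False [0, 1, 3, 4]
```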

20130329  The Power of OSAC (Johan Thapper) 
 We consider the valued constraint satisfaction problem (VCSP). That is, given a set of cost functions, each with a variable scope, find an assignment to the variables that minimises the sum of the cost functions applied to their variables. I will present recent work on the VCSP that describes how the complexity of this problem depends on the set of cost functions used. This result has consequences for the soft consistency notion OSAC (optimal soft arc consistency). Assuming P != NP, we obtain a characterisation of the power of OSAC, i.e., we can determine when establishing OSAC alone solves the VCSP.
This is joint work with Stanislav Zivny (Warwick, UK) 
20130327  Q-intersection (algorithms) (Clément Carbonnel) 
 Clément presents the first complexity results and the approximate and exact algorithms for q-intersection. 
20130326  A Constraint Programming approach for coverage op (Daoudi Abderrazak) 
 Wireless Sensor Networks (WSNs) constitute the platform of a broad range of surveillance-related applications, such as military and environmental monitoring. In these fields, deploying a WSN that provides optimal coverage is a tedious task, especially when multiple coverage is needed. We present a multi-objective constraint optimization approach for solving this issue. We investigate two main ideas: first, a preprocessing step with field analysis in a three-dimensional (3D) environment; second, a robust constraint-based modeling approach with a multi-objective function. 
20130201  Q-intersection (Gilles Trombettoni) 
 The nonlinear parameter estimation problem handled by interval methods can be tackled with a different combinatorial strategy: filtering/contraction + q-intersection (exponential; the problem is NP-complete).
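In one dimension, q-intersection amounts to finding a region covered by at least q of n intervals; a minimal sweep-line sketch (illustrative only, assuming closed intervals):

```python
def q_intersection(intervals, q):
    """Return one maximal-depth-entry interval of points covered by at
    least q of the given closed intervals, or None if none exists."""
    events = []
    for lo, hi in intervals:
        events.append((lo, 1))    # interval opens
        events.append((hi, -1))   # interval closes
    events.sort(key=lambda e: (e[0], -e[1]))  # opens before closes at ties
    depth, start = 0, None
    for x, d in events:
        depth += d
        if depth >= q and start is None:
            start = x                    # coverage reaches q here
        elif depth < q and start is not None:
            return (start, x)            # coverage drops below q here
    return None

boxes = [(0, 4), (1, 5), (3, 8), (6, 9)]
print(q_intersection(boxes, 3))  # (3, 4)
```

The full problem is multidimensional (boxes instead of intervals), which is where the exponential behaviour and NP-completeness mentioned above come in.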

20121221  QuAcq (Quick Acquisition) (Nadjib Lazaar) 
 On the occasion of the extension of the Mayan deadline, Nadjib will present an approach based on partial queries for quick constraint acquisition. 
20121214  Data mining/Clustering (note: 16h) (Eniko Szekely) 
 Eniko Szekely, a postdoc in the TaToo team and a Data Mining/Clustering specialist, will present the variants of the clustering problem as well as a number of algorithms.
The time is exceptionally moved to 16h because of a thesis defense.

20121207  CP 4 Datamining (Rémi Coletta) 
 There are several mining problems, depending on the data:
 itemset mining
 sequence mining (an order on the items)
 tree mining
 graph mining
At PKDD 2008, De Raedt et al. presented a CP model for itemset mining.
Brahim Douar defended his thesis on an AC-based approach to graph mining.
In this talk, Rémi will present a pseudo-Boolean model for sequential mining. 
20121130  Numerical CSPs and adaptive search (Gilles Trombettoni) 
 The end of Gilles's course on numerical CSPs. He will notably cover the adaptive consistency aspects. 
20121122  Acquisition of pre-models (Nadjib Lazaar (note: THURSDAY)) 
 Nadjib presents:
Lopez/Lallouet, ICTAI 2010
G. Raymond, 2007

20121116  maxRPC (Amine Balafrej) 
 Amine presents the various algorithms for maintaining maxRPC consistency. 
20121109  Negation of global constraints (Nadjib Lazaar) 
 Nadjib Lazaar presents his ModRef paper (CP 2012 workshop). 
20121026  Numerical CSPs (part 2) (Gilles Trombettoni) 
 The continuation of Gilles's introductory course on numerical CSPs. 
20121019  Nyseos (Philippe, Rémi, Eric) 
 A CSP model for wine blending. 
20120928  Schedule optimization in the hotel business (Rémi Le Brouster) 
 This presentation will set the context of Rémi Le Brouster's internship at Thélis, concerning an improvement to the Unicamp software.
The goal is to increase the revenue of the campsites that are Thélis's clients, by modifying the schedule of future reservations so as to free up as many sellable blocks as possible. 