Coconut

Home

Members

Collaborations

Meetings

Software

Admin

Some of the meetings are shared with the Pôle IA (see the Pôle IA page).

2017-09-22On Neighborhood Singleton Consistencies (Anastasia Paparrizou)
15:30
(E 323)
CP solvers predominantly use arc consistency (AC) as the default propagation method. Many stronger consistencies, such as triangle consistencies (e.g. RPC and maxRPC) exist, but their use is limited despite results showing that they outperform AC on many problems. This is due to the intricacies involved in incorporating them into solvers. On the other hand, singleton consistencies such as SAC can be easily crafted into solvers but they are too expensive. We seek a balance between the efficiency of triangle consistencies and the ease of implementation of singleton ones. Using the recently proposed variant of SAC called Neighborhood SAC as basis, we propose a family of weaker singleton consistencies. We study them theoretically, comparing their pruning power to existing consistencies. We make a detailed experimental study using a very simple algorithm for their implementation. Results demonstrate that they outperform the existing propagation techniques, often by orders of magnitude, on a wide range of problems.
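The baseline the talk compares against, arc consistency, can be sketched with the classic AC-3 revise loop. The tiny CSP below (x &lt; y over small domains) is an illustrative assumption, not an example from the talk.

```python
# Minimal AC-3 sketch: the default propagation method that the stronger
# singleton and triangle consistencies discussed above refine.
from collections import deque

def revise(domains, constraints, x, y):
    """Remove values of x with no support on y; return True if changed."""
    removed = False
    for vx in list(domains[x]):
        if not any(constraints[(x, y)](vx, vy) for vy in domains[y]):
            domains[x].remove(vx)
            removed = True
    return removed

def ac3(domains, constraints):
    queue = deque(constraints)              # all directed arcs (x, y)
    while queue:
        x, y = queue.popleft()
        if revise(domains, constraints, x, y):
            if not domains[x]:
                return False                # domain wipe-out: inconsistent
            # re-enqueue arcs pointing at the variable that just changed
            queue.extend((z, w) for (z, w) in constraints if w == x and z != y)
    return True

domains = {"x": {1, 2, 3}, "y": {1, 2, 3}}
constraints = {("x", "y"): lambda a, b: a < b,
               ("y", "x"): lambda a, b: b < a}
ac3(domains, constraints)
print(domains)   # x loses 3, y loses 1
```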
2017-09-15Mining Optimized Association Rules for Numeric Attributes [Fukuda et al.] (Bachir Belaid)
15:30
(E 323)
Given a huge database, we address the problem of finding association rules for numeric attributes, such as (Balance ∈ I) ⇒ (CardLoan = yes), which implies that bank customers whose balances fall in a range I are likely to use a card loan with a probability greater than p. The above rule is interesting only if the range I has some special feature with respect to the interrelation between Balance and CardLoan. It is required that the number of customers whose balances are contained in I (called the support of I) is sufficient and also that the probability p of the condition CardLoan = yes being met (called the confidence ratio) be much higher than the average probability of the condition over all the data. Our goal is to realize a system that finds such appropriate ranges automatically. We mainly focus on computing two optimized ranges: one that maximizes the support on the condition that the confidence ratio is at least a given threshold value, and another that maximizes the confidence ratio on the condition that the support is at least a given threshold number. Using techniques from computational geometry, we present novel algorithms that compute the optimized ranges in linear time if the data are sorted. Since sorting data with respect to each numeric attribute is expensive in the case of huge databases that occupy much more space than the main memory, we instead apply randomized bucketing as the preprocessing method and thus obtain an efficient rule-finding system. Tests show that our implementation is fast not only in theory but also in practice. The efficiency of our algorithm enables us to compute optimized rules for all combinations of hundreds of numeric and Boolean attributes in a reasonable time.
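The "optimized support" range can be illustrated by brute force: among all intervals over the sorted numeric attribute, keep the one covering the most records whose confidence P(target | value in range) meets a threshold. The paper achieves this in linear time with computational geometry; the O(n²) scan and the toy balance data below are assumptions for clarity.

```python
# Brute-force sketch of the optimized-support range of the abstract above.
def optimized_support_range(values, labels, min_conf):
    pairs = sorted(zip(values, labels))
    best = (0, None)                        # (support, (lo, hi))
    n = len(pairs)
    for i in range(n):
        hits = 0
        for j in range(i, n):
            hits += pairs[j][1]             # labels are 0/1
            support = j - i + 1
            if hits / support >= min_conf and support > best[0]:
                best = (support, (pairs[i][0], pairs[j][0]))
    return best

balances  = [100, 250, 300, 420, 500, 610, 700, 900]
card_loan = [0,   0,   1,   1,   1,   0,   1,   0]   # 1 = uses card loan
print(optimized_support_range(balances, card_loan, min_conf=0.75))
# → (5, (300, 700))
```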
2017-06-23An Efficient Algorithm for Mining Frequent Sequence with Constraint Programming [Aoga et al.] (Mohamed-Bachir Belaid)
15:30
(E 323)
The main advantage of Constraint Programming (CP) approaches for sequential pattern mining (SPM) is their modularity, which includes the ability to add new constraints (regular expressions, length restrictions, etc.). The current best CP approach for SPM uses a global constraint (module) that computes the projected database and enforces the minimum frequency; it does this with a filtering algorithm similar to the PrefixSpan method. However, the resulting system is not as scalable as some of the most advanced mining systems like Zaki’s cSPADE. We show how, using techniques from both data mining and CP, one can use a generic constraint solver and yet outperform existing specialized systems. This is mainly due to two improvements in the module that computes the projected frequencies: first, computing the projected database can be sped up by pre-computing the positions at which a symbol can become unsupported by a sequence, thereby avoiding to scan the full sequence each time; and second by taking inspiration from the trailing used in CP solvers to devise a backtracking-aware data structure that allows fast incremental storing and restoring of the projected database. Detailed experiments show how this approach outperforms existing CP as well as specialized systems for SPM, and that the gain in efficiency translates directly into increased efficiency for other settings such as mining with regular expressions.
2017-06-02A circuit constraint adapted to multiple routes (Nicolas Briot)
15:30
(E 323)
Many real-world applications contain a vehicle routing problem. Constraint programming makes it possible to tackle these problems efficiently. Circuit constraints have been defined to handle the travelling salesman problem (TSP) and the vehicle routing problem (VRP). These constraints are based on the search for a Hamiltonian circuit in a graph. In this paper, we consider the more general problem of multiple routes, in which we seek to cover part of the graph with a set of circuits of minimal cost. We propose a new global constraint based on the search for disjoint elementary circuits in a graph. Unlike existing constraints, we do not look for a Hamiltonian circuit but for a set of Steiner circuits. After defining this constraint, we show that bound filtering is NP-hard. We therefore propose an incomplete filtering method based on the computation of a lower bound of a Steiner circuit.
2017-05-22Pattern mining based on constraint programming, applied to software validation (Mehdi Maamar)
16:00
(E 323)
2017-05-19Optimisation and applications in bioinformatics (Thomas Schiex)
15:30
(E 323)
A tour of optimisation over probabilistic graphical models and cost function networks, with applications in bioinformatics.
2017-05-05Applications of weighted Voronoi diagrams and randomization to variance-based k-clustering [Inaba et al.] (Joel Quinqueton)
15:30
(E 323)
In this paper we consider the k-clustering problem for a set S of n points p_i = (x_i) in the d-dimensional space with variance-based errors as clustering criteria, motivated from the color quantization problem of computing a color lookup table for frame buffer display. As the inter-cluster criterion to minimize, the sum of intra-cluster errors over every cluster is used, and as the intra-cluster criterion of a cluster S_j, |S_j|^(α−1) Σ_{p_i ∈ S_j} ‖x_i − x̄(S_j)‖² is considered, where ‖·‖ is the L2 norm and x̄(S_j) is the centroid of the points in S_j, i.e., (1/|S_j|) Σ_{p_i ∈ S_j} x_i. The cases α = 1, 2 correspond to the sum of squared errors and the all-pairs sum of squared errors, respectively. The k-clustering problems under the criteria with α = 1, 2 are treated in a unified manner by characterizing the optimum solution to the k-clustering problem by the ordinary Euclidean Voronoi diagram and the weighted Voronoi diagram with both multiplicative and additive weights. With this framework, the problem is related to the generalized primary shatter function for the Voronoi diagrams. The primary shatter function is shown to be O(n^(O(kd))), which implies that, for fixed k, this clustering problem can be solved in polynomial time. For the problem with the most typical intra-cluster criterion of the sum of squared errors, we also present an efficient randomized algorithm which, roughly speaking, finds an ε-approximate 2-clustering in O(n (1/ε)^d) time, which is quite practical and may be used for real large-scale problems such as the color quantization problem.
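For α = 1, the intra-cluster criterion above is the classic sum of squared errors: for each cluster, the sum of squared L2 distances of its points to the cluster centroid. The points and the 2-clustering below are assumptions for illustration.

```python
# Sum-of-squared-errors criterion (the α = 1 case of the abstract above).
def sse(clusters):
    total = 0.0
    for pts in clusters:
        d = len(pts[0])
        centroid = [sum(p[k] for p in pts) / len(pts) for k in range(d)]
        total += sum(sum((p[k] - centroid[k]) ** 2 for k in range(d))
                     for p in pts)
    return total

clusters = [[(0.0, 0.0), (2.0, 0.0)],      # centroid (1, 0)
            [(5.0, 5.0), (5.0, 7.0)]]      # centroid (5, 6)
print(sse(clusters))   # 1 + 1 + 1 + 1 = 4.0
```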
2017-03-24An Exploratory Knowledge Discovery Process based on Formal Concept Analysis (Amedeo Napoli)
14:00
(Seminar)
Knowledge discovery (KD) in complex datasets can be considered from several points of view, e.g. data, knowledge, and problem solving. KD is applied to datasets and has a direct impact on the design of knowledge bases and problem solving. Dually, principles supporting knowledge representation can be reused in knowledge discovery, as in declarative data mining, for making KD more exploratory, iterative and interactive. One of our main objectives in KD is thus to design an exploratory KD approach supporting interactions between data and domain knowledge, discovery and representation, and between the analyst and the KD process. Accordingly, in this presentation, we discuss the process of Exploratory Knowledge Discovery based on Formal Concept Analysis (FCA). FCA starts with a binary context and outputs a concept lattice, which can be visualized, navigated and interpreted by human agents, and processed by software agents. FCA can be extended to pattern structures for dealing with complex data such as Linked Data (RDF). We will present a methodology based on FCA and pattern structures for mining definitions in RDF data. Such data are classified into a concept lattice allowing data exploration and the discovery of implications, used to automatically detect missing information and to complete RDF data. In contrast to the preceding data-directed KD process, we will present a goal-directed KD process, where the search for patterns of high interest is guided by the analyst's preferences and domain constraints. Finally, we will conclude by discussing the benefits of FCA and pattern structures for exploratory knowledge discovery.
2017-03-17Modelling the information flow required by the microbiological diagnostic loop (David Morquin)
15:30
(E 323)
Modelling the information flow required by the microbiological diagnostic loop: adapted antibiotic therapy for sepsis in a hospital setting, and analysis of the system's inertia.
2017-02-24About Knowledge Representation and Computational Social Choice (Jérôme Lang)
15:30
(Seminar)
Social choice theory studies the aggregation of individual preferences towards a collective choice. Computational social choice emerged in the late 1980s, and mostly uses computational paradigms and techniques to provide a better analysis of social choice mechanisms (especially in the fields of voting and of fair division of resources), and to construct new ones. Among the subfields of artificial intelligence that are concerned by this interaction, knowledge representation plays an important role (other subfields being machine learning, reasoning with uncertainty, search, and constraint programming). The reasons for which KR plays an important role include: representing preferences and reasoning about them; computing collective decisions with incomplete knowledge of agents' preferences; the role of knowledge in strategic behavior; and using logic for automated theorem proving in social choice. The talk will give an overview of part of these topics, and will try to identify challenges for KR researchers interested in computational social choice.
2017-02-10Reducing electricity consumption peaks and the optimisation problems induced for... (Chloé DESDOUITS, PhD thesis)
15:30
(E 323)
Reducing electricity consumption peaks and the optimisation problems induced for consumers. While concerns about global warming are becoming ever more serious, one of its primary causes, electricity consumption, keeps growing. One way to curb the phenomenon could be to better balance consumption and production, so as to start fewer large generation units and to allow the integration of more renewable sources. The new electricity market paradigm encourages consumers to reduce their consumption peaks and to shift their consumption to times of lower demand, by means of pricing incentives. New optimisation algorithms and new methodologies are therefore needed to optimise the electrical power used by consumers. Schneider Electric proposes, through the European Arrowhead project, to study three application cases: a lift with several energy sources, a manufacturing plant, and a drinking-water network. For each of these three applications, a methodology for optimising consumption power peaks (sometimes with the help of an electricity cost function) is given, together with optimisation algorithms. In the case of the multi-source lift, two coupled controllers are proposed: one at the strategic level solving a linear problem, and a rule-based one at the tactical level. In the case of the plant, the methodology we used to take measurements, build energy models, and finally optimise is explained. In addition, three linear formulations, a local search procedure, and a naive constraint programming formulation are given to solve the scheduling problem.
2017-01-20The circuit constraint (Nicolas Briot)
15:30
(E 323)
Many industrial problems can be reduced to routing problems. The most famous is the travelling salesman problem (TSP) or, more generally, the vehicle routing problem (VRP). These "pure" problems are widely studied in Operations Research. Unfortunately, adding constraints such as time windows, precedences, route sizes, etc., can make modelling (and solving) even harder. Conversely, constraint programming manages to express some of these combinatorial structures. Since 1994, with the CHIP solver, there has been a global circuit constraint whose satisfaction consists in finding a sequence of variables forming a Hamiltonian circuit in a given graph. In 22 years, only about ten papers have been published on this constraint: new filtering algorithms, new strategies, and evolutions of the constraint. I propose first to focus on the circuit constraint and its main filtering algorithm. Then I will discuss the optimisation version of the constraint and its various filtering algorithms, the most interesting of which is based on the 1-tree relaxation of Held and Karp.
2016-12-06“Almost-stable” matchings in the Hospitals / Residents problem with Couples [D.F. Manlove et al.] (Gaelle Hisler)
15:30
(E 323)
The Hospitals / Residents problem with Couples (hrc) models the allocation of intending junior doctors to hospitals where couples are allowed to submit joint preference lists over pairs of (typically geographically close) hospitals. It is known that a stable matching need not exist, so we consider min bp hrc, the problem of finding a matching that admits the minimum number of blocking pairs (i.e., is “as stable as possible”). We show that this problem is NP-hard and difficult to approximate even in the highly restricted case that each couple finds only one hospital pair acceptable. However if we further assume that the preference list of each single resident and hospital is of length at most 2, we give a polynomial-time algorithm for this case. We then present the first Integer Programming (IP) and Constraint Programming (CP) models for min bp hrc. Finally, we discuss an empirical evaluation of these models applied to randomly-generated instances of min bp hrc. We find that on average, the CP model is about 1.15 times faster than the IP model, and when presolving is applied to the CP model, it is on average 8.14 times faster. We further observe that the number of blocking pairs admitted by a solution is very small, i.e., usually at most 1, and never more than 2, for the (28,000) instances considered.
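"As stable as possible" means minimising blocking pairs. A sketch of counting them in a one-to-one Hospitals/Residents instance (capacity 1, no couples, so effectively stable marriage): (r, h) blocks when both r and h would rather be matched to each other than to their current partners. The tiny instance is an assumption for illustration.

```python
# Count blocking pairs of a matching (the quantity min bp hrc minimises).
def blocking_pairs(res_pref, hosp_pref, match):
    """match maps each resident to a hospital or None."""
    assigned = {h: r for r, h in match.items() if h is not None}
    pairs = []
    for r, prefs in res_pref.items():
        cur = match[r]
        cur_rank = prefs.index(cur) if cur in prefs else len(prefs)
        for h in prefs[:cur_rank]:          # hospitals r strictly prefers
            occupant = assigned.get(h)
            if occupant is None or hosp_pref[h].index(r) < hosp_pref[h].index(occupant):
                pairs.append((r, h))        # h also prefers r: blocking
    return pairs

res_pref  = {"r1": ["h1", "h2"], "r2": ["h1", "h2"]}
hosp_pref = {"h1": ["r2", "r1"], "h2": ["r1", "r2"]}
match = {"r1": "h1", "r2": "h2"}
print(blocking_pairs(res_pref, hosp_pref, match))   # [('r2', 'h1')]
```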
2016-12-02Guaranteed computation of attractive ellipsoids for dynamical systems (Gilles Chabert)
11:00
(E 324)
This talk proposes a numerical (interval) approach to a control-theory problem, namely characterising the basin of attraction of an asymptotically stable fixed point of a nonlinear dynamical system. More precisely, we propose to compute ellipsoids whose inclusion in the basin of attraction is guaranteed, the method also being robust to uncertainties on the system and on the fixed point. The method relies on an interval extension of the Lyapunov equation and on global optimisation. Work in progress.
2016-11-18Learning Parameters For the Sequence Constraint From Solutions [Picard-Cantin et al] (Robin Arcangioli)
15:30
(E 323)
This paper studies the problem of learning parameters for global constraints such as Sequence from a small set of positive examples. The proposed technique computes the probability of observing a given constraint in a random solution. This probability is used to select the most likely constraint from a list of candidates. The learning method can be applied to both soft and hard constraints.
2016-11-02Complexity Results in Optimistic/Pessimistic Preference Reasoning (Gaelle Hisler)
15:00
(E 323)
Preference reasoning is a central problem in decision support. There exist various ways to interpret a set of qualitative preferences. Conditional preference logics make it possible to deal with semantics such as optimistic, pessimistic, strong or not. In this paper, we study the complexity of the main problems in optimistic/pessimistic preference logic: undominated, consistency and dominance. We show that they are all NP-hard in general, with some becoming polynomial under specific semantics. Our second contribution is to show that the dominance problem, which has an online component in its definition, is compilable to polynomial time.
2016-11-02An Interval Filtering Operator for Upper and Lower Bounding in Constrained Global Optimization (Olivier Sans)
14:00
(E 323)
This paper presents a new interval-based operator for continuous constrained global optimization. It is built upon a new filtering operator, named TEC, which constructs a bounded subtree using a Branch and Contract process and returns the parallel-to-axes hull of the leaf domains/boxes. Two extensions of TEC use the information contained in the leaf boxes of the TEC subtree to improve the two bounds of the objective function value: (i) for the lower bounding, a polyhedral hull of the (projected) leaf boxes replaces the parallel-to-axes hull, (ii) for the upper bounding, a good feasible point is searched for inside a leaf box of the TEC subtree, following a look-ahead principle. The algorithm proposed is an auto-adaptive version of TEC that plays with both extensions according to the outcomes of the upper and lower bounding phases. Experimental results show a significant speed-up on several state-of-the-art instances.
2016-10-27A ‘Reduce and Solve’ Algorithm for the Multidimensional Knapsack Problem (Mohamed Bachir Belaid)
15:30
(E 323)
In this talk, we present a 'reduce and solve' heuristic for approximately solving the multidimensional knapsack problem (MKP). The proposed heuristic combines solution space reduction with linear programming techniques. The solution space reduction is achieved by fast and simple algorithms and then refined by solving some relaxed MKP sub-problems, leading in the end to optimality or near optimality. The performance of the proposed heuristic has been evaluated on several problem instances. Encouraging results have been obtained.
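The 'reduce and solve' idea can be illustrated in miniature: a cheap reduction (here, fixing items with a poor profit/weight ratio to 0) shrinks the search space, and the reduced multidimensional knapsack is then solved exactly by enumeration. The data and the ratio-based reduction rule are illustrative assumptions, not the paper's actual procedures.

```python
# Toy 'reduce and solve' for a 2-dimensional knapsack.
from itertools import product

def solve_mkp(profits, weights, capacities, keep):
    """Exactly solve the MKP restricted to the items in `keep`."""
    best_val, best_sel = 0, ()
    for sel in product((0, 1), repeat=len(keep)):
        items = [keep[i] for i in range(len(keep)) if sel[i]]
        loads = [sum(weights[j][i] for i in items)
                 for j in range(len(capacities))]
        if all(l <= c for l, c in zip(loads, capacities)):
            val = sum(profits[i] for i in items)
            if val > best_val:
                best_val, best_sel = val, tuple(items)
    return best_val, best_sel

profits    = [60, 100, 120, 30]
weights    = [[10, 20, 30, 25],     # dimension 1
              [15, 10, 25, 30]]     # dimension 2
capacities = [50, 40]
# reduction step: drop items with average profit/weight ratio below 2
ratios = [profits[i] / (sum(w[i] for w in weights) / len(weights))
          for i in range(len(profits))]
keep = [i for i in range(len(profits)) if ratios[i] >= 2]
print(solve_mkp(profits, weights, capacities, keep))   # → (220, (1, 2))
```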
2016-09-30Adaptive learning algorithm for constraint acquisition (Redouane Ezzahir)
15:30
(E 323)
Constraint acquisition assists a non-expert user in modeling her problem as a constraint network. Several learning algorithms have been developed for acquiring a constraint network. All these algorithms attempt to reduce the dialogue length between the system and the user, and they are only evaluated using the number and size of queries. However, for hard problems, the delay to generate a query can be significantly longer and thus cause a breakdown of the interaction. To address this problem we propose Alca, a new adaptive learning algorithm for constraint acquisition based on the Quacq system. It uses a parameter that is automatically adjusted to perform an adaptive local-constraint acquisition. It also includes two techniques: example reexamination and premature convergence. Experiments on some benchmarks illustrate the practical effectiveness of our approach and its ability to cope with both the dialogue length and the waiting time of a query.
2016-09-16IJCAI16 and CP16 overview (Robin Arcangioli)
15:30
(E 323)
Discussion around a selection of what was presented at IJCAI and CP this year (2016).
2016-09-02A Global Constraint for Closed Frequent Pattern Mining (Mehdi Maamar)
15:30
(E 324)
Discovering the set of closed frequent patterns is one of the fundamental problems in Data Mining. Recent Constraint Programming (CP) approaches for declarative itemset mining have proven their usefulness and flexibility. But the wide use of reified constraints in current CP approaches leads to difficulties in coping with high-dimensional datasets. In this paper, we propose the ClosedPattern global constraint to capture the closed frequent pattern mining problem without requiring reified constraints or extra variables. We present an algorithm to enforce domain consistency on ClosedPattern in polynomial time. The computational properties of this algorithm are analyzed and its practical effectiveness is experimentally evaluated.
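A brute-force reference for the mining task ClosedPattern captures: an itemset is frequent if its cover (the set of transactions containing it) meets the minimum support, and closed if no strict superset has the same cover. The transaction database below is an illustrative assumption.

```python
# Enumerate closed frequent itemsets by exhaustive search (reference only;
# the global constraint of the talk does this with CP propagation).
from itertools import combinations

def closed_frequent(transactions, min_support):
    items = sorted({i for t in transactions for i in t})
    def cover(itemset):
        return frozenset(idx for idx, t in enumerate(transactions)
                         if itemset <= t)
    result = []
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            s = frozenset(combo)
            cov = cover(s)
            if len(cov) < min_support:
                continue                    # not frequent
            closed = all(cover(s | {i}) != cov for i in items if i not in s)
            if closed:
                result.append((tuple(sorted(s)), len(cov)))
    return result

db = [frozenset("ab"), frozenset("abc"), frozenset("ac"), frozenset("abc")]
print(closed_frequent(db, min_support=2))
# → [(('a',), 4), (('a', 'b'), 3), (('a', 'c'), 3), (('a', 'b', 'c'), 2)]
```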
2016-06-29Tight upper bound on splitting by linear combinations for the pigeonhole principle
14:30
(E 324)
SAT is the well-known problem of deciding whether a given Boolean formula has a satisfying assignment. The main technique for solving SAT is the family of DPLL algorithms. A usual DPLL algorithm chooses a single variable and an order of values to substitute, and runs recursively on the simplified formulas, seeking a satisfying assignment. We consider an extension of the algorithm which chooses not a single variable but a linear form modulo 2. It is known that formulas encoding the pigeonhole principle on $n$ pigeons and $m$ holes ($PHP_n^m$) are still hard even for linear splitting and require $2^{\Omega(n)}$ time. The known bound for a usual DPLL algorithm without linear splittings is $2^{\Theta(n \log n)}$. In this talk we show that $PHP_n^m$ can be solved in $2^{O(n)}$ time with linear splittings and, as a corollary, that formulas encoding the perfect matching principle on graphs with $n$ vertices can also be solved in $2^{O(n)}$ time.
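A plain DPLL solver with single-variable splitting, the baseline that the talk's linear-form splitting extends, can be sketched in a few lines. Clauses are sets of signed integer literals; the pigeonhole-style instance below (2 pigeons, 1 hole) is an assumption for illustration.

```python
# Basic DPLL: pick an unassigned variable, try both values, recurse.
def simplify(clauses, assignment):
    """Apply a partial assignment; None signals a falsified clause."""
    out = []
    for clause in clauses:
        reduced, satisfied = set(), False
        for lit in clause:
            v = assignment.get(abs(lit))
            if v is None:
                reduced.add(lit)
            elif (lit > 0) == v:
                satisfied = True
        if not satisfied:
            if not reduced:
                return None              # clause falsified: conflict
            out.append(reduced)
    return out

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    clauses = simplify(clauses, assignment)
    if clauses is None:
        return None                      # conflict under this branch
    if not clauses:
        return assignment                # all clauses satisfied
    var = abs(next(iter(next(iter(clauses)))))   # splitting step
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None

# PHP_1^2: two pigeons (vars 1, 2), one hole; each must take the hole,
# but not both: (x1)(x2)(¬x1 ∨ ¬x2) is unsatisfiable.
print(dpll([{1}, {2}, {-1, -2}]))   # None
```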
2016-06-23Multiple Constraint Acquisition (Robin Arcangioli)
15:30
(E 323)
QUACQ is a constraint acquisition system that assists a non-expert user to model her problem as a constraint network by classifying (partial) examples as positive or negative. For each negative example, QUACQ focuses on a constraint of the target network. The drawback is that the user may need to answer a great number of such examples to learn all the constraints. In this paper, we provide a new approach that is able to learn a maximum number of constraints violated by a given negative example. Finally we give an experimental evaluation that shows that our approach improves on QUACQ.
2016-06-23Constraint Acquisition with Recommendation Queries (Abderrazak Daoudi)
16:00
(E 323)
Constraint acquisition systems assist the non-expert user in modeling her problem as a constraint network. Most existing constraint acquisition systems interact with the user by asking her to classify an example as positive or negative. Such queries do not use the structure of the problem and can thus lead the user to answer a large number of queries. In this paper, we propose PREDICT&ASK, an algorithm based on the prediction of missing constraints in the partial network learned so far. Such missing constraints are directly asked to the user through recommendation queries, a new, more informative kind of queries. PREDICT&ASK can be plugged in any constraint acquisition system. We experimentally compare the QUACQ system to an extended version boosted by the use of our recommendation queries. The results show that the extended version improves the basic QUACQ.
2016-06-10Argumentation-based defeasible reasoning (Leila Amgoud, CNRS Research Director at IRIT, Toulouse)
15:30
(bat5-124/3)
Argumentation is an alternative approach for defeasible reasoning. It is based on the idea of justifying plausible conclusions by “strong” arguments. Starting from a knowledge base encoded in a logical language, an argumentation system defines arguments and attacks between them using the consequence operator associated with the language. Finally, it uses a semantics for evaluating the arguments. The plausible conclusions to be drawn from the knowledge base are those supported by “good” arguments. In this talk, we present two families of such systems: the family using extension semantics and the one using ranking semantics. We discuss the outcomes of both families and compare them. We then compare the argumentation approach with other well-known approaches for defeasible reasoning, namely default logic.
2016-06-10Argumentation-based constraint acquisition (Robin Arcangioli)
15:30
(E 324)
Efficient acquisition of constraint networks is a key factor for the applicability of constraint problem solving methods. Current techniques learn constraint networks from sets of training examples, where each example is classified as either a solution or non-solution of a target network. However, in addition to this classification, an expert can usually provide arguments as to why examples should be rejected or accepted. Generally speaking domain specialists have partial knowledge about the theory to be acquired which can be exploited for knowledge acquisition. Based on this observation, we discuss the various types of arguments an expert can formulate and develop a knowledge acquisition algorithm for processing these types of arguments which gives the expert the possibility to input arguments in addition to the learning examples. The result of this approach is a significant reduction in the number of examples which must be provided to the learner in order to learn the target constraint network.
2016-06-03LCM (Linear-time Closed pattern Miner) vs ClosedPattern Global Constraint (Mehdi Maamar)
15:30
(E 324)
Discovering the set of closed frequent patterns is one of the fundamental problems in Data Mining. Recent Constraint Programming (CP) approaches for declarative itemset mining have proven their usefulness and flexibility. But the wide use of reified constraints in current CP approaches raises many difficulties in coping with high-dimensional datasets. This talk compares LCM, the well-known specialized algorithm, with the ClosedPattern global constraint.
2016-05-20An Interval Branch and Bound Algorithm for Parameter Estimation (Gilles Trombettoni)
15:30
(E 324)
The parameter estimation problem is a widespread and challenging problem in engineering sciences consisting in computing the parameters of a parametric model that fit observed data. The computer vision community has proposed the RANSAC algorithm to deal with outliers in the observed data. This randomized algorithm is efficient but non-deterministic and therefore incomplete. Jaulin et al. propose a branch-and-contract algorithm that returns all the model instances fitting at least $q$ observations. Assuming that at least $q$ observed data are inliers, this algorithm achieves on the observations a relaxed intersection operator called $q$-intersection. First, this paper presents several improvements to Jaulin et al.'s algorithm. Second, an interval branch and bound algorithm is designed to produce a model that can explain the maximum number of observations within a given tolerance. Experiments are carried out on computer vision and image processing problems. They highlight a significant speedup w.r.t. Jaulin et al.'s interval method in 2D and 3D shape recognition problems. We have also investigated how the approach scales up in dimensions up to 7 for stereovision (estimation of essential and fundamental matrices).
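The objective shared by RANSAC and the interval method above is consensus maximisation: among candidate models, pick the one explaining the most observations within a given tolerance (RANSAC samples candidates randomly; the interval branch and bound maximises deterministically). A simple line-fit consensus count; the data, with one outlier, and the candidate models are assumptions.

```python
# Consensus (inlier) counting for candidate line models y = a*x + b.
def consensus(model, points, tol):
    a, b = model
    return sum(1 for x, y in points if abs(a * x + b - y) <= tol)

def best_model(candidates, points, tol):
    """Keep the candidate explaining the most observations within tol."""
    return max(candidates, key=lambda m: consensus(m, points, tol))

points = [(0, 0.1), (1, 1.0), (2, 2.1), (3, 2.9), (4, 9.0)]  # (4, 9) is an outlier
candidates = [(1.0, 0.0), (2.0, 1.0), (0.5, 0.5)]
m = best_model(candidates, points, tol=0.2)
print(m, consensus(m, points, 0.2))   # (1.0, 0.0) explains 4 of 5 points
```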
2016-05-13Simple clustering models with CSPs (Joël Quinqueton)
15:30
(E 324)
We are interested in clustering problems under minimum diameter and maximum split criteria. We present some approximate and exact algorithms that have been proposed for these problems. We present a reification-free CSP model of these problems. We experiment with these models on the Minion and Gecode platforms.
2016-03-11What is a Decision Problem? (Alexis Tsoukias)
10:00
(sem)
The talk introduces a general framework through which any type of decision problem should be characterisable. The aim of the framework is to show that only a finite number of primitive characteristics describe a decision problem, thus reducing the number of archetypes to a finite number of combinations. In the talk we present some of these characteristics in some detail. We also provide a number of simple examples in order to show how this framework helps in categorising current decision problems, in reformulating decision problems, and in constructing formal arguments for the decision aiding process.
2016-02-19Just-In-Time Compilation of Knowledge Bases [Audemard et al., IJCAI 2013] (Gaelle Hisler)
15:30
(E 324)
Since the first principles of Knowledge Compilation (KC), most of the work has focused on finding a good compilation target language in terms of compromises between compactness and expressiveness. The central idea has remained unchanged for the last fifteen years: an off-line, very hard stage "compiles" the initial theory in order to guarantee (theoretically) an efficient on-line stage on a set of predefined queries and operations. We propose a new "Just-in-Time" approach for KC. Here, any Knowledge Base (KB) is immediately available for queries, and the effort spent on past queries is partly amortized for future ones. To guarantee efficient answers, we rely on the tremendous progress made in the practical solving of SAT and incremental SAT applicative problems. Even if each query may be theoretically hard, we show that our approach outperforms previous KC approaches on the set of classical problems used in the field, and allows handling problems that are out of the scope of current approaches.
2016-02-12Constraint acquisition (Robin Arcangioli)
15:30
(E 324)
2016-01-29Probing-based variable selection heuristics for NCSPs [Reyes V, Araya I, ICTAI14] (Olivier Sans)
16:30
(E 323)
Interval branch & bound solvers are commonly used for solving numerical constraint satisfaction problems. They alternate filtering/contraction and branching steps in order to find small boxes containing all the solutions of the problem. The branching basically consists in generating two subproblems by dividing the domain of one variable into two. The selection of this variable is the topic of this work. Several heuristics have been proposed so far, most of them using local information from the current node (e.g., domain sizes, partial derivative images over the current box, etc). We propose instead an approach based on past information. This information is provided by a preprocessing phase of the algorithm (probing) and is used during the search. In simple words, our algorithm attempts to identify the most important variables in a series of cheap test runs. As a result of probing, the variables are weighted. These weights are then considered by the selection heuristic during the search. Experiments stress the interest of using techniques based on past information in interval branch & bound solvers.
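The branching step these heuristics target can be shown in miniature: an interval solver evaluates the image of a box, prunes it if it cannot contain a solution, and otherwise splits a variable's domain and recurses (in the 1-D sketch below there is only one variable, so only the splitting itself is visible). The function f(x) = x² − 2 on [0, 2] and the tolerance are illustrative assumptions.

```python
# Minimal 1-D interval branch-and-contract loop.
def solve(f_range, lo, hi, eps, boxes):
    flo, fhi = f_range(lo, hi)
    if flo > 0 or fhi < 0:
        return                       # image excludes 0: prune this box
    if hi - lo < eps:
        boxes.append((lo, hi))       # small box that may contain a root
        return
    mid = (lo + hi) / 2              # branching: split the selected domain
    solve(f_range, lo, mid, eps, boxes)
    solve(f_range, mid, hi, eps, boxes)

# exact interval image of f(x) = x^2 - 2 when 0 <= lo <= hi
f_range = lambda lo, hi: (lo * lo - 2, hi * hi - 2)
boxes = []
solve(f_range, 0.0, 2.0, 1e-3, boxes)
print(boxes)   # every returned box encloses sqrt(2)
```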
2016-01-22Extrapolating from Limited Uncertain Information to Obtain Robust Solutions for Large-Scale Optimization (Nicolas Briot)
15:30
(E 324)
Data uncertainty in real-life problems is a current challenge in many areas, including Operations Research (OR) and Constraint Programming (CP). This is especially true given the continual and accelerating increase in the amount of data associated with real-life problems, to which Large Scale Combinatorial Optimization (LSCO) techniques may be applied. Although data uncertainty has been studied extensively in the literature, many approaches do not take into account the partial or complete lack of information about uncertainty in real-life settings. To meet this challenge, in this paper we present a strategy for extrapolating data from limited uncertain information to ensure a certain level of robustness in the solutions obtained. Our approach is motivated by real-world applications of supply of timber from forests to saw-mills.
2016-01-08Stochastic Constraint Programming (Nicolas Briot)
15:30
(E 324)
2015-12-18Data analytics and optimisation for assessing a ride sharing system (Vincent Armant)
15:30
(E 324)
Ride-sharing schemes attempt to reduce road traffic by matching prospective passengers to drivers with spare seats in their cars. To be successful, such schemes require a critical mass of drivers and passengers. In this talk we will see how deployed ride-sharing applications can be improved by combining data analytics and constraint programming. We use data analytics to model flexible user behaviours and infer feasible matches between users. We then use the discovered feasible-match graph and a constraint programming model to assess the potential of different ride-sharing schemes. We demonstrate that flexible schemes that persuade existing drivers to become passengers have the potential to reduce the number of unmatched participants by up to 80%. Finally, we abstract the ride-sharing problem to a more general one, multi-resource allocation with time dependencies, and discuss the solutions and the new challenges it raises.
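As a toy illustration of the assessment step (the graph and names are invented): once analytics has produced a feasible-match graph from drivers to the passengers they could serve, a maximum matching over that graph bounds how many participants can be served. A minimal augmenting-path sketch:

```python
def max_matching(feasible):
    """Maximum bipartite matching via augmenting paths.
    `feasible` maps each driver to the passengers they could serve."""
    match_p = {}                           # passenger -> matched driver
    def augment(d, seen):
        for p in feasible.get(d, ()):
            if p in seen:
                continue
            seen.add(p)
            # p is free, or its current driver can be re-routed elsewhere
            if p not in match_p or augment(match_p[p], seen):
                match_p[p] = d
                return True
        return False
    return sum(augment(d, set()) for d in feasible)
```

For instance, with `{'d1': ['p1', 'p2'], 'd2': ['p1']}` the matching has size 2, even though both drivers can serve p1.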
2015-12-11Incremental weight elicitation for multicriteria or collective decision making (Patrice Perny)
14:00
(3/124 (B5))
In this talk, we present new methods for incremental preference elicitation in multicriteria or collective decision making. More precisely, the talk focuses on the active learning of weights in aggregation functions. We are interested in elicitation protocols that aim, at each step of the decision process, to identify the most informative possible question so as to reduce the uncertainty on the value of the potential solutions (regret minimization) and thus speed up the determination of a necessarily optimal or near-optimal solution. After recalling the basics of incremental weight elicitation in the simple case of a linear aggregator and a set of alternatives given in extension, the talk will focus on two independent difficulties that can be considered separately and then in combination: eliciting the weights of a Choquet integral (problem 1), and incremental weight elicitation over an implicitly defined set of alternatives (problem 2). For problem 1, we will show how to circumvent the difficulty linked to the exponential number of parameters of the Choquet integral so as to allow efficient active learning of the capacity. For problem 2, we will propose a new approach interleaving preference elicitation and implicit enumeration of solutions for quickly determining a robust solution with a quality guarantee. This approach, which allows incremental weight elicitation during search, can be instantiated in various solving algorithms such as dynamic programming, greedy algorithms, and branch and bound. As an illustration, we present an interactive, incremental refinement of MOA* for multiobjective optimization in state spaces.
2015-12-04Detecting Types of Variables for Generalization in Constraint Acquisition (Abderrazak Daoudi)
15:30
(E 324)
During the last decade several constraint acquisition systems have been proposed for assisting non-expert users in building constraint programming models. GENACQ is an algorithm based on generalization queries that can be plugged into many constraint acquisition systems. However, generalization queries require the aggregation of variables into types which is not always a simple task for non-expert users. In this paper, we propose a new algorithm that is able to learn types during the constraint acquisition process. The idea is to infer potential types by analyzing the structure of the current constraint network and to use the extracted types to ask generalization queries. Our approach gives good results although no knowledge on the types is provided.
2015-11-27A General Framework for Reordering Agents Asynchronously in Distributed CSP (Christian Bessiere)
15:30
(E 324)
Reordering agents during search is an essential component of the efficiency of solving a distributed constraint satisfaction problem. Termination values have been recently proposed as a way to simulate the min-domain dynamic variable ordering heuristic. The use of termination values allows the greatest flexibility in reordering agents dynamically while keeping polynomial space. In this paper, we propose a general framework based on termination values for reordering agents asynchronously. The termination values are generalized to represent various heuristics other than min-domain. Our general framework is sound, complete, terminates and has a polynomial space complexity. We implemented several variable ordering heuristics that are well-known in centralized CSPs but could not until now be applied to the distributed setting. Our empirical study shows the significance of our framework compared to state-of-the-art asynchronous dynamic ordering algorithms for solving distributed CSP.
2015-11-20Large-scale machine learning: application to churn prediction at (Remi Coletta)
15:30
(E 324)
In this talk, through a use case that will serve as my guiding thread, I will present the scientific building blocks (data integration, machine learning and sequence mining, ...) and technical ones (in-memory Spark architecture) deployed and/or developed at TellMePlus, which explain why TellMePlus's solution often outperforms the industrial state of the art (including several major software vendors) on industrial use cases. Since some of these building blocks are very technology-oriented, or rather come from databases, I will conclude by zooming in on a few research perspectives closer to machine learning.
2015-11-13ModelSeeker: Constraint Acquisition System (Robin Arcangioli)
15:30
(E 323)
Model Seeker is a system that, from positive examples, generates a constraint model of a highly structured problem. The system is based on the global constraint catalogue, from which it tries to extract the model. It uses a tool called Constraint Seeker which, for a set of (positive or negative) examples, returns a list of candidate constraints ranked from most to least relevant. We will analyse the method used by Model Seeker, in particular how it captures regular structures to which a constraint potentially applies and how it generates a list of relevant candidate constraints.
2015-11-06Towards Declarative Scripting Combining CP and Analytics (Hassan Aït-Kaci)
14:30
(Bat5-3/124)
We take a rapid look at current trends in the evolution of application software tools for combining the formal capacities of (mostly statistical) Analytics with Constraint Programming (CP). We note that application orchestration is what makes this task difficult. As a result, a new area of software development is rapidly becoming of great importance in relation to CP and Analytics: scripting application orchestration. This has become a clear need in particular for Data Analytics, which are typically multiplexed applications interacting between data and specific processing tools acting thereon, such as statistical analysis and simulation, optimization, graphing, tabling and DB interfaces, HTML/LaTeX/Doc report generation, etc. We overview some systems used for scripting application orchestration involving CP and Analytics, looking at pros and cons. Then we discuss the following point: since scripting is crucial both to CP and Analytics, shouldn't we leverage CP to make Analytics programming as declarative as possible? If so, what are the challenges? This was an invited presentation as part of the CP-2015 Panel Workshop on CP and Analytics, Aug. 31, 2015, Cork, Ireland.
2015-10-23Fault localization using itemset mining under constraints (Mehdi Maamar)
15:30
(E 324)
We introduce in this paper an itemset mining approach to tackle the fault localization problem, one of the most difficult processes in software debugging. We formalize the fault localization problem as finding the k best patterns satisfying a set of constraints modelling the most suspicious statements. We use a Constraint Programming (CP) approach to model and solve our itemset-based fault localization problem. Our approach consists of two steps: i) mining the top-k suspicious sets of statements; ii) fault localization by processing the top-k patterns. Experiments performed on standard benchmark programs show that our approach yields a more precise localization than a standard approach.
2015-10-16Discussion on AI (Christian Bessiere)
14:30
(bat5/3-124)
A pôle IA meeting in the form of an informal discussion on the stakes and risks of AI for society.
2015-10-09The TEC operator (Tree for Enforcing Consistency) on continuous domains (Olivier Sans)
15:30
(E 324)
2015-10-02PREFIX-PROJECTION Global Constraint for Sequential Pattern Mining (CP15) (Yahia Lebbah)
14:30
(bat5-1/124)
Many efficient ad hoc methods have been developed for mining sequential patterns, but they all suffer from a lack of genericity. Recent works have investigated Constraint Programming (CP) methods, but they are still not effective because of their encoding. In this talk we present a global constraint based on the projected databases principle which remedies this drawback. Experiments show that our approach clearly outperforms existing CP approaches and competes well with ad hoc methods on large datasets.
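A minimal sketch of the projected-database principle behind PrefixSpan (single items per position, no gap or size constraints). This illustrates the principle the global constraint builds on, not the constraint itself:

```python
def prefixspan(db, minsup):
    """Mine frequent sequential patterns from a list of item sequences.
    Each recursive call works on the database projected by the current prefix."""
    results = {}
    def project(proj, item):
        # Keep, for each sequence containing `item`, its suffix after
        # the first occurrence: the projected database.
        out = []
        for seq in proj:
            if item in seq:
                out.append(seq[seq.index(item) + 1:])
        return out
    def mine(prefix, proj):
        counts = {}
        for seq in proj:
            for item in set(seq):              # one count per sequence
                counts[item] = counts.get(item, 0) + 1
        for item, sup in counts.items():
            if sup >= minsup:
                pat = prefix + (item,)
                results[pat] = sup
                mine(pat, project(proj, item))
    mine((), db)
    return results
```

For example, on the three sequences "abc", "acb", "ab" with minimum support 2, the frequent patterns are (a), (b), (c), (a, b) and (a, c).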
2015-09-25Preference representations (Gaelle Hisler)
15:30
(E 324)
2015-09-18Best Application paper at CP15 (Nicolas Briot)
15:30
(E 324)
title: Long-Haul Fleet Mix and Routing Optimisation with Constraint Programming and Large Neighbourhood Search (Philip Kilby and Tommaso Urli) Abstract. We present an original approach to compute efficient mid-term fleet configurations, at the request of a Queensland-based long-haul trucking carrier. Our approach considers one year's worth of demand data, and employs a constraint programming (CP) model and an adaptive large neighbourhood search (LNS) scheme to solve the underlying multi-day multi-commodity split delivery capacitated vehicle routing problem. A Pareto-based pre-processing step allows us to tackle the large amount of data by extracting a set of representative days from the full horizon. Our solver is able to provide the decision maker with a set of Pareto-equivalent fleet setups trading off fleet efficiency against the likelihood of requiring on-hire vehicles and drivers. Moreover, the same solver can be used to solve the daily loading and routing problem. We carry out an extensive experimental analysis based on real-world data, and we discuss the properties of the generated fleets. We show that our approach is a sound methodology to provide decision support for the mid- and short-term decisions of a long-haul carrier.
2015-09-11Teaching CP (Christian Bessiere)
15:30
(E 324)
Teaching constraint programming (CP) is important: it allows us to increase the uptake of the technology and to train the next generation of researchers and practitioners. We are holding a cocomeet on teaching constraint programming.
2015-07-02Global Constraints in Software Testing Applications (Arnaud Gotlieb)
16:00
(E 323)
Global constraints are considered as a major contribution of the Constraint Programming community to the combinatorial optimisation area. In this talk, I will present two examples of software testing applications where global constraints are the key-enabling technology to solve large or difficult instances of the underlying combinatorial problems. These two applications have led to the development of CP models embedded into industrial products.
2015-06-19The Balance Constraint Family (Christian Bessiere)
15:30
(E 323)
A talk on the Spread, Balance, AllBalance and Focus global constraints.
2015-06-16The difficult art of evaluating strategies in a multi-agent context (Pole IA) (Philippe Mathieu)
10:30
(bat5-2/124)
Simulating or solving problems in the setting of complex systems requires, for each constituent element, the design of a behavioural strategy. A strategy is nothing more than an action-selection mechanism that generally relies on the past to infer the next action to perform. Proposing a new strategy requires knowing how to evaluate it by comparing it to its peers. Unfortunately, since its performance depends on the context, it is very difficult to set up objective comparisons establishing that one strategy is better than another. Through two (very) different contexts, iterated games and financial markets, we show the difficulties and pitfalls one encounters, and propose some directions for improving these comparisons.
2015-06-12Phase Transition (Christian Bessiere)
15:30
(E 323)
A talk on the CP/SAT phase transition, Exceptionally Hard Problems (EHPs), and the notion of restarts.
2015-06-12Constraint Acquisition as a Reinforcement Learning task (Kostandina Veljanovska)
16:00
(E 323)
2015-05-29Constraint programming for selective grape harvesting (Nicolas Briot)
15:30
(E 324)
In precision viticulture, selective harvesting consists in separately picking two qualities of grapes in the same vineyard. For this purpose, a grape-harvesting machine with two hoppers is available, and the locations of the quality zones as well as an estimate of the quantities to harvest are known. The problem consists in optimizing the route of the harvesting machine while respecting numerous constraints, such as the processing direction of the rows, the storage capacity of the machine, its turning radius, the number of emptyings, etc. After formalizing the selective-harvesting problem, this paper presents a model of it as a constraint satisfaction problem. The last section gives preliminary results obtained with the AbsCon solver.
2015-05-07Activity-Based Search for Black-Box Constraint-Programming Solvers [L. Michel and P. Van Hentenryck] (Olivier Sans)
16:00
(E 324)
Robust search procedures are a central component in the design of black-box constraint-programming solvers. This paper proposes activity-based search, the idea of using the activity of variables during propagation to guide the search. Activity-based search was compared experimentally to impact-based search and the wdeg heuristics. Experimental results on a variety of benchmarks show that activity-based search is more robust than other heuristics and may produce significant improvements in performance.
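The mechanism can be sketched as follows. This is a hedged toy with an invented decay value and bookkeeping; the paper's exact scheme differs in details such as initialization and aging:

```python
class ActivityBrancher:
    """Sketch of activity-based variable selection: each time propagation
    shrinks a variable's domain its activity is bumped, all activities decay
    at every node, and we branch on the variable maximizing
    activity / domain size."""
    def __init__(self, variables, decay=0.95):
        self.activity = {v: 0.0 for v in variables}
        self.decay = decay

    def record_propagation(self, touched):
        """Call after each propagation round with the variables whose
        domains were reduced."""
        for v in self.activity:
            self.activity[v] *= self.decay
        for v in touched:
            self.activity[v] += 1.0

    def choose(self, domain_size):
        """Branch on the most active unfixed variable, normalized by
        its current domain size (assumes at least one unfixed variable)."""
        free = [v for v in self.activity if domain_size[v] > 1]
        return max(free, key=lambda v: self.activity[v] / domain_size[v])
```

After two propagation rounds that repeatedly touch x, the brancher prefers x over variables that propagation rarely affects.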
2015-04-29Interval arithmetic (Nathalie Revol)
16:00
(E 324)
This talk will begin with a brief introduction to interval arithmetic; Newton's algorithm will serve as an example to illustrate its specificities, namely the precautions to take when writing a mathematical formula involving intervals, and the guarantees that can be obtained on the computed result: an enclosure of the result, but also proofs of existence and uniqueness. In order to share the benefits of computations using interval arithmetic, a standardization effort, launched in 2008 and expected to result in 2015 in the IEEE 1788 standard, will be presented. However, the computed intervals are sometimes much too wide enclosures of the sought values. To address in particular the variable dependency problem, variants such as Taylor models or affine arithmetic exist and will be discussed. Finally, a more algorithmic approach will be presented on linear algebra problems: matrix products and the solution of linear systems.
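A minimal Python sketch of two of the points above, assuming exact arithmetic for clarity (a real implementation would round outward): the variable dependency problem, and one interval Newton step for f(x) = x² - 2 on X = [1, 2].

```python
class Interval:
    """Toy interval arithmetic; outward rounding is omitted for clarity."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))

x = Interval(1, 2)
dep = x - x                      # [-1, 1], not [0, 0]: variable dependency

# One interval Newton step for f(x) = x^2 - 2 on X = [1, 2]:
# N(X) = m - f(m)/f'(X), with m = 1.5 and f'(X) = 2*X = [2, 4].
m = 1.5
fm = m * m - 2                   # 0.25
quot = Interval(fm / 4, fm / 2)  # f(m) / [2, 4]
nx = Interval(m - quot.hi, m - quot.lo)  # [1.375, 1.4375], contains sqrt(2)
```

The Newton step contracts [1, 2] to [1.375, 1.4375] while keeping the guarantee that the enclosure still contains the root sqrt(2).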
2015-04-24Multi-Armed Bandits for Adaptive Constraint Propagation (Anastasia Paparrizou)
15:30
(E 324)
Adaptive constraint propagation has recently received great attention. It allows a constraint solver to exploit various levels of propagation during search, and in many cases it shows better performance than static/predefined propagation. The crucial point is to make adaptive constraint propagation automatic, so that no expert knowledge or parameter specification is required. In this work, we propose a simple learning technique, based on multi-armed bandits, that allows the solver to automatically select among several levels of propagation during search. Our technique enables the combination of any number of levels of propagation, whereas existing techniques are only defined for pairs. An experimental evaluation demonstrates that the proposed technique results in a more efficient and stable solver.
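A hedged sketch of the bandit idea: a standard UCB1 policy choosing among candidate propagation levels. The arm names and the reward notion (e.g. pruned values per unit time) are illustrative, not necessarily the paper's:

```python
import math

class PropagatorBandit:
    """UCB1 over a set of propagation levels: play each arm once,
    then pick the arm maximizing mean reward + exploration bonus."""
    def __init__(self, arms):
        self.arms = list(arms)
        self.n = {a: 0 for a in self.arms}       # times each arm was played
        self.total = {a: 0.0 for a in self.arms} # cumulated reward per arm
        self.t = 0
    def select(self):
        self.t += 1
        for a in self.arms:                      # initialization round
            if self.n[a] == 0:
                return a
        return max(self.arms,
                   key=lambda a: self.total[a] / self.n[a]
                   + math.sqrt(2 * math.log(self.t) / self.n[a]))
    def update(self, arm, reward):
        self.n[arm] += 1
        self.total[arm] += reward
```

If one level (say maxRPC) consistently prunes more per unit time than another (say AC), the bandit quickly concentrates its choices on it while still occasionally re-testing the alternative.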
2015-04-21Packing Curved Objects (Gilles Chabert)
11:00
(E 324)
We will present a generic approach based on continuous constraint programming for packing two-dimensional objects of quite arbitrary shapes, including non-convex curved shapes and assemblies of them. There has been considerable work on packing objects but, most of the time, with specific shapes (rectangles or "bins", circles, polygons,...). Nothing has really been done for the general case where different arbitrary shapes can be mixed together. More precisely, we address the wide class of objects described by non-linear inequalities. Our approach can be seen as a three-layered framework. The top layer uses a stochastic optimization algorithm with a fitness function that gives a global violation cost and equals zero when objects are all packed. The second layer decomposes the global function as a sum of elementary functions that measure the overlapping between each pair of different objects. The third layer calculates the set of all positions and orientations that can be assigned to a given shape so that the intersection with a second one is empty. This non-overlapping constraint is processed by a branch & prune algorithm that resorts to two original algorithms that we will present. We will show some experimental results and give the ideas we are currently exploring for extending our approach to objects described by parametric curves.
2015-04-10Fault localization using data mining under constraints (Nadjib Lazaar)
15:30
(E 324)
2015-03-27Learning How to Learn Constraints with Generalization Queries (Abderrazak Daoudi)
15:30
(E 324)
Generalization queries need an aggregation of variables into types. This aggregation is normally provided by the user. However, in some cases the user is not able to provide the types. In this work, we propose several techniques that take advantage of generalization even when no knowledge on variable types is provided. The main idea is to analyse the partial graph of learned constraints. The inference of potential types is based on the detection of quasi-cliques and quasi-bicliques. We evaluate our techniques on some benchmarks.
2015-03-20Contribution to the parameter elicitation in multicriteria optimization (Noureddine Aribi)
15:30
(E 324)
Many methods exist for solving multicriteria optimization problems, and it is not easy to choose the right method, well adapted to a given multicriteria problem. Even after choosing a multicriteria method, various parameters (e.g. weights, utility functions, etc.) must be carefully determined, either to find the optimal solution (best compromise) or to rank all feasible solutions (the set of alternatives). To overcome this difficulty, elicitation methods are used to help the decision maker fix the parameters safely. We assume that we have a set of feasible solutions as well as prior information about the preferences of the decision maker, and we focus on how to use this information rather than on how to obtain it. In this work, we take advantage of a simple and quickly computable statistical measure, namely the Spearman rank correlation coefficient, to develop a greedy approach and two exact approaches based on constraint programming (CP) and mixed integer programming (MIP). These methods are then used to automatically elicit the appropriate parameters of the lexicographic ordering method. We also propose elicitation models for other multicriteria methods, such as OWA operators, the weighted sum method, and Leximin; these elicitation approaches are based on solving max-CSP models. The parameters we compute ensure that the chosen method best captures the preferences expressed by the decision maker. Finally, we propose a variant of a CP-based model of Leximin, including a symmetry-breaking algorithm, with better experimental results.
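The Spearman measure at the core of the greedy approach is easy to state. A minimal sketch, assuming no ties (which the full method would have to handle); the greedy step shown, picking the criterion whose ordering best agrees with the decision maker's reference ordering, is an invented illustration:

```python
def spearman(xs, ys):
    """Spearman rank correlation between two score vectors (no ties):
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0] * len(vs)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

def pick_first_criterion(criteria_scores, dm_scores):
    """Greedy step: the criterion whose induced ordering best matches the
    decision maker's ordering comes first in the lexicographic order."""
    return max(criteria_scores,
               key=lambda c: spearman(criteria_scores[c], dm_scores))
```

For example, identical orderings give rho = 1, reversed orderings give rho = -1, and the greedy step picks the criterion with the highest rho.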
2015-03-13On the complexity of Clustering problems (Joel Quinqueton)
15:30
(E 324)
We present here results about the combinatorial aspects of attempting to deal with clustering problems using a CSP solver.
2015-03-04XCSP3: An Integrated Format for Benchmarking Combinatorial Constrained Problems (Christophe Lecoutre)
15:00
(E 323)
We propose a major revision of the format XCSP 2.1, called XCSP3, to build integrated representations of combinatorial constrained problems. This new format is able to deal with mono/multi optimization, many types of variables, cost functions, reification, views, annotations, variable quantification, and distributed, probabilistic and qualitative reasoning. The new format is compact, highly readable, and rather easy to parse. Interestingly, it partly captures the structure of problem models, through the possibility of declaring arrays of variables and of identifying syntactic and semantic groups of constraints. The number of constraints is kept under control by introducing a set of approximately 50 basic constraint forms, and producing almost automatically some of their variations through lifting, restriction, sliding, logical combination and relaxation mechanisms. As a result, XCSP3 encompasses practically all constraints that can be found in major constraint solvers developed by the CP community. A website, developed jointly with the format, is expected to contain very rapidly many models and series of instances. The user will be able to make sophisticated queries to select instances from very precise criteria. The objective of XCSP3 is to ease the effort required to test and compare different algorithms by providing a common test-bed of combinatorial constrained instances.
2015-02-20Node Selection Strategies in Interval Branch and Bound Algorithms: zoom on Feasible Diving (Gilles Trombettoni)
15:30
(E 324)
The talk focuses on a strategy achieving a tradeoff between intensification and diversification for node selection in interval Branch and Bound algorithms. A description of Feasible Diving is given for intensifying the search of a feasible solution with a good cost at each node of a standard best-first search. A similar technique is used, while not really detailed in a publication, in the CPLEX MIP solver. To clearly explain the strategy, I will describe the current and (probably) future scheme of our IbexOpt constrained global (continuous) optimizer.
2015-01-23Preference Elicitation in Recommender Systems (Paolo Viappiani (LIP6, France))
14:00
(seminar room)
Many applications, such as decision support systems and recommender systems, need to reason about the user's preferences. However, acquiring the preferences of the user is a challenging task for a number of reasons. First, elicitation of user preferences is usually expensive (w.r.t. time, cognitive effort, etc.). Second, many decision problems have large outcome or decision spaces. Third, users are inherently "noisy" and inconsistent. Since classical elicitation paradigms from decision theory are impractical for modern software applications, a number of approaches for effectively eliciting preferences have recently been proposed in the artificial intelligence and machine learning communities. This talk, after an introduction to recommender systems, will review the most prominent techniques for preference elicitation (also called preference learning in the machine learning community), including learning from ranked data. We will put a particular emphasis on interactive methods that aim at asking the most informative questions in order to make good (or even optimal) recommendations with sparse knowledge of the user's utility function.
2015-01-16Constraint programs: from acquisition to validation (Nadjib Lazaar)
14:00
(bat5-2/124)
Constraint programs are declarative programs written in high-level constraint languages such as OPL (Optimization Programming Language), CHOCO, GECODE or ZINC. These programs are increasingly used in the design of safety-critical or business-critical systems, with applications to high-frequency trading, air-traffic control, car manufacturing and hardware verification. However, i) writing a constraint program requires a fair amount of expertise, and ii) like any other computer program, constraint programs must be thoroughly tested and corrected before being used in operational mode. My research interests are mainly related to these two questions. In my presentation, I will give an overview of previous, ongoing and future work on constraint acquisition and constraint validation.
2015-01-16Modularisation of deductive RDF triplestores (Federico Ulliana)
14:45
(bat5-2/124)
Reusing modules extracted from well-established knowledge bases allows us to capitalize on the efforts made in recent years by many institutions, organizations, and domain experts to publish quality information and make it available online. Today, this practice is crucial to ensure a coherent development of the Semantic Web. I will present a novel approach to the extraction of modules of data and knowledge from RDF triplestores augmented with safe inference rules, à la Datalog. Dealing with a recursive rule language poses challenging issues for defining the module semantics, and also makes module extraction algorithmically unsolvable in some cases. I will present a set of module extraction algorithms compliant with the novel semantics, and experimental results showing the succinctness of the modules extracted with this approach.
2015-01-16Development of Optimal and Adaptive Strategy Using Q-learning (Kostandina Veljanovska)
16:00
(E 324)
Reinforcement learning is a machine learning technique which can work without supervision. It is goal-directed learning from interaction with an environment: a technique that learns what to do, i.e. how to map situations to actions, in order to maximize a numerical reward signal. Reinforcement learning agents are implemented as controllers in order to provide optimal control. In my talk I will present the technique used to implement the control strategy.
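A minimal tabular Q-learning sketch on an invented corridor task (the states, rewards, and hyper-parameters are illustrative only, not the talk's control setting):

```python
import random

def q_learning(episodes=300, alpha=0.5, gamma=0.9, eps=0.3, seed=0):
    """Tabular Q-learning on a 1-D corridor: states 0..4, actions -1/+1,
    reward 1 only on reaching state 4 (the episode then ends)."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(5) for a in (-1, 1)}
    for _ in range(episodes):
        s = 0
        while s != 4:
            if rng.random() < eps:                      # explore
                a = rng.choice((-1, 1))
            else:                                       # exploit
                a = max((-1, 1), key=lambda act: q[(s, act)])
            s2 = min(4, max(0, s + a))
            r = 1.0 if s2 == 4 else 0.0
            best_next = max(q[(s2, act)] for act in (-1, 1))
            # Standard Q-learning update toward the bootstrapped target
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s2
    return q

q = q_learning()
policy = {s: max((-1, 1), key=lambda act: q[(s, act)]) for s in range(4)}
```

After training, the greedy policy moves right in every state, which is optimal for this toy task.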
2015-01-15Towards a global constraint for the itemset mining problem (Nadjib Lazaar)
15:30
(E 324)
2015-01-09Constraint-based CP-nets (Gaelle Hisler)
15:30
(E 324)
Gaelle will give us a talk based on two papers [Prestwich et al., AAAI05] and [Domshlak et al., Heuristics06]
2014-12-19Impact-Based Search Strategies for Constraint Programming [P. Refalo CP04] (Olivier Sans)
15:30
(E 324)
A key feature of constraint programming is the ability to design specific search strategies to solve problems. On the contrary, integer programming solvers have used efficient general-purpose strategies since their earliest implementations. We present a new general-purpose search strategy for constraint programming inspired by integer programming techniques and based on the concept of the impact of a variable. The impact measures the importance of a variable for the reduction of the search space. Impacts are learned from the observation of domain reductions during search, and we show how restarting search can dramatically improve performance. Using impacts for solving multiknapsack, magic square, and Latin square completion problems shows that this new criterion for choosing variables and values can outperform classical general-purpose strategies.
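The impact measure itself is simple to state. A minimal sketch (the full method additionally averages impacts over the search and exploits restarts):

```python
from functools import reduce

def search_space(domains):
    """Search-space estimate: the product of the domain sizes."""
    return reduce(lambda p, d: p * len(d), domains.values(), 1)

def impact(domains_before, domains_after):
    """Impact of an assignment: the relative reduction of the search-space
    estimate caused by the assignment plus the propagation it triggers."""
    return 1 - search_space(domains_after) / search_space(domains_before)
```

For instance, if assigning x = 1 and propagating shrinks the estimate from 9 to 2, the observed impact is 1 - 2/9 = 7/9.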
2014-12-12Node Edge Arc Routing Problem (Nicolas Briot)
15:30
(E 324)
The aim is to present the Capacitated General Windy Routing Problem and its variants. This new problem subsumes many important and well-known arc and node routing problems. Arc and node routing problems consist of finding a set of routes covering certain arcs, edges and/or vertices of a graph, which meet certain conditions. These problems allow us to model and to solve many real-life problems, especially in the context of transportation, distribution management and scheduling.
2014-12-11Handling uncertain data dependency constraints (Carmen Gervet)
11:00
(seminar room)
Uncertain data due to imprecise measurements is commonly specified as bounded intervals in a constraint decision or optimization problem. Dependencies do exist among such data, e.g. an upper bound on the sum of uncertain production rates per machine, or the sum of traffic distribution ratios from a router over several links. For tractability reasons, most approaches in optimization and constraint-based reasoning assume independence of the data. Assuming independence is safe, but leads to large solution spaces and a loss of problem structure, so dependencies cannot be overlooked. In this presentation we present two approaches we developed to embed and tackle such dependencies effectively. The first one combines the strengths of two frameworks, namely constraint programming and regression analysis. We will show how we can exploit the strengths of both paradigms, and provide valuable insights to the decision maker by relating solutions to uncertain data satisfying dependency constraints. We also show the insight we gained from running a regression: solving dependency constraints needs to take into account the possible values of the relevant decision variables. Our second approach addresses this particular issue efficiently. We consider dependency constraints where the dependencies are drawn from a matrix model. Matrix models are linear models whereby the data imprecision applies to the cells of the matrix and the output vector. We combine consistency techniques from constraint programming and modeling techniques from mathematical programming to derive a new model for uncertain data dependency constraints. We will develop and illustrate the strengths of this approach: 1) a novel perspective on data dependency constraints, 2) an efficient modeling approach that suits solvers from multiple paradigms.
2014-11-27Adaptive constraint propagation (Anastasia Paparrizou)
15:00
(1.4)
Constraint propagation algorithms vary in the strength of propagation they enforce. Selecting the appropriate propagator is an essential problem in CP and has therefore attracted a lot of attention. The adaptive techniques that have been proposed to tackle this problem are mainly based on either machine learning or heuristic approaches. In my talk, I will present an overview of existing work on both approaches.
2014-10-31Node Selection Heuristics Using the Upper Bound in Interval Branch and Bound (Gilles Trombettoni)
15:30
(E 324)
We present in this article a new strategy for selecting the current node in an interval Branch and Bound algorithm for constrained global optimization. The standard best-first strategy selects the node with the lowest lower bound of the objective estimate. We propose in this article new node selection policies where an upper bound of each node/box is also taken into account. The good accuracy of this upper bound achieved by several operators leads to a good performance of the criterion. These new strategies obtain better experimental results than classical best-first search on difficult instances.
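The contrast between the two selection policies can be sketched with a priority queue; the 0.5/0.5 mix of lower and upper bounds below is an illustrative choice, not the paper's exact criterion, and the node bounds are invented:

```python
import heapq

def best_first(nodes, key):
    """Return node indices in the order a best-first search would expand
    them, given each node's (lower bound, upper bound) pair and a key."""
    heap = [(key(lb, ub), i) for i, (lb, ub) in enumerate(nodes)]
    heapq.heapify(heap)
    return [i for _, i in [heapq.heappop(heap) for _ in range(len(heap))]]

# Invented nodes: (lower bound, upper bound) of the objective on each box.
nodes = [(1.0, 9.0), (1.2, 2.0), (3.0, 3.5)]
classical = best_first(nodes, lambda lb, ub: lb)              # lb only
mixed = best_first(nodes, lambda lb, ub: 0.5 * lb + 0.5 * ub) # lb and ub
```

Classical best-first expands node 0 first (lowest lower bound), while the mixed key prefers node 1, whose tight upper bound promises a good feasible cost early.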
2014-10-17a CP framework for Constrained Clustering (Joël Quinqueton)
15:30
(E 324)
Joël will present the papers by Christel Vrain et al. on a CP framework for Constrained Clustering [ICTAI14, PKDD14].
2014-10-10 Buffered Resource Constraint: Algorithms and Complexity (CPAIOR'14) (Christian Bessiere)
15:30
(E 323)
The notion of buffered resource is useful in many problems. A buffer contains a finite set of items required by some activities, and changing the content of the buffer is costly. For instance, in instruction scheduling, the registers are a buffered resource and any switch of registers has a significant impact on the total runtime of the compiled code. We first show that sequencing activities to minimize the number of switches in the buffer is NP-hard. We then introduce an algorithm which, given a set of already sequenced activities, computes a buffer assignment which minimizes the number of switches in linear time, i.e., O(nd) where n is the length of the sequence and d the number of buffered items. Next, we introduce an algorithm to achieve bound consistency on the constraint Switch, that bounds the number of changes in the buffer, in O(n²d + n^1.5 d^1.5) time. Finally, we report the results of experimental evaluations that demonstrate the efficiency of this propagator.
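For intuition on the fixed-sequence subproblem, the classic furthest-next-use (Belady) eviction rule minimizes the number of loads into a buffer of capacity d when the access sequence is known in advance. A naive sketch (quadratic here, unlike the paper's O(nd) algorithm, and counting initial loads as switches):

```python
def count_min_switches(sequence, d):
    """Count buffer loads ("switches") for a fixed access sequence using
    the furthest-next-use eviction rule, which is optimal offline.
    Illustrative sketch only; the paper's algorithm handles the richer
    buffered-resource setting and runs in O(nd)."""
    buffer, switches = set(), 0
    for i, item in enumerate(sequence):
        if item in buffer:
            continue  # already buffered: no switch needed
        switches += 1
        if len(buffer) >= d:
            # Evict the buffered item whose next use is furthest away.
            def next_use(x):
                for j in range(i + 1, len(sequence)):
                    if sequence[j] == x:
                        return j
                return float("inf")
            buffer.remove(max(buffer, key=next_use))
        buffer.add(item)
    return switches
```

For example, the sequence a, b, c, a, b with d = 2 needs 4 loads: evicting b (next used after a) when c arrives keeps a buffered for its reuse.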
2014-09-29 Introduction to computational social choice (Jérôme Lang -- Université Paris Dauphine)
14:00
(E 324)
Social choice theory deals with the design and analysis of methods for collective decision making; it is an important branch of mathematical economics. Typical examples of collective decisions include: electing political representatives; informal votes (for instance, a group of friends choosing a restaurant); fair division of resources (for instance, splitting goods between ex-spouses in a divorce ruling, or assigning classes and time slots in a high school); reaching consensus on a verdict in a jury. Until now, social choice theorists have paid little attention to algorithmic issues. This is where computer science, and more specifically artificial intelligence, comes into play. Over the last decade, a research field has thus developed at the meeting point of social choice and computer science, called "computational social choice". Two research directions can be distinguished. The first aims to import concepts and procedures from social choice theory to solve problems arising in computer science applications, notably aggregation procedures for ranking web pages. The second (and the more important one) aims to use notions and methods from computer science to solve or rethink complex collective decision problems. The talk will survey some of the key topics of computational social choice, notably the exact or approximate computation of complex voting rules, voting over combinatorial domains, computational barriers to strategic behavior, and the design of interaction protocols for collective decision making.
2014-06-20 One word about contractors in and beyond numerical CSPs (Gilles Trombettoni)
15:00
(E 324)
I will introduce the notion of contractor proposed by Chabert and Jaulin in 2009. We will discuss the differences and similarities between contractors, filtering algorithms and global constraints.
2014-06-06 Reformulating CSPs for Scalability with Application to Geospatial Reasoning (Ken M. Bayer et al.) (Berthe Choueiry)
15:30
(E 324)
While many real-world combinatorial problems can be advantageously modeled and solved using Constraint Programming, scalability remains a major issue in practice. Constraint models that accurately reflect the inherent structure of a problem, solvers that exploit the properties of this structure, and reformulation techniques that modify the problem encoding to reduce the cost of problem solving are typically used to overcome the complexity barrier. In this paper, we investigate such approaches in a geospatial reasoning task, the building-identification problem (BID), introduced and modeled as a Constraint Satisfaction Problem by Michalowski and Knoblock [1]. We introduce an improved constraint model, a custom solver for this problem, and a number of reformulation techniques that modify various aspects of the problem encoding to improve scalability. We show how interleaving these reformulations with the various stages of the solver allows us to solve much larger BID problems than was previously possible. Importantly, we describe the usefulness of our reformulation techniques for general Constraint Satisfaction Problems, beyond the BID application.
2014-05-23 Q-intersection Algorithms for Constraint-Based Robust Parameter Estimation (Philippe Vismara)
15:30
(E 324)
Given a set of axis-parallel n-dimensional boxes, the q-intersection is defined as the smallest box encompassing all the points that belong to at least q boxes. Computing the q-intersection is a combinatorial problem that allows us to handle robust parameter estimation with a numerical constraint programming approach. The q-intersection can be viewed as a filtering operator for soft constraints that model measurements subject to outliers. This paper highlights the equivalence of this operator with the search of q-cliques in a graph whose boxicity is bounded by the number of variables in the constraint network. We present a computational study of the q-intersection. We also propose a fast heuristic and a sophisticated exact q-intersection algorithm. First experiments show that our exact algorithm outperforms the existing one while our heuristic performs an efficient filtering on hard problems.
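In one dimension, the q-intersection reduces to a sweep over sorted endpoint events; a minimal sketch (the function name and event encoding are ours, not the paper's algorithm):

```python
def q_intersection_1d(intervals, q):
    """Smallest interval containing all points covered by at least q of
    the given closed intervals (the 1-D case of the q-intersection of
    axis-parallel boxes). Returns None if no point reaches coverage q."""
    events = []
    for lo, hi in intervals:
        events.append((lo, 1))   # interval opens
        events.append((hi, -1))  # interval closes
    # At equal coordinates, process opens before closes (closed intervals).
    events.sort(key=lambda e: (e[0], -e[1]))
    depth, lo_q, hi_q = 0, None, None
    for x, delta in events:
        prev = depth
        depth += delta
        if prev < q <= depth and lo_q is None:
            lo_q = x   # first time coverage reaches q
        if prev >= q > depth:
            hi_q = x   # last time coverage drops below q (kept updated)
    return None if lo_q is None else (lo_q, hi_q)
```

For n-dimensional boxes the operator applies per dimension, but the combinatorial difficulty discussed in the talk comes from requiring the *same* q boxes to overlap in all dimensions simultaneously.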
2014-04-30 Boosting Constraint Acquisition via Generalization (Abderrazak Daoudi)
15:30
(E 324)
Constraint acquisition assists a non-expert user in modeling her problem as a constraint network. In existing constraint acquisition systems the user is only asked to answer very basic questions. The drawback is that when no background knowledge is provided, the user may need to answer a great number of such questions to learn all the constraints. In this paper, we introduce the concept of generalization query based on an aggregation of variables into types. We present a constraint generalization algorithm that can be plugged into any constraint acquisition system. We propose several strategies to make our approach more efficient in terms of number of queries. Finally we experimentally compare the recent QUACQ system to an extended version boosted by the use of our generalization functionality. The results show that the extended version dramatically improves the basic QUACQ.
2014-04-25 Learning How to Propagate Using Random Probing (Amine Balafrej)
15:30
(E 323)
Amine will present the paper by E. Stamatatos and K. Stergiou, "Learning How to Propagate Using Random Probing" (CPAIOR'09).
2014-04-11 A Computational Method for (MSS,CoMSS) Partitioning (Jean-Marie Lagniez)
14:30
(E 323)
The concepts of MSS (Maximal Satisfiable Subset) and CoMSS (also called Minimal Correction Subset) play a key role in many A.I. approaches and techniques. In my presentation, a novel algorithm for partitioning a Boolean CNF formula into one MSS and the corresponding CoMSS will be introduced. Extensive empirical evaluation shows that it is more robust and more efficient on most instances than currently available techniques.
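For very small formulas the (MSS, CoMSS) partition can be obtained by brute force, since a maximum satisfiable subset is in particular maximal. A sketch with clauses as lists of signed variable indices (the talk's algorithm instead scales to real instances via SAT calls):

```python
from itertools import product

def mss_comss(clauses, n_vars):
    """Return (MSS, CoMSS) as sets of clause indices for a small CNF.
    A clause is a list of nonzero ints: literal v means variable v is
    true, -v means it is false. Brute force over all 2^n assignments."""
    best_sat = None
    for assignment in product([False, True], repeat=n_vars):
        sat = {i for i, clause in enumerate(clauses)
               if any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)}
        if best_sat is None or len(sat) > len(best_sat):
            best_sat = sat
    comss = set(range(len(clauses))) - best_sat
    return best_sat, comss
```

For the unsatisfiable formula (x1) ∧ (¬x1) ∧ (x1 ∨ x2), two of the three clauses can be satisfied together, so the MSS has size 2 and the CoMSS is the single remaining clause.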
2014-04-04 Wireless Sensor Networks: a hybrid LP+CP scheme for optimal solving (Eric Bourreau)

In this presentation we consider the design of an energy-efficient operation scheme in wireless sensor networks (WSN) in order to maximize network lifetime. We address target coverage with sensors used for sensing and sending data to a base station through multi-hop communication. To this end, a column generation algorithm (LP) exploiting a Constraint Programming (CP) approach to tackle the pricing subproblem is used to obtain optimal solutions. Global constraints, both on integer variables ('gcc') and on graph variables ('trees'), are used.
2014-04-04 Strong Bounds Consistencies (Anastasia Paparrizou)

Arc consistency (AC) and bounds consistency (BC) are the standard local consistencies used for propagation in CP solvers. Although many strong local consistencies that extend AC have been proposed, similar consistencies based on BC have been overlooked. We study relational Path Inverse Bounds Consistency (rPIBC) and Pairwise Bounds Consistency (PWBC), two new local consistencies that extend BC by simultaneously considering combinations of constraints as opposed to single constraints. We show that rPIBC and PWBC are stronger than BC, but they are NP-hard to enforce even when constraints are linear. We propose two polynomial-time techniques to enforce approximations of these two consistencies on linear constraints. One is a reformulation of the constraints on which we enforce BC whereas the other is a polynomial time algorithm. Both achieve stronger pruning than BC. Our experiments on randomly generated problems and on problems used for the evaluation of matchmakers in Web Services show large differences in favor of our approaches.
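As background, ordinary BC on a single linear equality tightens each variable's interval against the bounds of the others; a minimal sketch for positive integer coefficients (the talk's rPIBC and PWBC instead reason over combinations of such constraints):

```python
import math

def bc_linear_eq(coeffs, domains, c):
    """Enforce bounds consistency on sum(a_i * x_i) == c to fixpoint,
    over integer interval domains [lo, hi]; positive coefficients
    assumed. Returns the tightened domains, or None on inconsistency."""
    changed = True
    while changed:
        changed = False
        for i, a in enumerate(coeffs):
            others = [j for j in range(len(coeffs)) if j != i]
            rest_lo = sum(coeffs[j] * domains[j][0] for j in others)
            rest_hi = sum(coeffs[j] * domains[j][1] for j in others)
            # a * x_i must lie in [c - rest_hi, c - rest_lo].
            new_lo = max(domains[i][0], math.ceil((c - rest_hi) / a))
            new_hi = min(domains[i][1], math.floor((c - rest_lo) / a))
            if new_lo > new_hi:
                return None  # domain wiped out: constraint unsatisfiable
            if (new_lo, new_hi) != domains[i]:
                domains[i] = (new_lo, new_hi)
                changed = True
    return domains
```

For example, x + 2y == 7 with x, y in [0, 10] tightens to x in [1, 7] and y in [0, 3].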
2014-03-28 An interval-based approach for graph mining (Yahia Lebbah)

Canonical encoding is one of the key operations required by frequent subgraph mining algorithms for candidate generation. Existing approaches make use of canonical encodings with an exponential time complexity. As a consequence, mining all frequent patterns for large and dense graphs is computationally expensive. In this talk, we propose a relaxation of the canonicity property, leading to two encodings, lower and upper encodings, with a polynomial time complexity, allowing us to tightly enclose the exact set of frequent subgraphs. These two encodings have been integrated in the state-of-the-art Gaston miner. Experiments performed on large dense synthetic and real graph datasets show that our two encodings are very effective compared to state-of-the-art solvers, while on large real sparse graph datasets they remain very competitive.
2014-03-27 Our Recent/Current Works about Sliced Table Constraints and Subgraph Isomorphism (Christophe Lecoutre)
14:00
(E 324)
In this talk, we shall describe our recent developments concerning the compression of table constraints (using a data-mining algorithm) and the combination of (known) techniques for solving the subgraph isomorphism problem.
2014-03-21 How to boost AI and constraint programming to tackle real-world combinatorial problems: set objects (Carmen Gervet)

Since the beginning of Constraint Programming (CP), its extension to tackle large-scale and real-world problems has been an active and successful field of research. In this presentation, we first present our work to extend CP to reason with sets, as subsets of a larger known set. Sets are natural and fundamental combinatorial objects to model a wide range of configuration problems like network design, set packing and covering, as well as allocation problems. Set variables raise fundamentally novel representation issues since their domains often contain an exponential number of sets. How to represent this domain and how to use this representation for pruning the search space effectively is the issue we tackle. Secondly, we address the problem of optimization in the presence of imprecise/incomplete data, ubiquitous in real-world decision and optimization applications (e.g. network flow analysis, and the economics of renewable energies). Interval parameters are effectively used in Operations Research and CP to specify such uncertain data, in order to provide reliable solutions to convex models. A reliable output is a set of solutions guaranteed to contain all solutions possible under any realization of the data. It is safe but often too large to be meaningful to the decision-maker. This raises the issue of embedding more problem structure in the representation of imprecise data while retaining tractability of the model. We consider two main aspects: how to represent a possible degree of knowledge of the data, and how to account for dependencies among data in a tractable manner.
2014-03-21 Incremental SAT solving (Gilles Audemard)
14:00
(seminar)
Recent advances in practical SAT solving have had repercussions well beyond its borders. Thus, nowadays, many problems whose class…
2014-02-21 Declarative approaches for data mining (Lakhdar Sais)

This talk gives an overview of our contributions to data mining and, more generally, to the cross-fertilization between data mining, constraint programming and propositional satisfiability (http://www.cril.univ-artois.fr/decMining/). This work was supported by the ANR DAG project "Declarative approaches for the enumeration of interesting patterns" (http://liris.cnrs.fr/dag/).
2014-02-14 Solve a Constraint Problem Without Modeling It (Nadjib Lazaar)

We study how to find a solution to a constraint problem without modeling it. Constraint acquisition systems such as Conacq or ModelSeeker are not able to solve a single instance of a problem because they require positive examples to learn. The recent QuAcq algorithm for constraint acquisition does not require positive examples to learn a constraint network. It is thus able to solve a constraint problem without modeling it: we simply exit from QuAcq as soon as a complete example is classified as positive by the user. In this paper, we propose ASK&SOLVE, an elicitation-based solver that tries to find the best tradeoff between learning and solving to converge as soon as possible on a solution. We propose several strategies to speed-up ASK&SOLVE. Finally we give an experimental evaluation that shows that our approach improves the state of the art.
2014-02-07 Adaptive (i,j)-consistency via hypertree decomposition (Younes Mechqrane)

State-of-the-art constraint solvers can fail to solve some instances that are surprisingly too hard for them. One way to improve these algorithms would be to prevent them from searching deep into the search tree before realizing they made bad decisions. If there is no way to analyse failures, revising the wrong decisions can be extremely time consuming. Hypertree decomposition methods decompose a CSP into a tree of clusters (subsets of variables). When the order of the variables is compatible with a depth-first traversal of the cluster tree, we know that once the variables appearing in the same separator are all instantiated, the subproblem defined by the subtree below the separator is independent from the rest of the problem. To minimize bad decisions, we propose that each time a separator has been completely instantiated, we check the extensibility of the current partial solution on the clusters that are immediately below. If it turns out that the current partial solution cannot be extended, then we know that the projection of the partial solution on the separator is a nogood. These learned nogoods are a relaxation of (i,j)-consistency where i is the size of the separator. A learned nogood immediately allows some form of backjumping and later helps decrease thrashing. To concentrate our efforts on clusters that are more likely to produce nogoods, we use machine learning techniques to automatically identify the clusters that are most likely to be unsatisfiable.
2014-01-17 Tractable Combinations of Global Constraints (Robert Woodward)

Robert will present the paper by David A. Cohen, Peter G. Jeavons, Evgenij Thorstensen and Stanislav Živný, "Tractable Combinations of Global Constraints" (CP 2013).
2014-01-10 Various kinds of co-clustering as CSP (Joel Quinqueton)

We present several CSP models of co-clustering and some experiments with these models.
2013-12-20 ICTAI13 overview (Gilles Trombettoni)

2013-12-13 Solving Distributed Constraint Optimization Problems (Mohamed Wahbi)

The Distributed Constraint Optimization Problem (DCOP) is a powerful framework for modeling and solving applications in multi-agent coordination. The presentation is about AFBBJ+, a revisited version of the Asynchronous Forward Bounding algorithm (one of the best algorithms to solve DCOPs). In AFBBJ+, we refine the lower bound computations. We also propose to compute lower bounds for the whole domain of the last assigned agent instead of only doing this for its current assignment. In addition, these lower bounds can be used as a value ordering heuristic in AFBBJ+.
2013-12-06 Learning propagation policies (Amine Balafrej)

Amine will present the work of [S. Epstein, E. Freuder, R. Wallace, and X. Li, 2005] on "Learning propagation policies".
2013-11-21 Constraint programming applied to mobile robotics (Luc Jaulin, ENSTA-Bretagne/Lab-STICC, Brest)

Many problems in mobile robotics (trajectory planning, localization, mapping) can be modeled with constraints, i.e. as CSPs (Constraint Satisfaction Problems). In robotics, the variables of these CSPs are of an extremely varied nature compared to what is classically considered in the CP (Constraint Programming) community.
2013-11-08 Localizing & Bolstering Constraint Propagation in a Tree Decomposition (Berthe Choueiry)

The tractability of a Constraint Satisfaction Problem (CSP) is guaranteed by a direct relationship between its consistency level and a structural parameter of its constraint network such as the treewidth. This result is not widely exploited in practice because enforcing higher-level consistencies can be costly and can change the structure of the constraint network and increase its width. Recently, R(∗,m)C was proposed as a relational consistency property that does not modify the structure of the graph and, thus, does not affect its width. In this paper, we explore two main strategies, based on a tree decomposition of the CSP, for improving the performance of enforcing R(∗,m)C and getting closer to the above tractability condition. Those strategies are: a) localizing the application of the consistency algorithm to the clusters of the tree decomposition, and b) bolstering constraint propagation between clusters by adding redundant constraints at their separators, for which we propose three new schemes. We characterize the resulting consistency properties by comparing them, theoretically and empirically, to the original R(∗,m)C and the popular GAC and maxRPWC, and establish the benefits of our approach for solving difficult problems. Joint work with Shant Karakashian and Robert Woodward. The presentation is based on an AAAI 2013 paper and material from the doctoral dissertation of Shant Karakashian.
2013-10-25 ICTAI13: Adaptive Constructive Interval Disjunction (Gilles Trombettoni)

An operator called CID and an efficient variant 3BCID were proposed in 2007. For numerical CSPs handled by interval methods, these operators compute a partial consistency equivalent to Partition-1-AC for discrete CSPs. The two main parameters of CID are the number of times the main CID procedure is called and the maximum number of sub-intervals treated by the procedure. The 3BCID operator is state-of-the-art in numerical CSP solving, but not in constrained global optimization. This paper presents an adaptive variant of 3BCID. The number of variables handled is auto-adapted during the search, the other parameters are fixed and robust to modifications. On a representative sample of instances, ACID appears to be the best approach in solving and optimization, and has been added to the default strategies of the Ibex interval solver.
2013-10-25 ICTAI13: A CSP Approach for Metamodel Instantiation (Adel Fedjoukh)

This work is a contribution of Artificial Intelligence to Software Engineering. We present a comprehensive approach to metamodel instantiation using CSP. The generation of models which conform to a given metamodel is a crucial issue in Software Engineering, especially when it comes to producing a varied and large dataset of relevant models to test model transformations or to properly design new metamodels. We define an original constraint modeling of the problem of generating a model conforming to a metamodel, also taking into account its additional OCL constraints. The generation process we describe appears to be quicker, more efficient and more flexible than any other state-of-the-art approach.
2013-10-18 Recent Advances in High-Level Relational Consistency (Robert J. Woodward)

Consistency properties and algorithms for achieving them are at the heart of the success of Constraint Programming. Most research has focused on properties to be verified by and enforced on the domain values of a Constraint Satisfaction Problem (CSP). We propose new consistency properties and algorithms that apply to the relations of the constraints in a problem. An important aspect of our research is that increasing the level of consistency does not need to modify the topology of the network and may enable backtrack-free search, the holy grail of Constraint Programming. In this talk, I will discuss two relational consistency properties, R(*,m)C and RNIC, which I will theoretically characterize and empirically evaluate on difficult benchmark problems. This work is done in collaboration with S. Karakashian, C. Reeson, B.Y. Choueiry, and C. Bessiere.
2013-10-11 CP13 overview (Christian Bessiere)

Christian will summarize and discuss papers from CP'13.
2013-10-04 Rostering/planning application (Rémi Coletta)

Rémi presents a rostering/planning application for hospitals, proposed by Berger-Levrault (www.berger-levrault.fr/).
2013-09-27 Inner Regions and Interval Linearizations for (Continuous) Global Optimization (Gilles Trombettoni)

Researchers from the interval analysis and constraint (logic) programming communities have studied intervals for their ability to manage infinite solution sets of numerical constraint systems. In particular, inner regions represent subsets of the search space in which all points are solutions. Our main contribution is the use of recent and new inner region extraction algorithms in the upper bounding phase of constrained global optimization. Convexification is a major key for efficiently lower bounding the objective function. We have adapted the convex interval Taylor form proposed by Lin & Stadtherr for producing a reliable outer and inner polyhedral approximation of the solution set and a linearization of the objective function. Other original ingredients are part of our optimizer, including an efficient interval constraint propagation algorithm exploiting monotonicity of functions. We end up with a new framework for reliable continuous constrained global optimization. Our interval B&B is implemented in the interval-based explorer Ibex and extends this free C++ library. Our strategy significantly outperforms the best reliable global optimizers.
2013-09-24 Duality: A Fundamental Feature of Combinatorial Optimisation (John Slaney, 14:00)

Over the years, there have been many applications of the technique of finding solutions, or optimal solutions, to combinatorial problems by generating hitting sets (or minimal-cost hitting sets) for the solutions to some sort of dual problem. In this talk, I clarify this notion of duality at an abstract level and set out some of its properties. Then I consider an algorithm for applying it in a generic way to a large family of combinatorial optimisation problems.
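The flavour of this duality can be shown with a toy minimum hitting set routine: if `sets` collects the solutions (or "obstacles") of the dual problem, a minimum hitting set for them is a candidate for the primal one. A brute-force sketch, exponential and for illustration only:

```python
from itertools import combinations

def min_hitting_set(sets):
    """Smallest set of elements intersecting every set in `sets`,
    found by brute force over subsets of the universe, smallest first."""
    universe = sorted(set().union(*sets))
    for r in range(len(universe) + 1):
        for cand in combinations(universe, r):
            if all(set(cand) & s for s in sets):
                return set(cand)
    return None  # some set is empty: nothing can hit it
```

Practical uses of the scheme replace this enumeration with incremental (minimal-cost) hitting set computations against a growing collection of dual solutions.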
2013-09-13 Adaptive Parameterized Consistency (Amine Balafrej)

State-of-the-art constraint solvers uniformly maintain the same level of local consistency (usually arc consistency) on all the instances. We propose parameterized local consistency, an original approach to adjust the level of consistency depending on the instance and on which part of the instance we propagate. We do not use as parameter one of the features of the instance, as done for instance in portfolios of solvers. We use as parameter the stability of values, which is a feature based on the state of the arc consistency algorithm during its execution. Parameterized local consistencies choose to enforce arc consistency or a higher level of local consistency on a value depending on whether the stability of the value is above or below a given threshold. We also propose a way to dynamically adapt the parameter, and thus the level of local consistency, during search. This approach allows us to get a good trade-off between the number of values pruned and the computational cost.
2013-09-13 Constrained Wine Blending (Philippe Vismara)

Assemblage consists in blending base wines in order to create a target wine. Recent developments in aroma analysis allow us to measure chemical compounds impacting the taste of wines. This chemical analysis makes it possible to design a decision tool for the following problem: given a set of target wines, determine which volumes must be extracted from each base wine to produce wines that satisfy constraints on aroma concentration, volumes, alcohol content and price. This talk describes the modeling of wine assemblage as a nonlinear constrained min-max problem (minimizing the gap to the desired concentrations for every aromatic criterion) efficiently handled by the Ibex interval branch and bound.
2013-06-27 Q-intersection via graphs (Clément Carbonnel)

In parameter estimation, one sometimes has to deal with outlier measurements. When the number of outliers can be bounded a priori, a robust solving strategy has been proposed, but it requires solving efficiently a problem called "q-intersection", which was the subject of my internship. After a quick tour of the state of the art, I will present the results obtained for this problem: complexity, new algorithms and an experimental evaluation. I will also present a small graph-theoretic result related to q-intersection.
2013-06-21 CPTEST: CP for testing CP (Nadjib Lazaar)

Constraint programs, such as those written in high-level constraint modeling languages, e.g., OPL (Optimization Programming Language), COMET, GECODE, ZINC or CHOCO, are more and more used in the design of safety-critical or business-critical systems. Applications as diverse as high-frequency trading systems, air-traffic control, car manufacturing systems or hardware verification are now using constraint models in their daily operations and predictions. However, like any other conventional programs, constraint programs need to be thoroughly tested and corrected before being used in operation. Indeed, incorrect predictions in an auction system for e-commerce, or an inappropriate deviation order in air-traffic control, may eventually result in catastrophic consequences such as human losses or huge financial losses. This paper presents CPTEST, an automated testing framework for constraint programs, and extends it with two major functionalities. CPTEST is organized as a comprehensive toolbox for automatically detecting and localizing functional faults, and also correcting these faults. The framework takes into account the specificity of the refinement process of constraint programs as well as their typical faults. Our experimental results on real-world constraint programs written in OPL and CHOCO show that CPTEST can actually relieve validation engineers of heavy validation tasks. Adoption of the tool by the industrial world involved in the development and maintenance of constraint programs is in progress.
2013-06-10 Efficient algorithms and heuristics for Strong Local Consistencies (Anastasia Paparrizou)

Many Constraint Satisfaction Problems (CSPs), consisting of non-binary constraints, include table constraints (i.e. lists of allowed or disallowed tuples). Such constraints are very important in constraint programming as they are present in many real problems from areas such as configuration and databases. As a result, numerous specialized algorithms that achieve Generalized Arc Consistency (GAC) on table constraints have been proposed in the literature. However, since these algorithms process one constraint at a time they cannot exploit possible intersections that may exist between different constraints. On the other hand, existing algorithms for consistencies stronger than GAC that can exploit constraint intersections are generic and thus very expensive. One objective of this research is to propose efficient algorithms for strong local consistencies that can be applied on table constraints and can be easily adopted by standard CP solvers. Towards this, we propose an extension
2013-06-07 (15:30) Symmetry breaking (Philippe Vismara)

Constraint programming provides effective models to solve subgraph isomorphism problems. Many real-world applications require the enumeration of all solutions that are not equivalent to a symmetry of the processed graphs. In this talk we focus on the conditions under which it is possible to eliminate these symmetries using lexicographic constraints. When the number of symmetries becomes too large, we propose a method based on set constraints.
2013-06-07 (16:00) Parameterized Local Consistencies (Amine Balafrej)

2013-05-31 (14:30) From polynomial reduction to decom… (Olivier Bailleux -- Université de Bourgogne)

One solver to solve them all. The idea is appealing. It builds on the theoretical notion of polynomial reduction in a practical perspective: reusing a solver dedicated to a target problem to solve other problems without having to develop specific solvers. This approach raises many questions: which source problems can be solved via a given target problem? Is there a "universal" target problem? How should one choose the best translation model from a given source problem to a given target problem? How can a "good" translation model be characterized? Does the constraint programming framework allow these questions to be addressed without loss of generality? I will discuss all these questions in the light of the state of the art in the relevant fields and will suggest some research directions.
2013-05-17 R-NIC (Christian Bessiere)

2013-05-03 SAT-based Sequence mining (Rémi Coletta)

2013-04-12 How to review a paper? (Christian Bessiere)

2013-04-05 Efficient algorithms for CSP solving (Christophe Lecoutre -- CRIL)

In this talk, I will outline some well-known efficient algorithms for inference (filtering of the search space) and for the search for solutions. The principles (and structures) underlying these algorithms will be presented: residues/watched literals, tabular reduction, sparse sets. I will conclude with a selection of recent developments (for the CSP and WCSP frameworks).
2013-03-29 The Power of OSAC (Johan Thapper)

We consider the valued constraint satisfaction problem (VCSP). That is, given a set of cost functions, each with a variable scope, find an assignment to the variables that minimises the sum of the cost functions applied to their variables. I will present recent work on the VCSP that describes how the complexity of this problem depends on the set of cost functions used. This result has consequences for the soft consistency notion OSAC (optimal soft arc consistency). Assuming P != NP, we obtain a characterisation of the power of OSAC, i.e., we can determine when establishing OSAC alone solves the VCSP. This is joint work with Stanislav Zivny (Warwick, UK).
2013-03-27 Q-intersection (algorithms) (Clément Carbonnel)

Clément presents the first complexity results and the approximate and exact algorithms for q-intersection.
2013-03-26 A Constraint Programming approach for coverage optimization (Daoudi Abderrazak)

Wireless Sensor Networks (WSN) constitute the platform of a broad range of surveillance applications such as military and environmental monitoring. In these fields, deploying a WSN to support optimal coverage is a tedious task, especially when multiple coverage is needed. We present a multi-objective constraint optimization approach for solving the above issue. We investigate two main ideas: first, a pre-processing step with field analysis in a three-dimensional (3D) environment; second, a robust constraint-based modeling approach with a multi-objective function.
2013-02-01 Q-intersection (Gilles Trombettoni)

The nonlinear parameter estimation problem handled with interval methods can also be tackled by a different combinatorial strategy: filtering/contraction + q-intersection (an exponential procedure; the problem is NP-complete).
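As a rough illustration of the q-intersection problem itself (a sketch of mine, not of the interval-methods strategy): in one dimension, finding the points covered by at least q of n intervals reduces to a sweep over sorted endpoints; the NP-completeness mentioned above arises for multi-dimensional boxes.

```python
def q_intersection(intervals, q):
    """Return the maximal intervals of points covered by at least q of
    the given closed intervals, via an O(n log n) endpoint sweep."""
    events = []
    for lo, hi in intervals:
        events.append((lo, 0))   # kind 0 = opening; sorts before closing
        events.append((hi, 1))   # kind 1 = closing, at equal points
    events.sort()
    result, count, start = [], 0, None
    for point, kind in events:
        if kind == 0:
            count += 1
            if count == q:       # coverage just reached q: region starts
                start = point
        else:
            if count == q:       # coverage about to drop below q: region ends
                result.append((start, point))
            count -= 1
    return result

boxes = [(0.0, 4.0), (1.0, 5.0), (3.0, 8.0), (2.0, 6.0)]
print(q_intersection(boxes, 3))  # [(2.0, 5.0)]
```

In the parameter-estimation setting, each measurement yields a box of consistent parameter values, and q-intersection seeks parameters consistent with at least q measurements, which makes the approach robust to outliers.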
2012-12-21 QuAcq (Quick Acquisition) (Nadjib Lazaar)

On the occasion of the extension of the Maya deadline, Nadjib will present an approach based on partial queries for fast constraint acquisition.
2012-12-14 Data Mining/Clustering (note: 4 pm) (Eniko Szekely)

Eniko Szekely, a postdoc in the TaToo team and a data mining/clustering specialist, will present variants of the clustering problem as well as a number of algorithms. The session is exceptionally moved to 4 pm because of a PhD defense.
2012-12-07 CP 4 Data Mining (Rémi Coletta)

Several mining problems exist, depending on the data: itemset mining, sequence mining (with an order on the items), tree mining, and graph mining. At PKDD 2008, De Raedt et al. presented a CP model for itemset mining. Brahim Douar defended his PhD thesis on an AC-based approach to graph mining. In this talk, Rémi will present a pseudo-Boolean model for sequential mining.
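To fix ideas, itemset mining asks for all itemsets whose support (number of transactions containing them) reaches a threshold; the PKDD 2008 CP model encodes exactly this frequency condition as constraints. A naive brute-force sketch of the underlying task (illustrative only; names are mine):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Enumerate all itemsets with support >= min_support by brute force
    over subsets of the item universe. Real miners prune the search with
    Apriori-style anti-monotonicity or, in the CP view, with coverage
    and frequency constraints."""
    items = sorted(set().union(*transactions))
    result = {}
    for size in range(1, len(items) + 1):
        for itemset in combinations(items, size):
            support = sum(1 for t in transactions if set(itemset) <= t)
            if support >= min_support:
                result[itemset] = support
    return result

data = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
print(frequent_itemsets(data, 3))  # each single item occurs 3 times,
                                   # no pair reaches support 3
```

Sequence mining, the topic of the talk, generalises this: items within a transaction are ordered, and support counts transactions containing the pattern as a subsequence.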
2012-11-30 Numerical CSPs and Adaptive Search (Gilles Trombettoni)

The end of Gilles's course on numerical CSPs. He will notably cover aspects of adaptive consistency.
2012-11-22 Acquisition of Pre-models (Nadjib Lazaar (note: THURSDAY))

Presentation by Nadjib of: Lopez/Lallouet, ICTAI 2010; G. Raymond, 2007.
2012-11-16 max-RPC (Amine Balafrej)

Presentation by Amine of the various algorithms for maintaining max-RPC consistency.
2012-11-09 Negation of Global Constraints (Nadjib Lazaar)

Presentation by Nadjib Lazaar of his paper at ModRef (a CP 2012 workshop).
2012-10-26 Numerical CSPs (part 2) (Gilles Trombettoni)

The continuation of Gilles's introductory course on numerical CSPs.
2012-10-19 Nyseos (Philippe, Rémi, Eric)

A CSP model for wine blending.
2012-09-28 Schedule Optimization in the Hotel Industry (Rémi Le Brouster)

This presentation will set out the scope of Rémi Le Brouster's internship at the company Thélis, concerning an improvement to the Unicamp software. The goal is to increase the revenue of campsites (Thélis's clients) by modifying the schedule of future bookings so as to free up as many sellable blocks as possible.