An effort has been made to try to identify the real place of a CLP
programming methodology within more general software development methods,
in particular methodologies for structured knowledge-based software
development. Major steps of CLP program development have been identified.
A certain number of `loops' are mentioned. Though some of them are
probably inevitable, this document should contribute to decreasing their
number for future CLP applications. This document also contains an
`Application Description Framework' (see below), which was proposed
in CHIC to ensure a reasonable level of uniformity
of application descriptions by the Partners. It provides guidelines,
in the form of a structured list of questions, that can also be used for
a much more general purpose: that of checking, at various steps of
an application development process, that no important issue has been
missed.
This document deals with CLP as a method for solving complex problems,
more than with particular CLP languages. Experience in the CHIC project
is based on practical utilization by the partners of various CLP systems:
CHIP (ECRC CHIP compiler, ICL Decision Power, Cosytec CHIPv4), Charme,
internally developed software (at Renault) and ECLiPSe, which, being
the CHIC platform for the second half of the Project, had a privileged
status. The authors, however, believe that most of the lessons learnt
are valid for any CLP system. Accounts of the evaluation and enhancement
of ECLiPSe, an important task of the Project, can be found in other
reports written by the Users and ECRC. The reader interested in a
critical review of currently available industrial CLP tools, be they
languages, libraries or environments, might consider consulting the
review written by Jean-Yves Cras, who, when he was at Bull, was the
person in charge of the Methodology task in the CHIC project [Cras
Review 93]. For a general introduction to CLP, see
[ECRC Tutorial 92].
Most of the issues addressed here concern the Finite Domain
aspect of CLP, since it is the most difficult in terms of Qualification
and Constraint Expression. The Rational solver was actually used in
CHIC only by AIS for financial applications [AIS Report 95]. Where
this solver is concerned, this is explicitly mentioned. The issues
raised by the Boolean solver technology are not addressed here; the
interested reader may consult e.g. the CHIC reports of Siemens
[Siemens CHIC 94].
This remark was initially made by Renault for a production planning
project, but its scope is clearly wider. Such systems are usually not
isolated, stand-alone applications and, therefore, the projects cannot
be managed with their own rules, their own methodology, etc.;
they have to be considered on the same basis as mainstream computer
projects.
While R&D teams are obviously the best suited to develop such an application,
they are often unable - both because of their structure and because
of the backgrounds of the people - to exploit it and to maintain it
correctly. The development of innovative applications should lead
to a technology transfer. Only a joint development - between R&D people
and mainstream computer developers - may ensure the real success,
on a long-term basis, of an innovative project.
According to the Charme Object team, while joint development is a recommended
practice, CLP technology transfer is still hard to achieve. Bull have
had both positive and negative experience with technology transfer.
Clients like Renault or the Gendarmerie Nationale rapidly
became autonomous in CLP development. For others, Bull have to
maintain their applications. This is clearly the case if the application
is sold as a completely packaged product, like their Time-Tabling
application for the Education Nationale. Otherwise, the client
company, or entity if it is inside the same company as the CLP team,
need to be aware that, if they want to be able to maintain or adapt
CLP applications themselves, they have to create a specific competence
internally, and that this is not straightforward. Two years seems,
so far, to be the average time for a computer scientist to become
a reasonably skilled CLP hacker. Maintenance of a relatively small
application at the client's site may not be worth the effort. In contrast,
if the application, or the potential for new applications, is significant,
then having a team of one, two, or more CLP developers is probably
a good investment. These are anyway small teams, for, in any case,
joint development together with the application site's computer developers
is the right procedure, in which everybody performs the task they are
skilled for, within a commonly agreed design.
CLP can be defined as a particular set of problem solving techniques
for decision-support systems. This will be explained in more detail
in the rest of this document. If we accept this definition, it is clear
that several issues that will be addressed about CLP methodology can
also be applied to
The following phases of a software development cycle may be concerned
with CLP:
The informal assignment of the possible roles of CLP within a software
development project given above can be made more precise by reference
to methodologies for structured development of knowledge-based systems,
like, e.g., CommonKADS developed within ESPRIT Project
EP5248 KADS II.
These methodologies are, basically, characterized by:
Within this framework, CLP methodology addresses, and
is restricted to, the following issues:
The purpose of this section is to introduce a tentative formalization
of the main steps of CLP programming. As was said above, these relate
to
These steps are:
These steps are strongly linked. (1) may be re-evaluated with input
from (2), (3) and (4). The problem model in (2) may be reconsidered
if (4) reveals `traps'. Several sequences (2)-(3)-(4)-(1) may occur.
CLP can be used as a modelling tool, thus grouping (2) and (4). The
next sections will be devoted to each of the identified steps. Recommendations
and concrete procedures will be given, as much as possible, and some
typical `traps' mentioned.
We start with a basic statement from the Renault lessons [Renault
IJCAI 93]: Choose the most relevant techniques, whatever they
are. This is not an easy task. Sometimes, one has to combine
techniques to take the best of each one.
Examples are given in the Renault paper. Combination of various techniques,
viz. constraint propagation and linear programming together
with spreadsheet technology, is exemplified by the Short-Term Planning
problem [Renault IJCAI 93; Renault STP 92].
The general issue is that of finding the right method for a given
problem. From CLP's point of view, the questions are:
The competitors are mainly
Most of the discussion on CLP's Characterization in this
document will be related to its comparison with Operations Research.
A short description of Simulated Annealing and a comparison with Constraint
Solving can be found in the Review by J.-Y. Cras [Cras Review 93].
The main idea is that Simulated Annealing addresses large, weakly
constrained problems, while Constraint Solving fits well for tightly
constrained problems. See also the experience of Renault on Car sequencing
[Renault CarSeq 92; CHIP CarSeq 88].
Any attempt to characterize the problem and the methods apt to solve
it presupposes at least some informal description. The following framework
is based on that given in the first methodology report [Bull CHIC
92]. It consists in
Knowledge about the existence or absence of an OR (or other well-known)
approach to solve similar problems is of utmost importance. Renault
[Renault CHIC 93] suggests taking as basic criteria:
- whether or not the problem fits an existing OR model - in particular,
whether it can be linearized - or a simulated annealing model,
- the size of the problem,
with the following conclusions:
According to [Bull CHIC 92], one has to distinguish between:
- global vs. local constraints,
- homogeneous vs. heterogeneous constraints.
CLP is a likely candidate for heterogeneous problems; it is not
a likely candidate for homogeneous problems where a global object
can be identified (e.g. a graph). Often, indeed, in that case,
some OR algorithm already exists and will do the job an order of magnitude
more efficiently than the local constraint propagation typical
of CLP. It is worth noting that the level of globality/homogeneity
of the problem is (in a quite informal way) reflected by the size
of the description.
In the first methodology report [Bull CHIC 92], a set of concepts
is defined in order to establish the notion of solution density. The
search space is defined by structural constraints. The solution space
is defined by specific constraints. The solution density is the density
of the solution space within the search space. It needs to be sufficiently
low for CLP to be a candidate. Pure optimization problems, e.g.,
are not likely to be handled by CLP.
But these notions appear extremely hard, if not impossible, to quantify.
On the other hand, a mere density criterion would not be enough. If
a problem, indeed, has got hard constraints, but a huge search space
and very few solutions (a large problem with a very low solution density),
this may be very bad even for CLP. The situation actually depends
on the pruning power of the available constraint implementations
(which is often difficult to predict). It has also some relationship
with the number of variables - few variables and a lot of constraints
seems to be a good case for CLP - and with the hard vs. soft
character of these constraints: the harder the constraints, the better
for CLP. A problem featuring mostly preferences is bad for CLP: the
user would like these `constraints' to be satisfied, but each of them
in isolation can be violated and it is accepted that a certain number
of them be violated in the solution. Turning these preferences into
a linear combination of some criteria is in general not a very good
idea, unless the problem can be completely linearized and a linear
solver used. Complex cost functions, indeed, usually lead to very
poor propagation with the Finite domains. Some possibilities still
exist to handle `soft' constraints (see below),
but the general idea is that if the problem is dominated
by the preference aspect, then CLP is in general not the right technology.
To sum up, a `good' CLP problem would be characterized by
This is another perspective, also mentioned in [Bull CHIC 92]. CLP
is basically adequate for systems
Renault suggest that, on a long-term basis, CLP may be the most promising
formalism with which to integrate the various disparate problem solving
techniques, be they OR or other [Renault CHIC 93].
One way of integration consists in `packaging' OR algorithms, relevant
to classes of applications, into constraints. The constraints thus obtained
are parametrizable black boxes which fully communicate with the rest
of the CLP solver. A clear advantage of that method is
that efficiency is gained for industrial applications within the declarative
and homogeneous framework of Constraint Logic Programming; being equipped
with a nicely packaged, safe and efficient machinery, the application
developer can concentrate on the specific strategies to solve the
particular problem. A disadvantage might be a loss of purity of the
CLP system: no longer a language (somehow too close to applications
classes), not quite a library of algorithms... But that purity argument
might be relevant only to purists!
Another method consists in case by case integration of algorithms
relevant to a particular application (not a priori a class
of applications). The problem is, then, to have the external algorithm
communicate with the CLP language and, obviously, a key issue there
is to make the algorithm incremental, if this is not initially the
case. Of course the question might arise whether the CLP language
is still useful.
A study on the possibility of integrating novel search techniques
like Genetic Algorithms in CLP was done in the CHIC project [ECRC
NovelSearch 92].
If one wants to use certain CLP systems (CHIP, ECLiPSe), in particular
for prototyping purposes, one can be faced with the problem of choosing
between the Linear Solver over Rational Numbers (LS) and Propagation
over Integers with Finite Domains (FD). LS is global and complete
(one should never, e.g., add redundant constraints!). FD, in
contrast, is local, but can better allow for non-linear aspects. We
give hereafter a short comparison. It should be noted that if the
problem completely fits into the framework of linear programming tools
such as CPLEX and OSL, and if performance is a key issue, it is hardly
meaningful to use the LS of CLP, for `the impressive performance of
dedicated [new linear programming] tools goes far beyond what can
be achieved by the simplex algorithms integrated into Constraint Satisfaction
tools' [Cras Review 93]. Only if the application also implies, e.g.,
some degree of interaction, or some clever rule-based heuristics, etc.,
can LS be a better candidate.
As to the specific constraints (those linking various objects
and defining the solutions), they should probably be described in
a `flat' relational formalism, trying to exhibit global properties
wherever this is relevant. The same kind of approach should apply
to rules. In Charme Object, e.g., constraints are
also objects, but it does not seem relevant to describe them this
way at the Problem Modelling level. According to the Charme Object
team, having constraints implemented as objects is useful at the constraint
prototyping and implementation levels, for structure
and conciseness of the code.
See also the enhanced model, called Object-oriented constraint
satisfaction problem, defined by Bull [Bull CSP 94].
It is suggested [Bull CHIC 92] that CLP itself can be used as a modelling
tool. Thus, Problem Modelling and Constraint Expression and Prototyping
could be merged. However, it seems better to keep the levels separate,
since merging them may mean early hacking to the detriment of the
modelling effort.
Conclusion 1: The CLP-oriented problem modelling
formalism is still to be invented; however, an object-oriented description
of the application entities together with a flat and exhaustive semi-formal
constraint enumeration seems to be a practical compromise.
Conclusion 3: If a description of the problem exists, but
has been made using a particular formalism oriented towards a non-CLP
programming paradigm, it is probably a good approach to reconsider
this description from scratch. A new elicitation of expertise is
necessary. Information layers, indeed, not fitting naturally into
the initial paradigm, may have been omitted or distorted.
The following recommendations come mainly from the first methodology
report [Bull CHIC 92] and the experience of the Charme Object team:
A main issue is that of the degree and nature of the interaction the
system should provide. The focus here is on problems arising with
fully automatic systems as well as on the issue of explaining the
results given by the system and handling constraint failure. Solutions
are sketched. This issue is not related to CLP per se, but
rather to decision-support systems designed to solve very complex
problems.
In that respect a fundamental, yet often missed, recommendation is
given by Renault [Renault IJCAI 93]: Help people to do their job,
do not try to replace them. The quite common attempt of substituting
automatic systems for the people in charge of the job often leads
to many difficulties. On the contrary, the user is the only judge
in identifying which objectives are the most important and which
are or are not negotiable. These objectives may change from one month
to another. One should look for a cooperative framework, combining
the strengths and skills of both the user and the system. Try to replace
people only where it is appropriate, that is for those tedious tasks
for which they are not particularly skilled.
Automatic systems often face the following difficulties:
Misunderstanding the real need is a quite common mistake. As an example,
in production planning, one often focuses at the beginning on automatic
optimization tools, computing solutions `from scratch', whereas the
real need is for a decision-support system for plan revision. This
is also a lesson from Renault [Renault IJCAI 93]: Production planning
is more often concerned with revising obsolete plans than creating
new plans. Most of the daily job consists of adapting, revising, refining
plans that are now obsolete or not detailed enough. Replanning requires
taking into account what has already been done or commitments based
on the previous plan. The real function of the system has to be identified
from the very beginning. According to [Renault CHIC 93: 3.1.3], one
has to `be very vigilant right from the start that the user is not
seeking a decision analysis tool rather than a one-shot tool whose
aim is just to generate one solution. For example in LP problems,
it is more difficult to implement decision analysis tools than just
a pure solution generation tool'. In that case FD might be more appropriate.
According to the Charme Object team, there is, indeed, a strong tendency
for the end-user to ask for sophisticated graphical tools allowing
them to easily build and modify solutions, with the system merely
indicating constraint violations.
Explaining the results given by the system is a very general problem,
likely to be encountered with any technology for solving very
complex problems: it will in any case be hard for the user to understand,
check the validity of, and accept the solutions computed by the system.
Failure is also unacceptable if no indication is given to the user
as to what to do next. In this respect, the situation may be very
bad with CLP: if one does not take care, definitive failures can be
produced, with no intermediate, partly feasible results generated,
as would be the case if the solution were obtained by iterative
refinement. With CLP, the user should have a way of reconsidering
the constraints imposed on the solution. Unfortunately, there is no
general method for explaining failures and detecting which constraints
have to be relaxed. If technologies like dependency-based backtracking
or ATMS were better mastered and worked at a reasonable
cost, perhaps something might be done in terms of finding the latest
decision causing a given failure. But this is apparently not the case
(see the implementation by ECRC of an Intelligent Backtracking library
in ECLiPSe, based on [CERT IBT 93], for a non-conclusive attempt
to implement intelligent backtracking methods). Methods like that
proposed e.g. in [De Backer & Beringer IJCAI 93: p.300] for
the detection of minimal inconsistent sets are restricted to linear
programming.
Therefore, one has to find practical answers for making the
results understandable and giving ways out of states of failure. The
bad news is that this is a serious problem with CLP. The good
news is that constraints make an answer possible in a natural way.
Two methods were proposed in CHIC, by Renault and Dassault, both relying
on the properties of the particular application and on the user's
expertise and initiative.
The method [Renault CHIC 93: 3.3] is the following:
The advantages of such a method are the following:
Both methods rely, somehow, on incremental setting of constraint
types. In the Renault example, the method is `fully' incremental:
the new constraints are added to the previous computation state of
the linear solver. In the Dassault case, since the FD solver is used,
which is not complete, an enumeration phase is required at each step
to guarantee the existence of a feasible solution. The instantiations
done in this phase have to be undone before new constraints are added.
These methods provide practical answers, relying on the knowledge
the end-user has of the domain. From a theoretical point of view,
when inconsistency is detected, the whole set of constraints can be
assumed to be responsible. Identifying the last introduced subset
as the cause of inconsistency is a pragmatic approach, founded
on the fact that the user will normally enforce first those constraints
that have in any case to be satisfied and then those that are, in
some sense, less fundamental. Practically, the methods give a way
of backtracking to a less constrained, partially valid, state.
Both methods imply a significant amount of man-machine interaction.
This is a crucial aspect, very specific to CLP. It has to be reconsidered
whenever necessary in the course of the project and relies on results
from Problem Characterization, Problem Modelling and System Specification.
Each subproblem, as identified in the Problem Modelling, has to be
handled separately, but the different subproblems are likely to interact.
One should try to solve first the apparently most constrained subproblems
and, if they are feasible, the whole problem should be feasible (see
below, Prototyping in CLP, 8.3).
Most of the issues here relate to Finite Domains. Issues specific
to scheduling problems are addressed in [ECRC Scheduling 94].
We give hereafter some hints for variable choice and constraint expression.
In the Car Sequencing problem, as handled in the original CHIP [CHIP
CarSeq 88], the sequencing constraints apply to 0/1 variables representing
the absence or presence of options on the cars. In the Charme version
of this program, this is replaced by global distribute constraints
[Bull CharmeObject 94].
See the use of element constraints to express finite mappings
in the Braghenti loom scheduling application [ECRC Braghenti 94].
The Dassault Planning Program features a number of 0/1 variables
that are explicitly instantiated in the labelling procedure, viz.
one variable per aircraft, representing the decision whether or not
to change the production rate when this aircraft is manufactured and,
if so, whether to increase or decrease it. This leads to poor propagation.
A constraint model focused on `where to change the rate' rather than
`whether to change or not at each a priori possible point' might yield
better results.
In [Bull Sets&Graphs 94: 4.3] an example is given of how an array
of 0/1 variables can be efficiently replaced by set constraints to
solve a bin-packing problem (`packing' songs into records). Note that
set partitioning constraints are global constraints.
If the search space is too large, one has to accept the non-completeness
of the search. According to the Charme Object team, it is probably
better to leave only a few (significant) possibilities
at each node of the search tree down to a certain depth, cut the rest,
but let the system run through the whole reduced tree, than, e.g., to
leave all possibilities but impose a time-out. In other words, it might
be better to run completely through a less `dense' search tree than
to search exhaustively a very local part of the initial search
tree.
Thus, for many disjunctive problems, it is probably a good approach
to
In MADE, the Dassault team has adopted a strategy which consists in
a non-deterministic task assignment on the bottleneck resource, but
a deterministic assignment on the other work stations. This, in general,
suppresses potential solutions, but the search space is so large that
backtracking on secondary aspects should at any rate be avoided.
In scheduling problems, significant choices are e.g.
With dichotomic labelling, any failure occurring when a domain is
split discards a whole half of the domain and the system directly
backtracks to the other half. This usually gives good results for
problems like cutting stock [CHIP CuttingStock 88] or placement problems
(see, e.g., the part geometrical arrangement module of the
Dassault scheduling application [Dassault PAP 94; CHIC 95]).
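By way of illustration only, a dichotomic labelling procedure might
be sketched as follows in an ECLiPSe-like fd syntax (mindomain/2 and
maxdomain/2 give the current domain bounds; the predicate name is ours):
% split the domain in two halves; on failure, backtracking jumps
% directly to the other half
dicho_labeling([]).
dicho_labeling([X|Xs]) :-
    ( integer(X) ->
        dicho_labeling(Xs)
    ;
        mindomain(X, Min), maxdomain(X, Max),
        Mid is (Min + Max) // 2,
        ( X #<= Mid ; X #> Mid ),
        dicho_labeling([X|Xs])
    ).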
The issue of soft constraints was already mentioned in Problem
Characterization. Preferences can in principle be turned into linear
combinations of the criteria and the initial problem recast into the
problem of optimizing this cost function. Assuming this is meaningful
(which is often questionable), either the problem is completely linear
and, then, the Linear Solver or an OR linear package will be appropriate
provided the problem is not too large, or the problem features
a number of discontinuous aspects, like disjunctions. In this latter
case, the Finite Domains are in principle appropriate, but
a complex cost constraint leads to very poor propagation and
this method is not to be recommended.
In Charme Object, to each constraint is associated a Boolean variable
representing its truth value and, thus, meta-constraints can be expressed
[Bull CharmeObject 94]. Therefore, it is theoretically possible
to optimize, e.g., a linear combination of constraint truth
values, as a means of achieving a maximum preference. This process
may not be very efficient, but experience is so far too limited to
allow definitive conclusions.
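By way of illustration only (the Charme Object syntax is not reproduced
here), the idea can be sketched in an ECLiPSe-like syntax, assuming a
solver where a comparison can be reified into a 0/1 truth-value variable
with B #= (Expr) and where a branch-and-bound minimize/2 is available:
% maximize the number of satisfied preferences by minimizing the
% number of violations; the three preferences are purely illustrative
preferences([X, Y, Z], Violations) :-
    [X, Y, Z] :: 1..10,
    B1 #= (X #< Y),                 % preference: X before Y
    B2 #= (Y + 2 #<= Z),            % preference: Z at least 2 after Y
    B3 #= (X #= Z),                 % preference: X equal to Z
    Violations #= 3 - (B1 + B2 + B3),
    minimize(labeling([X, Y, Z]), Violations).
Note that not all three preferences can hold together; the search simply
settles for the least violated assignment.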
Soft constraints can sometimes be allowed for by adequate heuristic
methods in the enumeration procedure. The idea is to use domain-dependent
knowledge to design strategies apt to ensure a `good' first solution
with respect to user preferences.
Hard constraints can be derived from the soft ones to help to prune
the search tree. Often, simple and strong cost constraints can be
designed, which do not exactly match the user's criteria, but will
lead to `good' results and allow efficient pruning of the search space.
This issue will be discussed in more detail below (8.2.4).
Finally, it is always possible to design a semi-automatic system,
with which the user will be able to express his preferences during
the construction of the solution. This may mean reconsidering the
Communication Model in the System Specification.
According to the Dassault team, when designing an interactive system,
one should clearly distinguish between cases when
Problem modelling often exhibits several kinds of cost factors. It
is important to see which costs can efficiently be exploited during
the search and which ones are merely results to be reported to the
user as indicators.
Whenever
The following features of the first prototypes are important for the
quality of further development:
One should prototype first the most constrained subproblems [Bull
CHIC 92]. If they are feasible, the whole problem should be feasible.
Strongly coupled subproblems should be handled as a whole and one
should not impose a decomposition on the problem too early in the
Problem Modelling phase. That, indeed, might jeopardize `transversal'
propagations that could help to solve locally `stuck' situations.
One should rather try and keep a global approach to the problem. There
are, of course, quantitative limits (like the number of constraints
that can be set up).
It may happen, on the contrary, that a problem is initially stated as
global, but can eventually be split into independent subproblems.
It is extremely important to handle real data very early in the prototyping
process, in order
Some recommendations for practical measure of complexity:
Debugging is greatly eased when the kind of modular and incremental
methods of setting up constraints mentioned above are implemented.
The need to explain a failure during development
bears, indeed, a lot of similarity with the need to have the end-user
understand the reason why the system does not find any solution. Even
for the developer, implementation of such methods may be necessary
for the understanding of the system's results.
`Over-using' powerful built-ins may make things harder to debug: refrain
from using constraints or predefined heuristics, e.g., when
simple computations or sorting procedures can do the job.
Inconsistency in the data is a common source of trouble. When possible,
it may be worthwhile to have simple ad-hoc procedures to
detect major inconsistencies before running the main system.
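A minimal sketch of such a pre-check (the task/3 record structure and
the consistency rule are purely illustrative):
% report tasks whose declared end date precedes their start date,
% before any constraint is set up
check_data([]).
check_data([task(Id, Start, End)|Ts]) :-
    ( End >= Start ->
        true
    ;
        printf("inconsistent task %w: end %w before start %w%n",
               [Id, End, Start])
    ),
    check_data(Ts).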
Another interesting approach consists in focusing on key variables
and monitoring their changes (in the FD). This can be achieved by
daemons set on the relevant domain changes. A dynamic graphical
interface, allowing this kind of monitoring to be achieved visually,
is usually of great help in the development process. Tools for that
purpose have been developed in CHIC [ICL Tools 94], based on ECLiPSe
suspension mechanisms [ECRC ECLiPSe 94].
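A minimal monitoring sketch, assuming ECLiPSe's suspend/3 with fd-style
waking conditions (written here as X->min and X->max; the exact
qualification of the condition names varies between versions):
% print every bound change of a key variable; the daemon re-suspends
% itself until the variable is instantiated
monitor(Name, X) :-
    ( var(X) ->
        suspend(report_change(Name, X), 3, [X->min, X->max])
    ;
        true
    ).
report_change(Name, X) :-
    ( integer(X) ->
        printf("%w bound to %w%n", [Name, X])
    ;
        mindomain(X, Min), maxdomain(X, Max),
        printf("%w narrowed to %w..%w%n", [Name, Min, Max]),
        monitor(Name, X)
    ).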
If the system end-user has got a strategy for solving the problem,
it may be useful to have the system provide his solution as a reference
solution, at least in the prototyping phase, instead of ignoring it.
In certain cases, this strategy, or parts of it, may prove adequate
for the constraint system. In addition, it has to be shown that the
solutions provided by the system are at least as good as those the
user would find using his own methods. This is a discipline for the
developer and a condition of acceptance of the system by the user.
CLP languages like ECLiPSe or CHIP are complete languages, in the
sense that
In contrast, the tools developed at Renault are integrated in a wider
environment and mix constraint-based programs with other resolution
techniques. Therefore, the philosophy adopted for these developments
is that of libraries of tools and a C/C++ `glue'.
This section about Constraint Optimization and Definitive Implementation
deals with Performance improvement, either at an advanced prototyping
stage, or for the definitive implementation. It is completely specific
to CLP. The following issues and recommendations basically come from
the first methodology report [Bull CHIC 92]. The discussions concerning
structural constraints, application-dependent knowledge and practical
complexity are general to CLP, but the rest of the section is specific
to the Finite Domains.
Implementation of structural constraints can be performed
See Bull's experience of integrating a transportation algorithm with
Charme.
The operational behaviour of constraints should be studied in terms
of
In general, at this implementation level (not at the modelling/prototyping
level), one should use powerful constraints only when their expressive
power and inference capacity are actually required, and simpler constraints
should always be preferred to complex ones when they allow the same
deductions.
In Charme (using an ECLiPSe-like syntax)
[X, Y, Z] :: 1..3,
alldifferent([X, Y, Z])
is better than
distribute(3, [X, Y, Z], [1, 2, 3], [1, 1, 1])
(i.e. `distribute the 3 variables X, Y, Z onto the values 1, 2, 3,
each value being taken once').
Issues concerning the directionality of deduction are also interesting.
While in principle constraints are completely adirectional, in most
cases a mainstream of information propagation may be detected in a
running program. If, for instance, in a constraint relating A and
B, one observes that 90% of the information comes from changes in
the domain of A and results in modifications in the domain of B, it
is a natural idea to favour the mainstream by writing a restricted
version of the constraint that works in one direction only. Once again,
powerful constraints should be used only when necessary.
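As a sketch of the idea, assuming the same fd-style suspension primitives
as above, a one-way version of the constraint B = A + 1 that propagates
bound changes from A to B only might look like:
% one-way propagation: wake only on changes to A, never on changes to B
one_way_succ(A, B) :-
    ( var(A) ->
        mindomain(A, MinA), maxdomain(A, MaxA),
        MinB is MinA + 1, MaxB is MaxA + 1,
        B :: MinB..MaxB,            % narrow B from A's current bounds
        suspend(one_way_succ(A, B), 3, [A->min, A->max])
    ;
        B is A + 1
    ).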
On the other hand, when gaps are detected in the propagation
mainstreams, redundant (surrogate) constraints can be introduced
(only) as a means of filling these gaps. (Note that this applies to finite
domains only. When using the Linear Solver, on the contrary, redundant
constraints are a burden for the system and should be avoided. Note
also that `redundant' refers to the logical aspect - the constraint
being logically implied by the others is, in that sense, unnecessary
-, whereas `surrogate' refers to the procedural aspect - the constraint
performs actions that the others cannot.)
Assume that non_overlap(StartA, DurA, StartB, DurB) is an
FD constraint, defined by conditional propagation (daemon) mechanisms,
expressing that two tasks A and B may not overlap in time. Then, merely
stating
non_overlap(StartA, DurA, StartB, DurB),
StartA + DurA #<= StartC, % task C follows A ...
StartB + DurB #<= StartC, % ... and B
is not enough for the system to deduce that
StartC #>= minimum([StartA, StartB]) + DurA + DurB
and it is worth adding this explicitly.
In general, it is worth playing with the prototypes, and even designing
small application-specific trace tools to detect this kind of optimization
possibility.
As mentioned above, a (careful) use of user-defined constraints can
be recommended, in order to
Approximate Generalized Propagation [ECRC GP 93] is a promising
approach to make this work easier, since it allows both a declarative
statement of the constraints and the tuning of the amount of information
deduced at each step.
With Constraint Handling Rules [ECRC CHR 93], a high-level
language (Prolog) is used to define the propagations.
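The classic less-than-or-equal example gives the flavour (written in
the rule syntax of later CHR systems, omitting the constraint declaration;
the CHIC-era ECRC syntax may differ):
leq(X, X) <=> true.                  % reflexivity: simplify away
leq(X, Y), leq(Y, X) <=> X = Y.      % antisymmetry: replace by equality
leq(X, Y), leq(Y, Z) ==> leq(X, Z).  % transitivity: add the consequence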
We call `CLP team' the people who faced the problem of applying CLP
to the application, within the lifetime of the CHIC project. `Previous(ly)'
refers to a situation prior to the moment when the CLP team started
or took over the job.
[Bull CharmeObject 94] Charme Object documentation, Bull-Cediag,
1994.
[Bull CHIC 91] Jean-Yves Cras, Benoît Guinaudeau, Outline
of CHIC User Requirements and First Methodologic Notes, CHIC deliverable
report D 2.1.2.1, Bull-Cediag, June 1991.
[Bull CHIC 92] Jean-Yves Cras, Michel Leconte, Benoît Guinaudeau,
Towards a Constraint Programming Methodology, CHIC deliverable
report D 2.1.2.2, Bull-Cediag, January 1992.
[Bull CHIC 92'] Michel Leconte, Positioning CLP Methodology
in CHIC, CHIC report, Bull-Cediag, June 1992.
[Bull CHIC 93] Michel Leconte, Francis Leyter, Update
Specification of Logic Constraint Programming Methodology, CHIC
report, Bull-Cediag, June 1993.
[Bull CSP 94] Massimo Paltrinieri, Some Remarks on the
Design of Constraint Satisfaction Problems, Bull, 1994.
[Bull Disjunctive 94] André Guillaud, Michel Leconte, Disjunctive
Constraints, a Global Approach, CHIC deliverable report D 4.2,
Bull, June 1994.
[Bull Formalism 90] Jean-Yves Cras, Conventions for Constraint
Expression, CHIC non-deliverable report, Bull-Cediag, September
1990.
[Bull Sets&Graphs 94] Michel Leconte, Implementation of
Sets and Graphs Constraints, CHIC deliverable report 5.1.1, Bull,
January 1994.
[Bull Tutorial 92] Peter Chan, Tutorial on Constraint
Programming, Bull-Cediag, 1992.
[CERT IBT 93] Eric Bensana, Intelligent Backtracking,
CHIC deliverable report D 5.3.2.2, CERT, June 1993.
[CHIP CarSeq 88] Mehmet Dincbas & al., Solving
the Car Sequencing Problem in Constraint Logic Programming, ECAI-88,
Munich, 1988.
[CHIP CuttingStock 88] Mehmet Dincbas & al., Solving
a Cutting Stock Problem in Constraint Logic Programming, Fifth
International Conference on Logic Programming, Seattle, 1988.
[CHIP Warehouse 89] Pascal Van Hentenryck, A Logic Language
for Combinatorial Optimization, Annals of Operations Research:
Special Issue on Links with Artificial Intelligence, 1989. (Warehouse
location problem pp. 20-26.)
[Cosytec JFPL 92] Nicolas Beldiceanu, Abderrahmane Aggoun, Extending
CHIP in order to solve Complex Scheduling and Placement Problems,
JFPL, Lille, France, 1992.
[Cras Review 93] Jean-Yves Cras, A Review of Industrial
Constraint Satisfaction Tools, AI Perspectives Report, AI Intelligence
(PO Box 95, Oxford, OX2 7XL, UK), 1993.
[Dassault CHIC 95] André Chamard, Annie Fischler, Applying
CLP to a Complex Scheduling Problem - The MADE System, CHIC deliverable
report D6.5.3, Dassault Aviation, January 1995.
[Dassault PAP 92] Jacques Bellone, André Chamard, Claudine
Pradelles, PLANE, An Evolutive Planning System for Aircraft Production
written in CHIP, The First International Conference on the Practical
Application of Prolog, London, April 1992.
[Dassault PAP 94] André Chamard, Annie Fischler, MADE,
A Workshop Scheduler System written in CHIP, The Second International
Conference on the Practical Application of Prolog, London, April 1994.
(Work partially funded by CHIC.)
[Dassault PAP 95] André Chamard, Decision-support Systems
for Planning and Scheduling Aircraft Manufacturing, The Third
Conference on the Practical Application of Prolog, Paris, April 1995.
(Work partially funded by CHIC.)
[De Backer & Beringer IJCAI 93] Henri Beringer, Bruno De Backer,
Satisfiability of Boolean Formulas over Linear Constraints,
IJCAI-93, Chambéry, France, 1993.
[ECRC Braghenti 94] Pierre Lim, The Braghenti scheduling
application, CTAI Workshop, 1994.
[ECRC CHR 93] Thom Fruhwirth, Temporal Reasoning
with Constraint Handling Rules, CHIC deliverable report D 5.3.1.2,
ECRC, January 1993.
[ECRC Disjunctive 93] André Veron, Disjunctive Scheduling
in Elipsys, APPLAUSE report ECRC.02, May 1993.
[ECRC ECLiPSe 94] ECLiPSe Extensions User Manual, ECRC,
January 1994.
[ECRC GP 93] Thierry Le Provost, Mark Wallace, Approximate
Generalized Propagation, CHIC deliverable report D 5.2.3.3, ECRC,
January 1993.
[ECRC NovelSearch 92] Volker Kuchenhoff, Novel Search
and Constraints - an Integration, CHIC deliverable report D
5.3.3.2, ECRC, December 1992.
[ECRC Scheduling 94] Mark Wallace, Constraints for Scheduling,
in Constraint Programming, Mayoh, Penjaam and Tyugu eds.,
NATO Advanced Study Institute Series, Springer Verlag, 1994.
[ECRC Tutorial 92] Thom Fruhwirth & al., Constraint Logic
Programming - An Informal Introduction, Tutorial, Logic Programming
Summer School, Zurich, September 1992.
[ECRC-Cosytec Debugger 92] Abderrahmane Aggoun, Emmanuel van
Rossum, Mark Wallace, Specification of a CHIP debugger, CHIC
deliverable report D 2.3.1, ECRC, January 1992.
[ICL Tools 94] Owen V.D. Evans, Tools to Visualise
Domain Variables, CHIC deliverable report D5.3.4.2, June 1994.
[Renault 2GES 95] Jean-Marc David, Les systèmes Experts
de Seconde Génération, Techniques et Sciences Informatiques, to
appear.
[Renault CarSeq 92] Chew Tat Leong, Jean-Marc David, Yves
Tourbier, Solving Constraint Satisfaction Problems with Simulated
Annealing: The Car Sequencing Problem revisited, Avignon 1992.
(Work partially funded by CHIC.)
[Renault CHIC 93] Chew Tat Leong, Evaluation of Scheduling
Using CHIP, CHIC deliverable report D 3.1.2, Renault, January
1993.
[Renault CONSTRUCT 92] Alain Nguyen & Jean-Marc David, Relationships
between Problem Features and Problem Solving Methods, Contribution
to Deliverable D/WP5/1, ESPRIT II Project EP5477 CONSTRUCT, May 1992.
[Renault IJCAI 93] Jean-Marc David, Chew Tat Leong, Pushing
Innovative Techniques into Production Planning; some lessons learned
so far, IJCAI Workshop on Knowledge-based Production Planning,
Scheduling and Control, Chambéry, 1993. (Work partially funded by
CHIC.)
[Renault STP 92] Chew Tat Leong, Jean-Marc David, A
Constraint-based Spreadsheet for Cooperative Production Planning,
AAAI SIGMAN Workshop on Knowledge-based Production Planning, Scheduling
and Control, San Jose, California, 1992. (Work partially funded by
CHIC.)
[Siemens CHIC 94] Erik Tidén & al., Verifying Sequential
Circuits with CVE: Technological Considerations and Assessments, CHIC
deliverable report D 3.2.4, Siemens, 1994.
2 CLP: an Innovative Technology. Caveats.
2.1 Three general lessons
We will start with general caveats concerning projects which rely
on innovative techniques, given by Renault [Renault IJCAI 93]. They
come from their experience on Short-Term Production Planning, Vehicle
Sequencing and Vehicle Sorting projects [Renault STP 92], for which
CLP was either applied, or an a priori potential candidate.
We quote here three of these lessons, the most general ones (others
will be commented later in the document). This will introduce a key
issue: the place of CLP within a general software development cycle.
2.2 Think in terms of organization, not in terms of algorithms
The optimization algorithm is only part of the success. More important
is the proposed organization. A lot of work is devoted to integrating
the application within existing information systems, developing user-oriented
functionalities to support the daily work, etc. , and it might
happen that these functionalities soon constitute, for the future
user, the main motivation for the new system. The development of optimization
algorithms tends to be a `small' part - even if a critical part -
of these projects, from 10 to 20 % of the global development effort.
Dassault noted for their MADE scheduling application [Dassault PAP
94; CHIC 95] that the Static Analysis and Simulation function, i.e.
the non-constraint part, tended to be considered by the factory staff
as at least as important as the constraint-based part.
2.3
In CHIC, Dassault acknowledged that their workshop scheduling project
had, to a great extent, `escaped' from the standard company procedures
for production management computer projects. A reason for that was
that it had started as a research project, with a prospective goal.
Another reason is worth mentioning, for it resides in the clients'
expectation: AI in general and CLP in particular, with their flexibility
and ability to quickly provide prototypes, inducing feed-back from
the client, etc. , were perceived as an alternative to the
heavy and rigid company projects.
2.4
This is also a problem acknowledged by the CLP team at Dassault Aviation,
which is an R&D team: who will maintain their constraint applications?
When the number of operational applications increases, new structures
need to be found.
A way of solving this problem was found at Renault: Chew Tat Leong
moved from the AI Group to a Production Department during the CHIC
Project in order to ensure industrialization and maintenance of the
constraint-based applications.
These lessons show that CLP-based developments are just a part of
larger software developments, including exploitation and maintenance.
The next section will be devoted to identifying more precisely the
place of this technology within general software development methodologies.
3 Place of CLP within General Software Development Methodologies
3.1
Two examples:
1. The difficulty in predicting the performance of CLP is often
more related to the intrinsic complexity of the problem (an NP-complete
problem for instance) than to the technology. Mixed Integer-Linear
Programming applied to such problems will exhibit the same unpredictable
behaviour.
2. The difficulty in having users understand and accept solutions
provided by the system is also often caused by the deep complexity of
the problem, which makes the solutions hard, if not impossible, to
check by a human operator.
The distinction between what is really specific to CLP and what has
a more general scope will be made in this document as far as the authors
were aware of it.
3.2 Role of CLP within a software project
The first three items belong to an Analysis and Pre-design step of
a development cycle. CLP is at this stage a particular candidate method
and, possibly, a prototyping tool. In the Implementation phase, the
implementation of the problem solving capabilities is performed using
a particular (CLP) language. It should be noted that once the definitive
specifications have been written and feasibility of the resolution
methods demonstrated, then nothing actually differentiates a `CLP-based'
development cycle from a classical one. Implementation raises, of
course, problems that are specific to the technology, but this is
the case for any technology. Within the Analysis and Pre-design step,
CLP may be specific in that prototyping is both
The following naive diagram may help in assessing the place of CLP within
the overall project development effort:
3.3 CLP and methodologies for knowledge-based software development
An overview is given in a paper by J.-M. David, of Renault, about
`Second Generation Expert Systems' [Renault 2GES 95], from which the
following scheme, representative of these methodologies at an abstract
level, is taken. The Libraries of reusable models, the Task Model
and Communication Model have been added:
In the present document, these aspects are addressed in different
sections:
of a project development cycle. These steps are not really in chronological
order. The whole process, indeed, involves numerous loops. The purpose
of methodology, however, is to contribute to decreasing their number
for future CLP applications. The basis of this formalization is the
first methodology report by Bull [Bull CHIC 92]. Not all these issues
are specific to CLP, but they are all crucial.
1- Problem Characterization
2- Problem Modelling
3- System Specification. The Communication Model
4- Constraint Expression and Prototyping
5- Constraint Optimization and Definitive Implementation
5.2 First Problem and Context Description
5.3 CLP vs. competitors. External qualification criteria
This classification is rather general and external. In the following
sections, we address criteria based on a set of more specific and intrinsic
problem features.
Bull reported in CHIC on work done to integrate a transportation algorithm
inside Charme, to solve a particular problem for which the required
performance level had not been reached with Charme alone. The algorithm
was already incremental and reasonably extensible. Looking back on
it, the conclusion of the Charme Object team is that transportation
problems are, actually, very well handled by such algorithms alone,
without CLP's intervention.
See the long description required for the Dassault scheduling problem,
a quite heterogeneous problem (it has geometrical and temporal aspects)
with constraints depending on the particular work station.
5.5 A characterization procedure
This procedure is taken from the first methodology report [Bull CHIC
92]. Among the identified steps are not only a First Problem Description
(see above), but also a First Problem Modelling as well as Prototyping
phases, which will be discussed in the next sections. This is an
instance of `loops' in the development process. The procedure is
as follows:
The attempt developed here for a CLP-oriented characterization procedure
can be viewed as a part of a more general enterprise like that of
the CONSTRUCT project, mentioned above.
It is, indeed, an aim of the COMMET framework developed in this project
to allow a mapping between
Two kinds of matching are defined, which bear some similarity with
the approaches developed here:
Collecting the necessary know-how is an encyclopedic enterprise, into
which the CLP methodology should be integrated. This would require
establishing accepted and reliable benchmarks to compare the various
techniques, the lack of which is mentioned by Renault [Renault CHIC
93].
The introduction of `disjunctive' constraints in ECLiPSe and Charme,
based on methods taken from the scheduling world, is an instance of
this type of integration.
Cosytec are going further and further in this direction. Their latest
version of CHIPv4 features very powerful constraints for various kinds
of problems: disjunctive and cumulative resources, placement, cycle, etc.
See the already mentioned transportation problem solved by Bull via
integration of an OR algorithm with Charme. According to the Charme
Object team, linking these algorithms with the Finite Domain solver
is, actually, not completely natural (their pivoting operations are
exchange-based mechanisms, not propagation: this is quite disconnected
from the constraint-engine).
Deciding which solver to choose may not be easy, even
with the criteria in the table below. Dassault is working on a Training
Curriculum Optimization problem which features both disjunctive aspects
(+FD, -LS) and a complex cost structure (-FD, +LS). They will try
to solve it using the Linear Solver of parallel ECLiPSe, with sophisticated
enumeration procedures executed in parallel. (This is one of their
contributions in the APPLAUSE ESPRIT III project.)
Problem Modelling should:
Problem Modelling is not resolution method-independent. This is not
specific to CLP being a candidate. If, e.g., it is assumed
that Linear Programming might solve a given problem, then Problem
Modelling will consist mainly in linearizing the problem (and checking
whether this is acceptable or not). Problem Modelling should be performed
with a formalism and procedures that are adequate to the candidate
method. This section therefore contains:
Renault suggests trying to apply the OR formulation methods used for
LP problems, since there are many examples in textbooks [Renault CHIC 93].
However, the first methodology report [Bull CHIC 92] stresses that,
with mathematical models coming from OR, one should be aware that
Since industry is moving more and more towards objects, an object-oriented
methodology will often be available. It is then, probably, a good idea
to use it. However [Bull CHIC 92], when using object-oriented
knowledge representation formalisms, one should take care of:
Note that the difficulty in properly representing transversal relations
with object-oriented formalisms is not specific to constraints. It
is also the case with rules. These relations, indeed, tend to be hidden
inside the descriptions of the objects. On the other hand, an object-oriented
description of the application entities , including the structural
constraints (those defining the entities, like the relationship
between start date, duration and end date of a task), is quite appropriate,
especially if it is supported by object-oriented features in the CLP
language at the implementation level (like with Charme Object, based
on C++, or CHIPv4, which has an object-oriented layer). According
to the Charme Object team, object-oriented modelling allows, in particular,
much of the object model to be shared between the constraint core
and the other modules of the application, e.g. the graphical
interfaces. The Dassault team, from their experience with CHIPv4,
confirm the advantage of defining an object-oriented data model and
specializing it for the various application modules, since this makes
extension and modification of the model easier.
In a quite early stage of the CHIC project (before the official kick-off),
a tentative formalism was proposed by Bull [Bull Formalism 90]. The
idea underlying this formalism was to allow both a kind of mathematical
formulation and a relatively straightforward translation to constraint
expression. Dassault wrote a full description of their scheduling
problem using this framework. The proposed formalism proved too heavy
and stringent, and the attempt was not pursued. Yet, the experience
had the positive effect for Dassault of clarifying certain issues
(like the status of certain rules) and of helping them to ensure exhaustivity
of the constraint enumeration.
A question by Jean-Yves Cras about the Dassault workshop problem was
the following one: does the statement `Heat treatment is performed
at night' express a hard constraint, or rather an organization rule?
The answer is: no, it is not a hard constraint, heat treatment can
be performed during day time, it has simply been found more rational
and simpler to have it systematically at night, but this might change
if a scheduler system is implemented.
7 System Specification. Communication Model
The first Dassault scheduler prototype automatically computed detailed
schedules for all operations inside the workshop. It would have replaced
the man in charge of work releasing, the foreman in his work
distribution function, and each worker's initiative
in organizing their own work (dates in tenths of hours were assigned
to the tasks!). This was criticized as being far too rigid and stringent.
The real need, identified after the first prototyping phase, was a
workload smoothing-oriented decision-support system, with analysis
tools.
The system developed at Dassault for long-term planning of aircraft
production [Dassault PAP 92] was first aimed at computing
optimized schedules without any intervention of the expert.
This was not a proper approach, for the system could not allow for
all aspects the expert would take into consideration and, even when
restricting the problem to the well-formalized constraints, finding
solutions better than those the user would find by `visual' reasoning
proved infeasible in many cases.
7.4.2.2 Constraint setting under user control
This is a method implemented in the Dassault assembly line balancing
application as a way of giving the user a reasonable control over
the construction of an understandable and coherent solution. The user
chooses the sets of constraints (constraints of the same types) he
wants to take into account. This is similar to the Renault method,
but the order on the sets is only partially imposed.
In the Dassault assembly-line balancing application [Dassault
PAP 95], setting only the precedence constraints makes it possible to
check the consistency of the precedence graph, which is input data.
It also gives a lower
bound for the total cycle duration, which provides a basis for evaluating
the impact of the rate constraints when they are added. Accessibility
constraints may or may not significantly influence the schedule, which
can be assessed by comparing the results with and without them. Similarly,
the level of constraint on the resource aspect (skilled worker manpower)
can be estimated.
The need for a more global expression, together with increased performance,
led e.g. to the development of the disjunctive constraint
at ECRC, now available in ECLiPSe [ECRC Disjunctive 93], as well as
of the disjunctive constraint in Charme Object [Bull Disjunctive 94]
and the cumulative constraint in CHIPv4 [Cosytec JFPL 92].
The Warehouse Location problem [CHIP Warehouse 89] is an example of
using element/3 constraints to express the cost for each customer
as a function of the warehouse responsible for delivering goods to
this customer. With Integer Programming, 0/1 variables would have
to be used. See the original paper for the resulting reduction of
the search space size.
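The modelling idea can be sketched as follows (the cost figures and
the number of warehouses are purely illustrative):
% the delivery cost for one customer is a function of the warehouse
% chosen for him, expressed directly by an element/3 constraint
customer_cost(Warehouse, Cost) :-
    Warehouse :: 1..4,
    element(Warehouse, [10, 180, 30, 70], Cost).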
In the car sequencing problem [CHIP CarSeq 88], classes of cars are
considered instead of individual cars, in order to remove symmetries.
A first step taken in the ECRC handling of the Braghenti problem was
to give up finding a global optimum and to move to heuristic
search [ECRC Braghenti 94].
as opposed to value assignment to date variables.
Thus, it is often a good idea for such problems to have a non-deterministic
variable selection and a deterministic value assignment. In any case,
the allowed backtracking should always be significant.
For scheduling problems, e.g., this may mean having as the basic
choice either `what order to impose on heuristically chosen pairs
of tasks', or `what task to consider to put next', rather than `what
value to try next for the currently considered task' (which most of
the time leads to absolutely irrelevant backtracking).
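The pair-ordering choice, e.g., amounts to branching on a disjunction
of precedence constraints (a minimal sketch in fd syntax; the predicate
name is ours):
% branch on the relative order of two tasks instead of on their dates;
% each branch posts a precedence constraint that propagates on both starts
order_pair(StartA, DurA, StartB, DurB) :-
    ( StartA + DurA #<= StartB      % A before B
    ;
      StartB + DurB #<= StartA      % or B before A
    ).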
Here lies a possible explanation of the difficulties encountered by Dassault
on their planning application. The problem, indeed, features a very
high solution density and was initially specified as an optimization
problem. Though solutions get scarcer as more stringent cost bounds are set
in the course of the B&B optimization, this, unfortunately, cannot
be efficiently exploited by the solver due to the additive structure
of the cost function and the weakness of the underlying constraint
model.
In the MADE system, strategies based on a classical bin-packing heuristic
method, First Fit Decreasing, are used to produce compact schedules,
satisfactory with respect to the resource utilization (soft) constraints
expressed by the end-user.
The Renault constraint spreadsheet paradigm implemented
in their Short-Term Production application [Renault IJCAI 93; Renault
STP 92] is an example of that.
The Dassault assembly line application is an instance of a highly interactive
system aimed at users who are experts in the tasks to be performed
on the assembly line, not in the process of building schedules. Only
high-level and global actions are allowed, like incrementally setting
subsets of the constraints (see above,
Handling Constraint Failure) and modifying the related parameters
to express desired tendencies (e.g., given a solution, locally
modify the maximum manpower available in order to get a smoother solution
at the next run).
A maximum acceptable delay may not exist as a hard limit (increasing
the delay may just have increasing `costs' as a consequence), but
fixing an upper bound helps the search.
The sum of delays in job shop scheduling is both non-linear (one cannot
use the linear solver) and a complex additive cost (a sum of a great
number of elementary costs). This is disastrous when used as a constraint
or an objective function with the FD solver. It may be worth trying
to replace such a criterion by the maximum delay criterion.
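A sketch of the replacement (fd syntax; the domain bound and predicate
names are ours): constrain every delay by a single MaxDelay variable
and use that variable as the cost, so that tightening the cost bound
propagates directly onto each delay:
% minimize the maximum delay instead of the sum of delays
minimize_max_delay(Delays, MaxDelay) :-
    MaxDelay :: 0..1000,            % illustrative upper bound
    bound_all(Delays, MaxDelay),
    minimize(labeling([MaxDelay|Delays]), MaxDelay).
bound_all([], _).
bound_all([D|Ds], Max) :-
    D #<= Max,
    bound_all(Ds, Max).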
The maximum end date in job-shop scheduling has no
particular meaning for the user (new orders will be issued anyway,
unlike in the case of project planning), but, in a B&B optimization
with finite domains, it is a powerful constraint for the search for
compact solutions (which are, as such, satisfactory for the user even
though the end-date indicator itself is not relevant to him).
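A sketch of this use with ECLiPSe's fd library, whose min_max/2 primitive
performs the branch-and-bound; the task lists and the horizon are assumed
to be supplied by the caller:

    :- lib(fd).
    % B&B on the maximum end date End (domain 0..Horizon given by the
    % caller): each improving solution tightens the bound on End,
    % pruning the remaining search towards compact schedules.
    compact_schedule(Starts, Durations, End) :-
        before_end(Starts, Durations, End),
        min_max(labeling(Starts), End).

    before_end([], [], _).
    before_end([S|Ss], [D|Ds], End) :-
        S + D #<= End,
        before_end(Ss, Ds, End).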
It is always worth prototyping in CLP in order to clarify the problem.
This is due to the following features of CLP:
The natural tendency with CLP is to model the problem as a non-deterministic
problem, with resolution being improved by powerful constraint propagation.
This is perfectly suited if the problem is actually hard,
typically if it is NP-complete, but may be quite inadequate if
the problem turns out not to be really hard, e.g. if it admits an
efficient deterministic algorithm.
In such cases, CLP may eventually prove adequate as an implementation
language (though other possibilities are clearly to be considered),
but the right modelling is likely not to be the standard non-deterministic
one. Prototyping should precisely help to define the problem and discover
the specific properties that will determine its real complexity. Flexibility
is required, and one should not stick at any cost to the non-deterministic
modelling of the very first mock-up.
The Dassault scheduling application features both part-grouping and
temporal constraints. Due to quantitative limits and the predominance
of disjunctive constraints, it was impossible to solve the problem
as a whole. The choice that was initially made was to first solve
the part-grouping subproblem, then the temporal one. This proved
computationally feasible, but appeared to be too restrictive. In the
MADE system,
the successor of the first prototype, the part-grouping constraints
are partly pre-solved (partial part aggregation), then a tentative
schedule is produced, then the part-grouping constraints are completely
taken into account and, finally, a definitive schedule is produced
for the definitive batches.
Bull had a manpower assignment application where the problem was presented
as global by the client. But this problem eventually proved to be
composed of two totally decoupled subproblems, concerning two distinct
subsets of the personnel, with distinct activities, performing unrelated
tasks, the only link between them being that they belong to the same
company.
Temporal granularity in scheduling problems is an example: only real
data can show what the required level of temporal precision really
is.
Constraint programs are particularly hard to debug, in that it is
practically impossible to follow constraint propagation step by step
for any non-toy program. The debugger developed within CHIC for the
CHIP compiler [ECRC-Cosytec Debugger 92] showed interesting features,
in terms of extraction and display of relevant information, which
will need to be further explored.
A mistake made by the Dassault team in one of their applications consisted
in dynamically choosing the smallest of a set of already instantiated
values using the variable-choice heuristic `smallest' of CHIPv4,
instead of a static sort! Not only was this definitely inefficient,
it was also simply wrong, for, on instantiated variables, the heuristic
picks integers at random!
In their Time-tabling application, the Charme Object
team perform some pre-processing to detect `standard' causes of inconsistency
in the data.
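Such a check can be as simple as the following hypothetical sketch (the
predicate name and data shape are invented), run on the raw data before
any constraint is posted:

    % A `standard' cause of inconsistency: a teacher whose required
    % hours exceed the available slots makes the instance trivially
    % unsatisfiable; report it before solving starts.
    check_load(Teacher, RequiredHours, AvailableSlots) :-
        ( RequiredHours =< AvailableSlots ->
            true
        ;
            printf("inconsistent load for %w%n", [Teacher]),
            fail
        ).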
The high level of CLP languages makes it possible to build representative
prototypes in a relatively short time. Given this advantage, the temptation
may arise to keep these languages anyway for the definitive implementation
of the whole system. If the system is relatively autonomous, or only
requires database links, that might be the right solution. But if the
system has to be closely integrated into a larger environment, with, e.g.,
other problem-solving tools and/or an information system already having
its established interfacing capabilities, then one should consider
implementing only the constraint kernel in CLP. There
is no technical problem in that, since ECLiPSe or CHIP programs can
be compiled into C functions. Charme Object, on the other hand, is
a C++ library, and CHIPv4 also exists as a C/C++ library, which makes
integration even easier.
The COCA system at Dassault is fully implemented in CHIP. This appears
to be the right choice, since the system is relatively autonomous and
its users are not accustomed to a particular interaction style with
computer systems.
For a manpower assignment problem, Bull have used a priori
filtering of the candidate tuples through rules. According to the
Charme Object team, this may simply be a sign that the constraints
are not powerful enough. In any case, with CLP systems provided in the
form of C++ libraries, like Charme Object, the very notion of preprocessing
tends to disappear, for all treatments are performed using the same
language.
Since the most powerful constraints are often expensive, one should
take care that the information they deduce will actually be used by
other constraints. Thus, achieving a certain uniformity
in the grain of information is a good optimization hint, whenever
this can be controlled by the CLP user (otherwise, this is a built-in
design issue, for CLP language developers).
An early discussion in CHIC was about the usefulness of an extended
element constraint, in particular for scheduling problems. Having
such a constraint in the CHIC platform was then a requirement of Dassault.
Charme already featured an implementation of this constraint. The
problem is the following: when used for scheduling problems, extended
element may deduce information about holes in the domains, but
this will not be used by arithmetic constraints; one should then decide
whether a restricted version of extended element should be
designed, either as a built-in or a user-defined constraint (see below), or
whether, e.g., local saturation should be applied to arithmetic
constraints, so that they really take hole information into consideration.
The practical answer in the Charme implementation was to `simulate'
the holes by daemons: the holes are not propagated when they are created,
but only when one of their bounds is reached by a domain bound. Dassault
eventually applied user-defined conditional constraints to achieve
a similar behaviour on pairwise disjunctive constraints and dropped
their requirement for a powerful extended element.
For instance, X + Y + Z #<= 1 is better
than atmost(1, [X, Y, Z], 1).
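The two formulations side by side, on 0/1 variables (ECLiPSe-style syntax):

    [X, Y, Z] :: 0..1,
    X + Y + Z #<= 1.            % plain arithmetic formulation
    % versus the generalized counting constraint:
    % atmost(1, [X, Y, Z], 1)   % at most one variable takes value 1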
An arg(N, T, V) delayed on N is better than
element(N, L, V)
if the main propagation is from N, when
it gets instantiated, to V.
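A sketch of such delayed access with ECLiPSe suspensions (the predicate
name is invented): the goal simply waits until N is instantiated and then
performs a plain structure lookup, with none of the other propagation
directions of a full element constraint.

    % Wait for N, then fetch the N-th argument of term T into V.
    lookup(N, T, V) :-
        ( integer(N) ->
            arg(N, T, V)
        ;
            suspend(lookup(N, T, V), 3, N->inst)
        ).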
An extremely simple example is the following, with the Finite Domain
solver:
[A, B, C] :: 0..100,
A + B #= 10,
A + B + C #= 11.
is not enough for C = 1 to be deduced, because each constraint propagates
bounds on its own variables in isolation: nothing combines the two sums.
One should write, e.g.
[A, B, AB, C] :: 0..100,
AB #= A + B,
AB #= 10,
AB + C #= 11.
so that the shared variable AB transmits the information between the two
constraints; in this example, of course, the factoring is trivial.
This task is far from easy, for one has to be cautious about correctness
and completeness of the implementation.
10 Application Description Framework
This tentative framework was used in CHIC to ensure a reasonable level
of uniformity of application descriptions by the Partners. The guide-lines
it provides, however, can also be used for a much more general purpose:
that of checking, at various steps of an application development process,
that no important issue has been missed. It roughly follows the plan
of the document. For each section, a list of items is proposed, most
often in the form of questions. When trying to answer these questions,
one should not restrict oneself to successes, but should also report
failures, dead-ends, and points where one had to backtrack.
10.1 Short Problem/Context Description
10.3
10.4 System Specification. Communication Model
[AIS Report 95] Artificial Intelligence Software, CHIC deliverable
report D6.5.3, January 1995.
Jean-Marc David and Chew Tat Leong of Renault have greatly contributed
to this methodology task, not only by their excellent reports, which
are used throughout this document, but also by helping the authors
to clarify the issue of the CLP Methodology's place inside a general
methodology for knowledge-based systems. The authors also wish to
thank the CHIC Partners in general for their constructive criticism
of the first version of this document. This work was also discussed
at a meeting of the APPLAUSE Project, where valuable remarks on its
contents were expressed.