Re: [eclipse-clp-users] bb_min problem

From: Bogdan Tanasa <g-bogta_at_...107...>
Date: Wed, 25 Aug 2010 09:26:39 +0200
Thank you all!

I was absolutely sure that my program was wrong. :)

-----Original Message-----
From: Joachim Schimpf [mailto:joachim.schimpf_at_...269...] 
Sent: Wednesday, August 25, 2010 3:32 AM
To: eclipse-clp-users_at_lists.sourceforge.net
Subject: Re: [eclipse-clp-users] bb_min problem

Bogdan Tanasa wrote:
> Can you please help with the attached code?
> 
> For some reason the bb_min predicate does not explore the solution space.

First, add some arguments to problem/0, so you can see more easily what's
going on (without having to litter the code with write statements):

  problem(X, Y, Obj) :-
    ...
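
For concreteness, here is a minimal made-up sketch of a program with this
shape (the variables, constraints and objective are placeholders for
illustration only, not the model from your attachment):

    :- lib(ic).
    :- lib(branch_and_bound).

    % hypothetical model: boolean decision variables (a list of lists, as
    % in your program), derived booleans Y, and a small float objective
    problem(X, Y, Obj) :-
        X = [[X1, X2], [X3]],
        [X1, X2, X3] #:: 0..1,
        Y = [Y1, Y2],
        Y #:: 0..1,
        Y1 #>= X1, Y1 #>= X2,          % placeholder linking constraints
        Y2 #>= X3,
        Obj $= 0.07*Y1 + 0.15*Y2,      % placeholder objective
        bb_min(pb_search(X, Y, Obj), Obj, bb_options{strategy:restart}).

    % labelling routine: one search/6 call per sub-list, then the Ys
    pb_search(X, Y, _Obj) :-
        ( foreach(SubList, X) do
            search(SubList, 0, first_fail, indomain, complete, [])
        ),
        search(Y, 0, first_fail, indomain, complete, []).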

Now, temporarily comment out your call to bb_min, and instead call your
search routine directly, i.e.

    ...
    % bb_min(pb_search(X, Y, Obj), Obj, bb_options{strategy:restart}),
    pb_search(X, Y, Obj),
    ...

Now you have a program that should, on backtracking, return _all_ solutions
to your problem.   Call problem(X,Y,Obj) from the top-level, and you get
solutions one by one, each time you ask for _more_:

?- problem(X, Y, Obj).
X = [[0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0], [0, 0], [0, 0], [0]]
Y = [0, 0, 0, 0, 0, 0]
Obj = -2.4424906541753444e-15__6.6613381477509392e-16
There are 10 delayed goals.
Yes (0.01s cpu, solution 1, maybe more)

X = [[0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0], [0, 0], [0, 0], [1]]
Y = [0, 0, 0, 0, 0, 1]
Obj = 0.069999999999997953__0.070000000000000728
There are 12 delayed goals.
Yes (0.02s cpu, solution 2, maybe more)

X = [[0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0], [0, 0], [0, 1], [0]]
Y = [0, 0, 0, 0, 1, 0]
Obj = 0.15065343999999803__0.1506534400000008
There are 12 delayed goals.
Yes (0.02s cpu, solution 3, maybe more)

Etc.

If these solutions look reasonable to you, it means the search is working
(you should also check the leftover _delayed_goals_, but it turns out they
are just the result of floating point inaccuracies).
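
(If you want to look at them, you can ask for the currently suspended
goals at the top-level, e.g. with the delayed_goals/1 builtin:

    ?- problem(X, Y, Obj), delayed_goals(Goals).

which binds Goals to the list of goals that are still waiting.)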

bb_min is simply built on top of this, and finds the solution with the
smallest value of Obj.  Because you were lucky, the first solution found
happens to be the cheapest (-2.4424906541753444e-15__6.6613381477509392e-16,
which means essentially 0).  That's why you see (once you put bb_min back
into the code):

Found a solution with cost -2.4424906541753444e-15__6.6613381477509392e-16
Found no solution with cost -2.4424906541753444e-15 .. -0.99999999999999933
X:[[0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0], [0, 0], [0, 0], [0]]
Y:[0, 0, 0, 0, 0, 0]

So, your program actually works as expected!


There is one additional point when you work with a non-integer cost: the
default improvement step (delta) that bb_min uses is 1.0, i.e. once it has
found a solution with cost c, it will in the next iteration only look for
one with cost c-1 or better.  Any solutions with a cost in between will be
missed.  You should therefore set the delta parameter to a smaller value,
for example

    bb_min(pb_search(X, Y, Obj), Obj, bb_options{delta:0.01})
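
If you want to keep the restart strategy from your original call, the two
options can simply be combined in the same bb_options structure:

    bb_min(pb_search(X, Y, Obj), Obj,
           bb_options{strategy:restart, delta:0.01})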


Matthew Skala wrote:
> You seem to
> want the effect of the foreach loop to be the logical conjunction of all
> its iterations, and I'm not sure that's really what foreach loops do.

That's exactly what do-loops do, so no problem there.


>  I rather think they are more like backtracking loops, with some special
> variables (and only those) saved across each backtrack.

No, do-loops are equivalent to recursions and involve no implicit
backtracking.
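
To illustrate (with a made-up loop body): a do-loop such as

    ( foreach(V, Vars) do V #:: 0..1 )

behaves like an auxiliary recursive predicate along the lines of

    constrain_all([]).
    constrain_all([V|Vs]) :-
        V #:: 0..1,
        constrain_all(Vs).

called with the whole list, i.e. you get the conjunction of all the body
instances, each executed exactly once, with no backtracking between
iterations.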


> Although you're calling search/6, you're only
> calling it on one variable at a time, so it can't meaningfully do variable
> selection.  (Your use of "first fail" is of no use because you're
> overriding it with the one-at-a-time loop.)
> 
> I think you'd be better off building a list of variables and passing that
> into search/6 to have it label them in its own choice of sequence.  The
> list should include ALL the variables that need to be labelled, including
> the objective.

Bogdan had a list of lists, and was calling search/6 on each sub-list
separately.  Although this works and is correct, Matthew's general advice
is absolutely right: it is better and easier to write

    flatten(X, Vars),
    search(Vars, 0, first_fail, indomain, complete, []),

or (if you couldn't be sure that the Y variables always get instantiated
as a consequence of the Xs getting instantiated)

    flatten([X,Y], Vars),
    search(Vars, 0, first_fail, indomain, complete, []),

However, in your particular case you won't see any difference, because the
first_fail heuristic is based on domain size, and therefore has no effect
on boolean variables, whose domain size is uniformly 2.
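
Concretely, your pb_search could then shrink to a single clause along
these lines (a sketch, assuming you want to label both the Xs and the Ys):

    pb_search(X, Y, _Obj) :-
        flatten([X, Y], Vars),        % flatten/2 from lib(lists)
        search(Vars, 0, first_fail, indomain, complete, []).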


-- Joachim





