On 02/10/2013 00:16, mskala_at_...206... wrote:
> I have a constraint program with 25 main variables (and several hundred
> others that depend on them). I would like to make a list of all vectors
> of values for 9 of the variables such that at least one solution exists
> for all the variables; but I don't care about the satisfying values of
> the other 16, nor how many solutions exist for them, only the fact that
> some satisfying assignment exists given the first 9. I expect the list
> of results to have maybe a few thousand entries, but that's only a
> guesstimate; the ones I've generated so far vary wildly in length, and
> how long they turn out to be is one of my research questions. It takes
> a few seconds to a minute to solve each constraint problem (also varying
> wildly), so I'm looking at a few CPU-hours to CPU-days to generate each
> list. That's acceptable, but more speed is always welcome.
>
> What I've been doing is this:
>
> * Search for any solution to the whole problem and capture the 9
>   variables I care about; call the vector of them S.
> * Recursively search for solutions with the constraint that the 9
>   variables, as a vector, are lexicographically less than S.
> * Return S on backtracking through this point.
> * Recursively search for solutions with the constraint that the 9
>   variables, as a vector, are lexicographically greater than S.
>
> That gives me the whole list of vectors of my 9 variables for which
> satisfying assignments to the entire problem exist; in lexicographic
> order, even. The main point is NOT to do more than one search through
> all assignments of the remaining variables once we have one to return.
> And it works pretty well. In particular, it works better than searching
> just the 9 variables first and then searching the rest of the problem
> inside once/1, because that approach messes up the variable selection
> heuristic in the search. But it still seems like I may be doing extra
> work by searching from the top of the tree on every recursive call,
> even with the lexicographic constraint pruning those trees. It seems
> like there ought to be something possible akin to what the
> branch-and-bound library does, restarting searches in the middle as
> solutions are accumulated.
>
> So my question is: is there a known clever way to do this sort of thing
> efficiently?

Hi Matthew,

your approach is similar to what branch-and-bound (with the 'restart'
strategy) does, and also similar to the topological branch-and-bound
algorithm in Propia. Both restart the goal with additional constraints
after every new solution.

Beyond that, library(branch_and_bound) implements the 'continue'
strategy, which does _not_ restart the search, and is pretty close to
what you envisage. It uses an undocumented ECLiPSe feature called
fail-events to dynamically impose newly learned (bound) constraints
during the search process. It should be possible to use the same
technique in your case. Please have a look at the source of
bb_continue/8 in library branch_and_bound.pl. I hope I have commented
it sufficiently; otherwise please contact me.

Cheers,
Joachim

Received on Wed Oct 02 2013 - 13:48:31 CEST
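For concreteness, here is a minimal, self-contained sketch (in ECLiPSe with
lib(ic)) of the splitting scheme described in the question above. The
model/2 predicate is a made-up stand-in for the real 25-variable problem,
lex_lt/2 is a hand-rolled reified decomposition rather than a library
constraint, and plain labeling/1 stands in for whatever search routine and
variable-selection heuristic the real program uses.

:- lib(ic).
:- lib(lists).

% Hypothetical stand-in for the real model: model(Keys, Others) posts the
% constraints over the projected variables (Keys) and the remaining ones
% (Others).  The real 9 + 16 variable model would go here instead.
model(Keys, Others) :-
    Keys = [A, B, C],
    Others = [X, Y],
    Keys #:: 1..4,
    Others #:: 1..6,
    A + B #=< C + 2,
    X #= A * B,
    Y #\= C.

% lex_lt(Xs, Ys): list Xs is lexicographically strictly less than list Ys
% (same length).  Simple reified decomposition, no global propagator.
lex_lt(Xs, Ys) :-
    lex_lt_bool(Xs, Ys, B),
    B #= 1.

lex_lt_bool([], [], 0).
lex_lt_bool([X|Xs], [Y|Ys], B) :-
    lex_lt_bool(Xs, Ys, BRest),
    % strictly less at this position, or equal here and less further on
    B #= (X #< Y) + (X #= Y) * BRest.

% solve_between(Lo, Hi, KeyVals): find one solution of the whole problem
% whose key vector lies strictly between the optional lexicographic bounds
% Lo and Hi (each either 'none' or bound(GroundList)).
solve_between(Lo, Hi, KeyVals) :-
    model(Keys, Others),
    ( Lo = bound(L) -> lex_lt(L, Keys) ; true ),
    ( Hi = bound(H) -> lex_lt(Keys, H) ; true ),
    append(Keys, Others, All),
    once(labeling(All)),        % any one satisfying assignment will do
    KeyVals = Keys.

% project_all(Lo, Hi, KeyVecs): all key vectors strictly between Lo and Hi
% for which the full problem is satisfiable, in lexicographic order.
project_all(Lo, Hi, KeyVecs) :-
    ( solve_between(Lo, Hi, S) ->
        project_all(Lo, bound(S), Left),        % everything below S
        project_all(bound(S), Hi, Right),       % everything above S
        append(Left, [S|Right], KeyVecs)
    ;
        KeyVecs = []
    ).

% Top-level query:
%   ?- project_all(none, none, KeyVecs).

Each call to solve_between/3 rebuilds the model and restarts search from
the top, which is exactly the redundancy the question asks about; the
'continue' strategy discussed in the reply avoids that restart.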
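For comparison, this is roughly how the two bb_min/3 strategies mentioned
in the reply are selected. The cost model below is a toy invented for
illustration; only bb_min/3 and the strategy field of the bb_options
struct are the actual library interface.

:- lib(ic).
:- lib(branch_and_bound).

% strategy:restart re-runs the search goal from the top after each
% improving solution (similar in spirit to the recursive re-search in the
% question); strategy:continue keeps the current search going and imposes
% the newly learned cost bound on the fly, which is what bb_continue/8 in
% branch_and_bound.pl implements using fail events.
bb_demo(Strategy, Xs, Cost) :-
    Xs = [A, B, C],
    Xs #:: 1..10,
    A + B #>= C + 3,
    Cost #= A + 2*B + 3*C,
    bb_min(labeling(Xs), Cost, bb_options{strategy: Strategy}).

% Example queries:
%   ?- bb_demo(restart, Xs, Cost).
%   ?- bb_demo(continue, Xs, Cost).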