Conversation
Failing tests seem to come from performance issues of the Cumulative constraint (for the failing XCSP test, it seems to come from a greater number of nodes, which is a bit odd, as the new implementation should not induce weaker filtering). To investigate.
h[i].updateLowerBound(0, aCause);
s[i].updateBounds(e[i].getLB() - d[i].getUB(), e[i].getUB() - d[i].getLB(), aCause);
e[i].updateBounds(s[i].getLB() + d[i].getLB(), s[i].getUB() + d[i].getUB(), aCause);
d[i].updateBounds(e[i].getLB() - s[i].getUB(), e[i].getUB() - s[i].getLB(), aCause);
if domains are enumerated, you might need to loop
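A minimal sketch of what that could look like, assuming the same s/d/e/h arrays and aCause as in propStartDurationEndRelation above: repeat the bound updates until a fixpoint is reached, so that a bound landing in a hole of an enumerated domain triggers another round of filtering. This is only one possible way to loop, not necessarily the fix that was committed.

// Sketch: loop the bound updates until no bound changes anymore.
boolean fixpoint;
do {
    int sLB = s[i].getLB(), sUB = s[i].getUB();
    int dLB = d[i].getLB(), dUB = d[i].getUB();
    int eLB = e[i].getLB(), eUB = e[i].getUB();
    s[i].updateBounds(e[i].getLB() - d[i].getUB(), e[i].getUB() - d[i].getLB(), aCause);
    e[i].updateBounds(s[i].getLB() + d[i].getLB(), s[i].getUB() + d[i].getUB(), aCause);
    d[i].updateBounds(e[i].getLB() - s[i].getUB(), e[i].getUB() - s[i].getLB(), aCause);
    fixpoint = sLB == s[i].getLB() && sUB == s[i].getUB()
            && dLB == d[i].getLB() && dUB == d[i].getUB()
            && eLB == e[i].getLB() && eUB == e[i].getUB();
} while (!fixpoint);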
 */
public abstract void filter(IntVar[] s, IntVar[] d, IntVar[] e, IntVar[] h, IntVar capa, ISet tasks, Propagator<IntVar> aCause) throws ContradictionException;

protected void propStartDurationEndRelation(IntVar[] s, IntVar[] d, IntVar[] e, IntVar[] h, Propagator<IntVar> aCause) throws ContradictionException {
What if we don't do that at all and leave it to arithm?
As I said, the idea is to avoid a ping-pong between the arithm constraints and the cumulative one, as the latter can be costly to initiate. But maybe it is OK to allow it.
OK, I found another reason: some filtering algorithms rely on the hypothesis that the task relation holds, especially the one in NRJCumulFilter.
But I still can't really explain why we have the performance issues.
I think the slowdown mainly comes from the fact that, with views, only the modified tasks are updated, whereas with the proposal all of them are updated.
One possible way to avoid iterating over all tasks is to use IVariableMonitor to store modified tasks and only iterate over them. This is, for example, what is done in ActivityBased search.
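A rough sketch of that idea, assuming s/d/e arrays of size nbTasks: a monitor records the index of any task whose variable changed, and the filter then only revisits those indices. The onUpdate signature and the addMonitor call are recalled from the Choco IVariableMonitor API and depend on the version in use, and the toTaskIndex helper is hypothetical, so treat this as an illustration rather than working code.

// Sketch: remember which tasks were touched since the last filtering pass.
final java.util.BitSet touched = new java.util.BitSet(nbTasks);

IVariableMonitor<IntVar> taskMonitor = new IVariableMonitor<IntVar>() {
    @Override
    public void onUpdate(IntVar var, IEventType evt) {
        touched.set(toTaskIndex(var)); // hypothetical mapping from variable to task index
    }
};
for (int i = 0; i < nbTasks; i++) {
    s[i].addMonitor(taskMonitor);
    d[i].addMonitor(taskMonitor);
    e[i].addMonitor(taskMonitor);
}

// In the filtering algorithm, only iterate over the touched tasks:
for (int i = touched.nextSetBit(0); i >= 0; i = touched.nextSetBit(i + 1)) {
    // re-filter task i
}
touched.clear();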
…cy() + loop in CumulFilter.propStartDurationEndRelation() for enumerated domains
 * @since 04/02/2013
 */
-public class Task {
+public class Task implements ICause {
I'm confused: the Task object was initially designed to maintain the relationship between start, duration and end during propagation. Now, if the relation is maintained explicitly by propagators, what is the purpose of Task?
@@ -147,7 +151,15 @@ private void declareMonitor() {
 * @throws ContradictionException thrown if an inconsistency has been detected between start, end and duration
 */
public void ensureBoundConsistency() throws ContradictionException {
Here again, is this method still relevant?
@Test(groups="10s", timeOut=60000)
public void test6(){
    long time = System.currentTimeMillis();
Replace the TaskMonitor by arithm constraints.
Update the code of the Cumulative implementations to propagate the start + duration = end relation as soon as possible.

The only problem of this implementation is that it might slow down the solving process a bit, as the arithm constraints and the cumulative constraint might do a ping-pong (this should not happen if the relation is well propagated inside the cumulative filter implementations, but I could not find where it was not done properly). I have spotted such a slowdown in CumulativeTest.test6(), which now takes 1.5s instead of 0.5s (but no slowdown on the other testing methods).
That said, I tested the new implementations of the cumulative and disjunctive constraints I am working on, and no slowdown has been spotted in these. So it is up to you to wait for this bigger Pull Request, or to accept this quick fix.
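For reference, a minimal sketch of what tying start, duration and end together with a posted constraint (rather than a monitor) can look like at model level. The relation is expressed here with a sum constraint, which is equivalent to the ternary arithm form; the exact factory methods depend on the Choco version, so treat these calls as assumptions, not as the code of this PR.

// Sketch: start + duration = end, maintained by a standard propagator instead of a TaskMonitor.
Model model = new Model();
IntVar start = model.intVar("s", 0, 20);
IntVar duration = model.intVar("d", 1, 5);
IntVar end = model.intVar("e", 0, 25);
model.sum(new IntVar[]{start, duration}, "=", end).post();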