Sequencing to Minimize Maximum Cumulative Cost

I want to give credit to our Interlibrary Loan people who tracked this reference down: it's a Ph.D. thesis that only existed on microfilm. So I got to use a microfilm reader for the first time since elementary school. Those things are very different now.

The problem: Sequencing to Minimize Maximum Cumulative Cost.  This is problem SS7 in the appendix.

The description: Given a set of tasks with a partial order defined on them, a (possibly negative) integer cost c(t) for each task t, and a positive integer K. Can we create a one-processor schedule that obeys the partial order constraints such that for each task t, the total cost of all tasks scheduled up to and including t is K or less?
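To make the decision question concrete, here is a minimal sketch of a schedule checker. The function name, the representation of the partial order as a set of (before, after) pairs, and the convention that the cumulative cost includes the task itself are all my assumptions, not anything from the thesis.

```python
from itertools import accumulate

def within_bound(schedule, cost, precedence, K):
    """Check a one-processor schedule: every pair (u, v) in `precedence`
    means u must come before v, and every cumulative cost (prefix sum of
    the task costs, including the current task) must be at most K.
    Hypothetical helper, not from the original reduction."""
    pos = {t: i for i, t in enumerate(schedule)}
    if any(pos[u] >= pos[v] for u, v in precedence):
        return False
    return max(accumulate(cost[t] for t in schedule)) <= K
```

With negative costs the maximum prefix sum can occur in the middle of the schedule, which is why the checker scans every prefix rather than just the final total.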

Example: Suppose we have the following precedence graph of tasks (the number in the vertex is the cost of the task):

The schedule (d,e,a,c,b) has its last task (task b) finish with a total cost of 12 (the sum of the costs of all of the tasks before it). Notice that if some task costs are negative, it's possible that the last task that is done does not have the maximum cumulative cost. For example, if task c's cost were -4, then the schedule (d,e,a,c,b) would reach its maximum cost of 10 after task a is scheduled.

Reduction: I got this from a Ph.D. thesis by Abdel-Wahab, and in it, he reduces from Register Sufficiency. So we're given an instance of Register Sufficiency: a directed graph G=(V,E) and a bound K. The precedence graph G'=(V',E') that we'll build has 2 copies of every vertex in G. The edge set of G' will have all of the edges in E, and additionally, if (i,j) is an edge in E, then (j',i) is an edge in E'. (That's an edge from the second copy of j, back to i.) Each vertex costs 1 if it was in V, and -1 if it is a second copy. Set K' = K+1.
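The construction above is mechanical, so a short sketch may help. This assumes the description as given; the tuple tag `(v, "copy")` is just my hypothetical way of naming the second copy v'.

```python
def build_instance(V, E, K):
    """Sketch of the reduction: two copies of each vertex, original edges
    kept, plus an edge from j's second copy back to i for each edge (i, j).
    Originals cost +1, second copies cost -1, and the bound becomes K + 1."""
    second = {v: (v, "copy") for v in V}            # stands in for v'
    V2 = list(V) + [second[v] for v in V]
    E2 = list(E) + [(second[j], i) for (i, j) in E]
    cost = {v: 1 for v in V}
    cost.update({second[v]: -1 for v in V})
    return V2, E2, cost, K + 1
```

The back-edges force each original vertex i to stay "live" (its +1 paid but not yet cancelled) until every successor's second copy has been scheduled, mirroring how a register stays occupied until its value is no longer needed.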

If the register sufficiency problem has a solution, then we build a sequence based on it. Thinking in terms of Sethi's "stone game", each time we place a new stone on a vertex, we add that vertex to the end of the sequence. If we pick up a stone from vertex x, we add x' (the second copy) to the end of the sequence. If we're moving a stone from y to x, then that's like removing it from y and then adding it to x. You can prove by induction that the maximum cumulative cost of any task in this sequence is at most K'.
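The forward direction can be sketched as a direct translation of stone-game moves into a schedule. The move representation (tuples tagged "place", "remove", "move") is my assumption about how a stone-game solution would be encoded.

```python
def moves_to_schedule(moves):
    """Sketch of the forward direction: each stone-game move becomes one
    or two scheduled tasks.  ("place", x) schedules x, ("remove", y)
    schedules y's second copy y', and ("move", y, x) is treated as a
    remove from y followed by a place on x (remove first, so the
    cumulative cost never goes up by 2)."""
    schedule = []
    for m in moves:
        if m[0] == "place":
            schedule.append(m[1])
        elif m[0] == "remove":
            schedule.append((m[1], "copy"))      # y', the second copy of y
        else:                                    # ("move", y, x)
            _, y, x = m
            schedule.append((y, "copy"))
            schedule.append(x)
    return schedule
```

Note the ordering choice on a "move": scheduling the -1 copy before the +1 original is what keeps the cumulative cost from briefly exceeding the stone count.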

If the scheduling problem has a solution, then we build a solution to the stone game. For each task in the schedule: if it is in V (the original set of vertices), place a stone on that vertex; if it is a second copy x', pick up the stone from x. Since K' = K+1, this sequence of moves may use K+1 stones, so we need to modify it to use only K. Find a task in the sequence whose cumulative cost is exactly K'. This task must be in V (it was a "place a stone" move, since it took us up to K' stones). It can't be the last task, since the schedule has to end with a task that costs -1 (all of those vertices have no outdegree, while the ones that cost 1 all have an outdegree of at least 1). Now look at the task right after it. It can't be a "place a stone" move, otherwise we'd use K'+1 stones, so the next task has cost -1. Since that move picks up a stone, just move the stone that's about to be removed onto our peak-cost vertex instead of allocating a new stone, and we solve the problem using 1 less stone at that point. If multiple places in the schedule hit a cost of exactly K', we can apply this modification to all of them, giving a program sequence that uses at most K stones.
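The reverse direction, with the peak-merging fix-up, can be sketched the same way. This is a simplified illustration under my own encoding: it assumes the schedule is valid, that second copies are tagged `(y, "copy")`, and it relies on the argument above that the task right after a peak must be a -1 second copy.

```python
def schedule_to_moves(schedule, cost, K2):
    """Sketch of the reverse direction: walk the schedule keeping the
    cumulative cost; a +1 task that would hit the bound K2 = K + 1 is
    merged with the very next task (which, by the argument in the text,
    is a -1 second copy) into a single "move", so at most K stones are
    ever in use.  Hypothetical encoding, not from the thesis."""
    moves, total, i = [], 0, 0
    while i < len(schedule):
        t = schedule[i]
        if cost[t] == 1 and total + 1 == K2:
            y = schedule[i + 1][0]               # next task is y', a second copy
            moves.append(("move", y, t))         # reuse y's stone instead
            i += 2                               # both tasks consumed; total unchanged
        elif cost[t] == 1:
            moves.append(("place", t))
            total += 1
            i += 1
        else:
            moves.append(("remove", t[0]))
            total -= 1
            i += 1
    return moves
```

On the schedule (a, b, a', b') with K' = 2, for instance, the +1 task b would hit the bound, so it gets merged with the following removal of a's stone into a single move, and only one stone is ever on the board.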

Difficulty: 6. The reduction is not hard, but the stone game takes some work to understand, and turning a solution that uses K' stones into one that uses K is a little tricky. I wonder if there's an alternate reduction that has K'=K.
