Monthly Archives: January 2018

Knapsack

It’s kind of amazing that I’ve gotten this far without needing such a classic problem.  But I was writing up the next post and needed to link to my Knapsack article, and it wasn’t there, so…

The problem: Knapsack.  (More technically, “0/1 Knapsack”).  This is problem MP9 in the appendix.

The description: Given a set U of items, where each item u in U has a profit p(u) and a weight w(u) (G&J call these “value” and “size”), and positive integers B and K.  Can we create a subset U’ of U that has total profit at least K, but total weight at most B?

Example: Knapsack is one of my all-time favorite problems, which probably says something about me.  I inflict it on my students at all levels of the curriculum: at the low levels it’s good motivation for introducing parallel arrays and building simple classes, the fractional version is a good example of a greedy algorithm, and the 0/1 version is a good example of where greedy fails.  It also fascinates me that a problem with infinitely many candidate solutions (Fractional Knapsack, where you can take any proportion of an item) is easily solvable, while a problem with fewer solutions to consider (0/1 Knapsack, where there are “just” 2 choices for each item) is intractable.
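
To make the contrast concrete, here is a minimal sketch of the greedy algorithm for Fractional Knapsack: sort items by profit/weight ratio and take as much of each item as fits.  The function name and the (profit, weight) pair representation are my own choices, not anything from G&J.

```python
def fractional_knapsack(items, capacity):
    """Greedy by profit/weight ratio; this is optimal for the *fractional*
    version, where any proportion of an item may be taken.

    items: list of (profit, weight) pairs.
    Returns the maximum achievable total profit.
    """
    total = 0.0
    # Consider items in decreasing order of profit per unit weight.
    for profit, weight in sorted(items, key=lambda pw: pw[0] / pw[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole item, or whatever still fits
        total += profit * (take / weight)   # proportional profit
        capacity -= take
    return total
```

On the counterexample table below (profits 5, 3, 3; weights 4, 3, 3; B=6), this returns 7.0: all of item 1 plus two-thirds of item 2.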

Anyway, here is my classic “Your Greedy Approach Won’t Work” example:

Item  Profit  Weight
  1      5       4
  2      3       3
  3      3       3

If B=6, the best option is to choose items 2 and 3, for a profit of 6.  But greedily picking items by profit/weight ratio (which works for the Fractional Knapsack problem) will choose item 1 first, leaving no room for anything else, and we end with a profit of only 5.
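
The failure is easy to demonstrate by running ratio-greedy next to the standard 0/1 dynamic program on the table above.  This is just a sketch in my own notation; the DP is the textbook O(n·B) table, not anything specific to G&J.

```python
def greedy_01(items, capacity):
    """Ratio-greedy applied to 0/1 knapsack: take each whole item, best
    profit/weight ratio first, if it still fits.  Can be suboptimal."""
    total = 0
    for profit, weight in sorted(items, key=lambda pw: pw[0] / pw[1], reverse=True):
        if weight <= capacity:
            total += profit
            capacity -= weight
    return total

def dp_01(items, capacity):
    """Standard dynamic program: best[b] = max profit using weight <= b."""
    best = [0] * (capacity + 1)
    for profit, weight in items:
        # Iterate weights downward so each item is used at most once.
        for b in range(capacity, weight - 1, -1):
            best[b] = max(best[b], best[b - weight] + profit)
    return best[capacity]

items = [(5, 4), (3, 3), (3, 3)]  # (profit, weight) rows from the table above
```

Here `greedy_01(items, 6)` grabs item 1 and gets stuck at 5, while `dp_01(items, 6)` finds the optimal 6 (items 2 and 3).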

Reduction: G&J reference Karp’s paper, and this is the classic reduction you see in most books that cover this problem.  We go from Subset Sum: we’re given a set S and an integer K.  We create a set U with the same number of elements, making the profit and weight of each item equal to each other, and equal to the size of the corresponding element in S.  We set K’=B=K.

If the original set had a subset summing to K, then taking those elements will make us a profit of K’ and a weight of B.

If we have a Knapsack solution with profit K’ or more, then since the profit of each item equals its weight, the only way for the total weight to not exceed B is for the total profit to be exactly K’.  So taking the corresponding items in S will get a sum equal to exactly K.
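
The whole reduction is a few lines of code.  Here is a sketch of the instance transformation, paired with a brute-force decision procedure (fine for tiny instances) so the equivalence can be checked end to end; the function names are mine.

```python
from itertools import combinations

def subset_sum_to_knapsack(S, K):
    """The reduction described above: each element s of S becomes an item
    with profit = weight = s; the profit target K' and the weight bound B
    are both set to K.  Returns (items, K', B)."""
    items = [(s, s) for s in S]
    return items, K, K

def knapsack_decision(items, target_profit, weight_bound):
    """Brute-force 0/1 knapsack decision: is there a subset with total
    profit >= target_profit and total weight <= weight_bound?"""
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            if (sum(p for p, w in combo) >= target_profit
                    and sum(w for p, w in combo) <= weight_bound):
                return True
    return False
```

With S = [3, 5, 7, 11]: the target K=12 is reachable ({5, 7}), so the produced Knapsack instance is a yes-instance; K=4 has no subset summing to it, and the Knapsack instance correctly comes back no.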

Difficulty: 3.  I use this as an easy homework or test question all the time.

Sequencing to Minimize Weighted Tardiness

Happy New Year, everyone!  I just checked ahead in the appendix, and at my current pace of about 1 problem a week, we should get through appendices A5 (Sequencing and Scheduling), A6 (Mathematical Programming), and possibly A7 (Algebra and Number Theory) this year.  The appendix goes through A12, so it looks like I have plenty of work ahead of me.

The problem: Sequencing to Minimize Weighted Tardiness.  This is problem SS5 in the appendix.

The description: Given a set T of tasks, where each task t has a length l(t), a weight w(t) (which is really a penalty for finishing a task late), and a deadline d(t), and a positive integer K.  Is there a one-processor schedule for all tasks in T, where each task is penalized by w(t) for each time unit it finishes beyond its deadline, such that the total penalty is K or less?

Example: Suppose I have 3 tasks:

Task  Length  Deadline  Weight
  1      3       3        100
  2      1       4          3
  3      2       4          1

If task 1 is not going to miss its deadline, it needs to start right away.  Since the weight penalty for missing that deadline would be pretty disastrous, we should do that.

At time 3, when task 1 is done, we have a choice: Schedule task 2 (which finishes on time) and have task 3 finish 2 time units late, or schedule task 3, have it finish just 1 time unit late, and schedule task 2 afterward.  With these weights, scheduling task 2 first gives a penalty of 2 (task 3 finishes 2 time units late, at a weight of 1, and 1×2=2).  The other plan would have a penalty of 7 (task 3 finishes 1 unit late for a penalty of 1, and task 2 finishes 2 units late for a penalty of 3×2=6).

If the weight of task 2 were 1 and the weight of task 3 were 5, then scheduling task 2 first gives a penalty of 10 (5×2 for task 3), but scheduling task 3 first gives a penalty of 7 (5×1 for task 3, plus 1×2 for task 2).
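
The penalty arithmetic above is easy to mechanize: walk the schedule, track the clock, and charge each task its weight times its tardiness.  A minimal sketch (task data from the example table; function name is mine):

```python
def weighted_tardiness(order, lengths, deadlines, weights):
    """Total weighted-tardiness penalty of running the tasks in the given
    order on one processor.  order: task indices; each task pays
    weight * (time units it finishes past its deadline)."""
    time = 0
    penalty = 0
    for t in order:
        time += lengths[t]
        penalty += weights[t] * max(0, time - deadlines[t])
    return penalty

# The example instance (tasks indexed 0, 1, 2).
lengths   = [3, 1, 2]
deadlines = [3, 4, 4]
weights   = [100, 3, 1]
```

With the original weights, order [0, 1, 2] costs 2 and order [0, 2, 1] costs 7; swapping in weights [100, 1, 5] flips the comparison to 10 versus 7, matching the discussion above.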

Reduction: G&J reference a paper by Lawler, which itself references G&J.  Weird.   Anyway, they reduce from 3-Partition.  Given a 3-Partition instance (a set of 3n elements and a bound B, where each element is strictly between B/4 and B/2 and the sum of all elements is n*B), we create 4n jobs.  3n of them will be “A” jobs, each one corresponding to an element in the 3-Partition instance.  We will also have n “X” jobs.

Each “A” job Ai corresponds to an element ai in the 3-Partition instance.  These jobs have weight and length B+ai and deadline 0.

The X jobs all have processing time 16B²+n(n+1)/2 + 1.  Call this number L.  The X jobs have weight (L+4B)*(4B)*n(n+1)/2 + 1.  Call this number W.  Each X job Xi has deadline iL+(i-1)*4B.  Notice that these X jobs don’t depend on the values in the partition instance at all (they do depend on the size).

Our K value is W-1.

If the original set has a partition, then we can make a schedule of: X1 then the A jobs corresponding to a 3 element partition, then X2, then another set of 3 A jobs corresponding to another 3 element partition, and so on down.  Since we know the  3 elements in a partition sum to exactly B, we know that the processing time of each set of 3 A jobs is exactly 4B.  So each X job will finish at time i*L+(i-1)*4B, and not be tardy.  The A jobs all have deadline 0 and will all be tardy and will pay penalties related to their completion times, but that will sum to less than K.
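
We can sanity-check this direction on a toy instance.  The L, W, and deadline formulas below are my reading of the numbers in the text; the instance (n=2, B=6, all elements equal to 2) is my own, chosen because its partition is obvious.  The check confirms that interleaving X jobs with triples of A jobs makes every X job finish exactly at its deadline.

```python
def reduction_parameters(a, B):
    """Job parameters from the reduction as described above.
    a: the 3n elements of the 3-Partition instance."""
    n = len(a) // 3
    L = 16 * B * B + n * (n + 1) // 2 + 1               # X-job length
    W = (L + 4 * B) * (4 * B) * n * (n + 1) // 2 + 1    # X-job weight (> K = W-1)
    x_deadlines = [i * L + (i - 1) * 4 * B for i in range(1, n + 1)]
    a_lengths = [B + ai for ai in a]                    # A-job length = weight = B+ai
    return L, W, x_deadlines, a_lengths

# Toy instance: n = 2, B = 6, partitioned as (2,2,2) and (2,2,2).
a, B = [2, 2, 2, 2, 2, 2], 6
L, W, x_deadlines, a_lengths = reduction_parameters(a, B)

# Schedule X1, a triple of A jobs, X2, the other triple, recording when
# each X job finishes.  Each triple costs 3B + B = 4B time units.
time, x_finish = 0, []
for i in range(2):
    time += L
    x_finish.append(time)
    time += sum(a_lengths[3 * i: 3 * i + 3])
```

After the loop, `x_finish` equals `x_deadlines` exactly: no X job is tardy, so the only penalties come from the A jobs, as the argument requires.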

In the other direction, suppose that we have a legal schedule.  Since all of the X jobs have weight > K, they all have to be scheduled before their deadlines.  Then we can talk about the weights and completion times that the A jobs need to have to fit around these X jobs.  It’s a lot of algebra to make it all work out, but the math isn’t too hard.

Difficulty: 7.  While the algebra is followable, the idea that you’d need to come up with these numbers in the first place is a hard one – I know they’re picked because they “work”, but I feel like there must be an easier way.