# Programs With Formally Recursive Procedures

This is the last of the “Program Optimization” problems, and the next section is the last one on “Miscellaneous” problems, which has 19 problems in it. So, unless some diversion into a problem not in the book happens, we’re on the countdown through the last 20 problems! The end is in sight!

**The problem:** Programs with Formally Recursive Procedures. This is problem PO20 in the Appendix.

**The description:** Given a set of function definitions and a program that makes a sequence of function calls using those definitions (the paper by Winklmann uses Algol, since he wrote it in 1982, but any language that allows recursion should work), are any of the procedures in the program formally recursive? For our purposes, “formally recursive” means that there exists a chain of calls containing a call to a function higher in the chain (not necessarily a parent; any ancestor will do).

My first instinct was to wonder if this problem was actually undecidable. It’s not, because we have some fixed number of procedures d, and so any call chain needs to be checked only to depth d+1: if we get that far, we must have a repeat someplace. If we never get that far, we can back up and check other call chains, eventually checking all of the paths through the call tree exhaustively. The proof that the problem is in NP is in the paper, but it involves representing the call tree as a DAG so we can use one vertex for two equivalent paths. Winklmann says that if you do that the number of vertices is “bounded” by the number of procedures in the program (he doesn’t say what that bound is, though).
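Here’s a minimal sketch of that depth-d+1 search, under the simplifying assumption that each procedure’s possible callees are known up front (no procedures passed as parameters, unlike the Algol programs in the paper); the call-map encoding and the `formally_recursive` name are mine, not Winklmann’s:

```python
# Hypothetical sketch: explore call chains from main, declaring formal
# recursion as soon as a chain repeats one of its own ancestors.
def formally_recursive(calls, start="main"):
    """calls maps each procedure name to the list of procedures it calls."""
    d = len(calls)  # with d procedures, a repeat-free chain has length <= d

    def explore(chain):
        for callee in calls.get(chain[-1], []):
            if callee in chain:        # a call to an ancestor: formally recursive
                return True
            if len(chain) <= d:        # depth d+1 is as far as we ever need to go
                if explore(chain + [callee]):
                    return True
        return False                   # back up and try other call chains

    return explore([start])
```

Because a chain is only extended with procedures not already on it, no chain grows past length d, which is exactly the pigeonhole argument above.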

**Example:** It’s hard to make a small example that’s not obvious. How about this:

```
E(){
  E();
}

D(){
}

C(){
  A();
}

B(){
  D();
}

A(){
  B();
  F
}

main(){
  A();
}
```

If “F” is replaced by a call to C, the program is formally recursive, because the call chain main->A->C->A has a repeat. If we replace that F with another call to B, the program is not formally recursive: even though recursive functions like E exist (and even though B gets called twice in two different chains), there is no call chain that starts at main and hits E (or anything else) more than once.
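Since this example has no procedure parameters, formal recursion amounts to a cycle in the static call graph that is reachable from main. A quick sketch of that check (the dictionary encoding and function names are mine):

```python
# Hypothetical model of the example's call graph, with "F" filled in both ways.
def reachable(calls, start):
    """Everything reachable from start via one or more calls."""
    seen, stack = set(), list(calls.get(start, []))
    while stack:
        f = stack.pop()
        if f not in seen:
            seen.add(f)
            stack.extend(calls.get(f, []))
    return seen

def has_reachable_cycle(calls, start="main"):
    # formally recursive iff some function reachable from main can reach itself
    live = {start} | reachable(calls, start)
    return any(f in reachable(calls, f) for f in live)

base = {"E": ["E"], "D": [], "C": ["A"], "B": ["D"], "main": ["A"]}
with_c = dict(base, A=["B", "C"])  # F replaced by a call to C
with_b = dict(base, A=["B", "B"])  # F replaced by another call to B

print(has_reachable_cycle(with_c))  # True: main->A->C->A repeats A
print(has_reachable_cycle(with_b))  # False: E's self-call is unreachable from main
```

With parameters in the picture (as in the reduction below, where False functions are passed as arguments), the call graph isn’t this easy to read off, which is what makes the general problem interesting.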

**Reduction:** Winklmann uses 3SAT. So we start with a formula B with m clauses and n variables. We’ll write some functions:

`AssignX_{i}(c_{1}..c_{m})`: generates two calls to `AssignX_{i+1}(c_{1}..c_{m})`. The function has one parameter for each clause. The first call simulates setting variable X_{i} to true by passing a “true” value in all parameters that represent clauses where variable X_{i} appears positively. The second call simulates setting variable X_{i} to false by passing a “true” value in all parameters that represent clauses where variable X_{i} appears negatively. In all other cases, the function passes on the c_{j} parameter it was given.

`AssignX_{n}(c_{1}..c_{m})`: once we reach the end of the variables, we have to stop. This function calls the `Check` function we’ll be defining in a minute twice, once reflecting setting variable X_{n} to true, once reflecting setting it to false.

`True_{j}(c_{j+1}..c_{m})`: there is one “true” function for each clause; each one has a parameter for each clause with a larger number. The paper writes this function a little weirdly, but I think a good definition is:

```
True_{j}(c_{j+1}..c_{m}){
  if(c_{j+1} is true)
    True_{j+1}(c_{j+2}..c_{m});
  else
    False_{j+1}(c_{j+2}..c_{m});
}
```

So, as long as the parameters are true, we’ll continue calling down through the true functions to the end. The last True function is what will make us formally recursive:

`True_{m}()`: calls `AssignX_{1}(False, False, ...False)`. Our main program will begin with a call to `AssignX_{1}` with these exact parameters, so if we can get here, we will have found our recursive call.

The False functions end a call chain:

```
False_{j}(c_{j+1}..c_{m}){
  // do nothing
}
```

The Check function’s job is to start the appropriate call to True or False:

```
Check(c_{1}..c_{m}){
  if(c_{1} is true)
    True_{1}(c_{2}..c_{m});
  else
    False_{1}(c_{2}..c_{m});
}
```

Our main program is just a call to `AssignX_{1}(False, False, ...False)`. It’ll go through all of the various `AssignX_{i}` functions trying to assign variables (and thus clauses) to true. Each assignment will be checked, and each check will either hit a False function (meaning a clause wasn’t satisfied, and so we will backtrack and try an alternate assignment) or the final True function, which will generate our recursive call. So this program has a recursive call chain if and only if the formula was satisfiable.
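As a sanity check, here’s a hypothetical Python rendering of the program the reduction would generate for the tiny formula (X_{1} or X_{2}) and (not X_{1} or X_{2}); this formula and the `reached_true_m` flag are mine. Instead of actually making the recursive call, `True_2` just records that it was reached, since that is exactly the point where the chain back to `AssignX_1` would fire:

```python
# Two variables (x1, x2), two clauses: (x1 or x2) and (not x1 or x2).
reached_true_m = False  # stands in for detecting the recursive call chain

def AssignX_1(c1, c2):
    AssignX_2(True, c2)   # x1 = true  satisfies clause 1 (x1 or x2)
    AssignX_2(c1, True)   # x1 = false satisfies clause 2 (not x1 or x2)

def AssignX_2(c1, c2):
    Check(True, True)     # x2 = true satisfies both clauses
    Check(c1, c2)         # x2 = false satisfies neither

def Check(c1, c2):
    True_1(c2) if c1 else False_1(c2)

def True_1(c2):
    True_2() if c2 else False_2()

def False_1(c2): pass     # dead end: clause 1 unsatisfied, backtrack
def False_2():   pass     # dead end: clause 2 unsatisfied

def True_2():
    # in the generated program this calls AssignX_1(False, False),
    # repeating an ancestor and making the program formally recursive
    global reached_true_m
    reached_true_m = True

AssignX_1(False, False)   # the main program
print(reached_true_m)     # True: the formula is satisfiable
```

Swapping in an unsatisfiable formula would route every chain into a False dead end, and the flag would stay false.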

Notice that while the *run* of this program might take exponential time, that’s ok because reductions are concerned with the time it takes to *create* the new problem instance, and there are “just” 2m+n+1 functions to create.

**Difficulty:** 6. This is a neat problem and reduction. I think the “write a backtracking program to try all combinations” idea is pretty gettable. The hard part is the Check and True/False functions at the end. We can’t really use loops, because loops can make the analysis undecidable (you can’t show the problem is in NP by going down d+1 calls if you might have an infinite loop someplace and won’t get through all of those calls). So we have to build a chain of recursive calls instead, with dead ends to force the backtracking.