Heap's Algorithm

Generating all permutations with minimal swaps

How do you generate every possible arrangement of a set of elements? The naive approach copies and rearranges repeatedly. But there's a beautiful algorithm discovered by B.R. Heap in 1963 that does something remarkable: it generates each new permutation by swapping just two elements.

Starting from the first permutation, Heap's algorithm reaches each later output with exactly one swap. For [A,B,C,D], that means 24 permutations and 23 swaps.

Start Simple: Two Elements

With just two elements, there are only two permutations:

[A, B]
swap 0↔1
[B, A]

One swap. Done. Now let's add a third element.

Three Elements: The Pattern Emerges

With three elements, we have 3! = 6 permutations. Watch what Heap's algorithm does:

ABC, BAC   // block 1: C fixed in slot 2
swap(0, 2)
CAB, ACB   // block 2: B fixed in slot 2
swap(0, 2)
BCA, CBA   // block 3: A fixed in slot 2

6 permutations, 5 swaps.

Notice the grouping: each group of two permutations keeps one value fixed in the last slot, then one swap prepares the next group.
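The same six-permutation sequence can be generated and checked in code. This is a minimal sketch; heapPermutations is an illustrative wrapper around the algorithm given in full later in the article:

```javascript
// Collect the permutations of an array in Heap order.
function heapPermutations(arr) {
  const out = [];
  const swap = (a, i, j) => { [a[i], a[j]] = [a[j], a[i]]; };
  function go(k) {
    if (k === 1) { out.push(arr.join("")); return; }
    for (let i = 0; i < k; i++) {
      go(k - 1);
      // even k: swap loop index i with the last slot; odd k: swap index 0
      if (i < k - 1) swap(arr, k % 2 === 0 ? i : 0, k - 1);
    }
  }
  go(arr.length);
  return out;
}

const perms = heapPermutations(["A", "B", "C"]);
console.log(perms.join(" ")); // ABC BAC CAB ACB BCA CBA
```

Each consecutive pair shares the same last letter: C, C, then B, B, then A, A.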

The Key Insight

Heap's algorithm makes more sense if you think in blocks. A call to heapPermute(arr, k) emits k blocks of (k - 1)! permutations each. Inside one block, arr[k - 1] stays fixed while the recursive call permutes the prefix.

  1. Run heapPermute(arr, k - 1) to generate all permutations of arr[0..k-2]
  2. If another block remains, do one swap to move a new element into arr[k - 1]
  3. Repeat until every element has occupied the last slot once
For k = 3, each block runs the same recursive subproblem on the prefix:
Block 1 permute slots 0..1 slot 2 fixed = C
swap(0, 2)
Block 2 permute slots 0..1 slot 2 fixed = B
swap(0, 2)
Block 3 permute slots 0..1 slot 2 fixed = A

The recursive call shape never changes. Only the value fixed in the last slot changes between blocks.
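That invariant is easy to check mechanically: in the output stream for n = 4, every consecutive run of 3! = 6 permutations must end in the same element. A minimal check, where heapPermutations is an illustrative helper built from the algorithm shown in full later:

```javascript
// Verify Heap's block invariant for n = 4: the 24 outputs split into
// 4 blocks of 3! = 6 permutations, each with a fixed last element.
function heapPermutations(arr) {
  const out = [];
  const swap = (a, i, j) => { [a[i], a[j]] = [a[j], a[i]]; };
  function go(k) {
    if (k === 1) { out.push(arr.join("")); return; }
    for (let i = 0; i < k; i++) {
      go(k - 1);
      if (i < k - 1) swap(arr, k % 2 === 0 ? i : 0, k - 1);
    }
  }
  go(arr.length);
  return out;
}

const perms = heapPermutations(["A", "B", "C", "D"]);
const blockSize = 6; // (4 - 1)!
const fixedPerBlock = [];
for (let b = 0; b < perms.length; b += blockSize) {
  const block = perms.slice(b, b + blockSize);
  const last = block[0][3];
  // Every permutation in the block must end with the same element.
  if (!block.every(p => p[3] === last)) throw new Error("invariant broken");
  fixedPerBlock.push(last);
}
console.log(fixedPerBlock.join("")); // DCBA
```

Each of the four elements occupies the last slot for exactly one block.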

The swap does not output a new permutation by itself. It prepares the next block. The only non-obvious question is: which element should move into the last position?

The Parity Rule

The answer depends on parity. After heapPermute(arr, k - 1) returns, the prefix is left in a different shape for even-sized and odd-sized subproblems, so the parent needs two different swap rules:

When k is odd

The child returns with the next candidate at index 0, so use swap(0, k - 1).

When k is even

The child returns with the next candidate aligned to loop index i, so use swap(i, k - 1).

These rules are not arbitrary. They are what preserve Heap's invariant: each block starts with a different element fixed in the last slot, and the transition between blocks costs exactly one swap.

Why Parity Matters

A useful mental model is that heapPermute(arr, k) is a scheduler for blocks. Each recursive call emits one full block of (k - 1)! permutations, then the parent performs a single swap to choose the element that will be fixed at k - 1 in the next block.

For k = 3, the odd rule is easy to see:

ABC, BAC   // C fixed at the end
swap(0, 2)
CAB, ACB   // B fixed at the end
swap(0, 2)
BCA, CBA   // A fixed at the end

For k = 4, each call to heapPermute(arr, 3) emits a six-permutation block. After each block, the parent uses swap(i, 3), so a different element from positions 0, 1, and 2 gets promoted into the last slot for the next block.
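One way to see the even rule concretely is to instrument the top-level swaps for n = 4. This is an illustrative sketch; heapTrace and topLevelSwaps are hypothetical names, and permutation output is elided to focus on the swaps:

```javascript
// Trace only the top-level (k = 4) swaps that separate the
// four six-permutation blocks of Heap's algorithm on [A, B, C, D].
const letters = ["A", "B", "C", "D"];
const topLevelSwaps = [];
const swap = (a, i, j) => { [a[i], a[j]] = [a[j], a[i]]; };

function heapTrace(k) {
  if (k === 1) return; // the full algorithm would emit a permutation here
  for (let i = 0; i < k; i++) {
    heapTrace(k - 1);
    if (i < k - 1) {
      const from = k % 2 === 0 ? i : 0; // even k: loop index; odd k: index 0
      if (k === letters.length) {
        // k = 4 is even: record which element swap(i, 3) promotes into slot 3
        topLevelSwaps.push(`swap(${from}, ${k - 1}) promotes ${letters[from]}`);
      }
      swap(letters, from, k - 1);
    }
  }
}

heapTrace(letters.length);
console.log(topLevelSwaps.join("\n"));
// swap(0, 3) promotes C
// swap(1, 3) promotes B
// swap(2, 3) promotes A
```

A different element moves into the last slot before each new block, exactly as the parity rule promises.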

Engineer's version: the parity check tells the parent how to interpret the array state returned by the child call.

Four Elements: See It All Together

Now watch the full algorithm on four elements. Each group of six permutations keeps one element fixed at the end, and one swap prepares the next group:

Block 1 (D fixed): ABCD, BACD, CABD, ACBD, BCAD, CBAD
swap(0, 3)
Block 2 (C fixed): DBAC, BDAC, ADBC, DABC, BADC, ABDC
swap(1, 3)
Block 3 (B fixed): ACDB, CADB, DACB, ADCB, CDAB, DCAB
swap(2, 3)
Block 4 (A fixed): DCBA, CDBA, BDCA, DBCA, CBDA, BCDA

24 permutations, 23 swaps.

The Code

The entire algorithm in just a few lines:

function heapPermute(arr, k = arr.length) {
    if (k === 1) {
        output(arr);  // emit the current permutation (caller-supplied)
        return;
    }

    for (let i = 0; i < k; i++) {
        heapPermute(arr, k - 1);

        if (i < k - 1) {  // don't swap after the last iteration
            if (k % 2 === 0) {
                swap(arr, i, k - 1);  // even k: rotate through loop index i
            } else {
                swap(arr, 0, k - 1);  // odd k: always index 0
            }
        }
    }
}

function swap(arr, i, j) {
    [arr[i], arr[j]] = [arr[j], arr[i]];
}

That's the whole algorithm. The only subtle line is the parity check: it decides which element should move into slot k - 1 before the next recursive block starts.
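A quick way to exercise it is to wire output up to collect results. A minimal sketch, with swap as the usual destructuring helper:

```javascript
// Drive heapPermute, collecting each emitted permutation, then check
// that we get all 24 distinct arrangements of four elements.
const results = [];
const output = (a) => results.push(a.join(""));
const swap = (a, i, j) => { [a[i], a[j]] = [a[j], a[i]]; };

function heapPermute(arr, k = arr.length) {
  if (k === 1) { output(arr); return; }
  for (let i = 0; i < k; i++) {
    heapPermute(arr, k - 1);
    if (i < k - 1) {
      if (k % 2 === 0) swap(arr, i, k - 1);
      else swap(arr, 0, k - 1);
    }
  }
}

heapPermute(["A", "B", "C", "D"]);
console.log(results.length);        // 24
console.log(new Set(results).size); // 24 — no duplicates
```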

Why This Matters

Heap's algorithm achieves the theoretical minimum: n! - 1 swaps for n! permutations. After the first output, each later arrangement is reached by one swap, which makes it one of the fastest known ways to enumerate all permutations, and a natural fit when the consumer can read the array in place rather than needing a fresh copy each time.
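The minimal-swap claim is easy to confirm by counting. Here countHeap is an illustrative helper that counts both outputs and swaps:

```javascript
// Count permutations emitted and swaps performed for n elements.
// Heap's algorithm should use exactly n! - 1 swaps for n! permutations.
function countHeap(n) {
  const arr = Array.from({ length: n }, (_, i) => i);
  let perms = 0, swaps = 0;
  function go(k) {
    if (k === 1) { perms++; return; }
    for (let i = 0; i < k; i++) {
      go(k - 1);
      if (i < k - 1) {
        const j = k % 2 === 0 ? i : 0;
        [arr[j], arr[k - 1]] = [arr[k - 1], arr[j]];
        swaps++;
      }
    }
  }
  go(n);
  return { perms, swaps };
}

for (let n = 1; n <= 6; n++) {
  const { perms, swaps } = countHeap(n);
  // swaps is always perms - 1: one swap per transition between outputs
  console.log(`n=${n}: ${perms} permutations, ${swaps} swaps`);
}
```

For n = 4 this reports 24 permutations and 23 swaps, matching the trace above.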

The deeper lesson: sometimes the right abstraction makes a complex problem trivial. Heap found that parity—the simplest property of numbers—perfectly captures what's needed to navigate the space of permutations.