Heap's Algorithm
Generating all permutations with minimal swaps
How do you generate every possible arrangement of a set of elements? The naive approach copies and rearranges repeatedly. But there's a beautiful algorithm discovered by B.R. Heap in 1963 that does something remarkable: it generates each new permutation by swapping just two elements.
Starting from the first permutation, Heap's algorithm reaches each later output with exactly one swap. For [A,B,C,D], that means 24 permutations and 23 swaps.
Start Simple: Two Elements
With just two elements, there are only two permutations: AB and BA.
One swap. Done. Now let's add a third element.
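The two-element case is small enough to write out by hand. A minimal sketch (the variable names are mine, for illustration):

```javascript
// The two-element case: output, one swap, output.
const pair = ["A", "B"];
const results = [];

results.push(pair.join(""));                 // "AB"
[pair[0], pair[1]] = [pair[1], pair[0]];     // the one and only swap
results.push(pair.join(""));                 // "BA"

console.log(results); // ["AB", "BA"]
```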
Three Elements: The Pattern Emerges
With three elements, we have 3! = 6 permutations. Watch what Heap's algorithm does:
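The six outputs can be reproduced with a compact version of the algorithm (the full version appears in The Code section below; the recording harness here is my own):

```javascript
// Heap's algorithm, recording each permutation of three elements in order.
function heapPermute(arr, k, out) {
  if (k === 1) { out.push(arr.join("")); return; }
  for (let i = 0; i < k; i++) {
    heapPermute(arr, k - 1, out);
    if (i < k - 1) {
      const j = k % 2 === 0 ? i : 0;              // the parity rule
      [arr[j], arr[k - 1]] = [arr[k - 1], arr[j]]; // one swap
    }
  }
}

const seq = [];
heapPermute(["A", "B", "C"], 3, seq);
console.log(seq); // ["ABC", "BAC", "CAB", "ACB", "BCA", "CBA"]
```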
Notice how the outputs fall into groups: each group keeps one value fixed in the last slot, then one swap prepares the next group.
The Key Insight
Heap's algorithm makes more sense if you think in blocks.
A call to heapPermute(arr, k) emits k blocks of
(k - 1)! permutations each. Inside one block,
arr[k - 1] stays fixed while the recursive call permutes the prefix.
- Run heapPermute(arr, k - 1) to generate all permutations of arr[0..k-2]
- If another block remains, do one swap to move a new element into arr[k - 1]
- Repeat until every element has occupied the last slot once
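This block structure can be checked empirically: record the last element at every output and confirm it changes only between blocks of (k - 1)! permutations. The implementation is a compact sketch of the algorithm; the recording harness and names are my own:

```javascript
// Record arr[k - 1] at every output; within a block of (k - 1)!
// permutations it should never change.
function heapPermute(arr, k, onOutput) {
  if (k === 1) { onOutput(arr); return; }
  for (let i = 0; i < k; i++) {
    heapPermute(arr, k - 1, onOutput);
    if (i < k - 1) {
      const j = k % 2 === 0 ? i : 0;              // the parity rule
      [arr[j], arr[k - 1]] = [arr[k - 1], arr[j]];
    }
  }
}

const lastSlot = [];
heapPermute(["A", "B", "C", "D"], 4, a => lastSlot.push(a[3]));

// 24 outputs → 4 blocks of 6, each with a constant last element.
const blockLeaders = [];
let allConstant = true;
for (let b = 0; b < 4; b++) {
  const block = lastSlot.slice(b * 6, (b + 1) * 6);
  blockLeaders.push(block[0]);
  allConstant = allConstant && block.every(ch => ch === block[0]);
}
console.log(blockLeaders, allConstant); // ["D", "C", "B", "A"] true
```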
For k = 3, each block runs the same recursive subproblem on the prefix:

block (C fixed) → swap(0, 2) → block (B fixed) → swap(0, 2) → block (A fixed)

The recursive call shape never changes. Only the value fixed in the last slot changes between blocks.
The swap does not output a new permutation by itself. It prepares the next block. The only non-obvious question is: which element should move into the last position?
The Parity Rule
The answer depends on parity. After heapPermute(arr, k - 1) returns,
the prefix is left in a different shape for even-sized and odd-sized subproblems,
so the parent needs two different swap rules:
When k is odd
The child call leaves the next candidate at index 0, so use swap(0, k - 1).
When k is even
The child call leaves the next candidate at the loop index i, so use swap(i, k - 1).
These rules are not arbitrary. They are what preserve Heap's invariant: each block starts with a different element fixed in the last slot, and the transition between blocks costs exactly one swap.
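One way to convince yourself the two rules really preserve the invariant is a brute-force check: for small n, the algorithm should produce n! distinct permutations. The check below is my own instrumentation, not part of the algorithm:

```javascript
// Brute-force check: for n = 1..6, Heap's algorithm with the parity
// rule produces n! distinct permutations.
function heapPermute(arr, k, out) {
  if (k === 1) { out.push(arr.join("")); return; }
  for (let i = 0; i < k; i++) {
    heapPermute(arr, k - 1, out);
    if (i < k - 1) {
      const j = k % 2 === 0 ? i : 0;   // even k → i, odd k → 0
      [arr[j], arr[k - 1]] = [arr[k - 1], arr[j]];
    }
  }
}

const factorial = n => (n <= 1 ? 1 : n * factorial(n - 1));

for (let n = 1; n <= 6; n++) {
  const out = [];
  heapPermute([..."ABCDEF".slice(0, n)], n, out);
  console.log(n, out.length, new Set(out).size === factorial(n)); // n, n!, true
}
```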
Why Parity Matters
A useful mental model is that heapPermute(arr, k) is a scheduler for blocks.
Each recursive call emits one full block of (k - 1)! permutations, then the parent
performs a single swap to choose the element that will be fixed at k - 1 in the next block.
For k = 3, the odd rule is easy to see:
ABC, BAC // C fixed at the end
swap(0, 2)
CAB, ACB // B fixed at the end
swap(0, 2)
BCA, CBA // A fixed at the end
For k = 4, each call to heapPermute(arr, 3) emits a six-permutation block.
After each block, the parent uses swap(i, 3), so a different element from positions
0, 1, and 2 gets promoted into the last slot for the next block.
- Odd k: the even-sized child block has already rotated the next candidate into index 0, so the parent always swaps 0 with k - 1.
- Even k: the odd-sized child block leaves the next candidate aligned with the loop index i, so the parent swaps i with k - 1.
Engineer's version: the parity check tells the parent how to interpret the array state returned by the child call.
Four Elements: See It All Together
Now watch the full algorithm on four elements. Each group of six permutations keeps one element fixed at the end, and one swap prepares the next group:
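The full four-element run can be reproduced with a compact version of the algorithm (shown in full in the next section); printing six permutations per line makes the blocks visible:

```javascript
// Print the 24 permutations of [A, B, C, D] in four groups of six;
// each group shares its last letter.
function heapPermute(arr, k, out) {
  if (k === 1) { out.push(arr.join("")); return; }
  for (let i = 0; i < k; i++) {
    heapPermute(arr, k - 1, out);
    if (i < k - 1) {
      const j = k % 2 === 0 ? i : 0;   // the parity rule
      [arr[j], arr[k - 1]] = [arr[k - 1], arr[j]];
    }
  }
}

const perms = [];
heapPermute(["A", "B", "C", "D"], 4, perms);
for (let b = 0; b < 4; b++) {
  console.log(perms.slice(b * 6, (b + 1) * 6).join(" "));
}
// ABCD BACD CABD ACBD BCAD CBAD   <- D fixed
// DBAC BDAC ADBC DABC BADC ABDC   <- C fixed
// ACDB CADB DACB ADCB CDAB DCAB   <- B fixed
// DCBA CDBA BDCA DBCA CBDA BCDA   <- A fixed
```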
The Code
The entire algorithm in just a few lines:
function heapPermute(arr, k = arr.length) {
  if (k === 1) {
    output(arr);
    return;
  }

  for (let i = 0; i < k; i++) {
    heapPermute(arr, k - 1);

    if (i < k - 1) {            // don't swap after last iteration
      if (k % 2 === 0) {
        swap(arr, i, k - 1);    // even k: swap i with the last slot
      } else {
        swap(arr, 0, k - 1);    // odd k: swap 0 with the last slot
      }
    }
  }
}
That's the whole algorithm. The only subtle line is the parity check:
it decides which element should move into slot k - 1 before the
next recursive block starts.
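The function assumes two small helpers, output and swap; a minimal way to wire them up and run it (the helper bodies are mine, not from the original):

```javascript
// Minimal harness: an output that collects results and an in-place swap.
const results = [];
const output = arr => results.push(arr.join(""));
const swap = (arr, i, j) => { [arr[i], arr[j]] = [arr[j], arr[i]]; };

function heapPermute(arr, k = arr.length) {
  if (k === 1) { output(arr); return; }
  for (let i = 0; i < k; i++) {
    heapPermute(arr, k - 1);
    if (i < k - 1) {                      // don't swap after last iteration
      if (k % 2 === 0) swap(arr, i, k - 1);
      else swap(arr, 0, k - 1);
    }
  }
}

heapPermute(["A", "B", "C"]);
console.log(results); // ["ABC", "BAC", "CAB", "ACB", "BCA", "CBA"]
```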
Why This Matters
Heap's algorithm achieves the theoretical minimum: n! - 1 swaps for
n! permutations. After the first output, each later arrangement is reached
by one swap. This makes it:
- Cache-friendly: Only two elements move at a time
- Minimal memory: Works in-place, no copying needed
- Simple to implement: Just recursion + parity check
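The n! - 1 bound is easy to confirm by counting swaps directly; the counters below are my own instrumentation:

```javascript
// Count outputs and swaps: Heap's algorithm performs exactly n! - 1
// swaps while emitting n! permutations.
let outputs = 0;
let swaps = 0;

function heapPermute(arr, k = arr.length) {
  if (k === 1) { outputs++; return; }
  for (let i = 0; i < k; i++) {
    heapPermute(arr, k - 1);
    if (i < k - 1) {
      swaps++;
      const j = k % 2 === 0 ? i : 0;   // the parity rule
      [arr[j], arr[k - 1]] = [arr[k - 1], arr[j]];
    }
  }
}

heapPermute([1, 2, 3, 4, 5]);
console.log(outputs, swaps); // 120 119
```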
The deeper lesson: sometimes the right abstraction makes a complex problem trivial. Heap found that parity—the simplest property of numbers—perfectly captures what's needed to navigate the space of permutations.