Self Balancing Binary Search Trees

May 20, 2021

Operations on binary search trees are faster when the tree is shorter. The more nodes we can pack in each level of the tree, the faster our searches and insertions can run. How do we ensure our trees are as densely filled as possible, so that we can quickly operate on them?

The self-balancing binary search tree was invented for this purpose. The key idea is that you build a binary search tree with a series of insertion operations. If we can think up a clever way to insert nodes, we can always ensure that our tree is as dense as possible.

The AVL tree was the first to use one of these clever insertion schemes. Let's look at how it could work. If we start with a single node, and insert a second one, the tree would look like the following:

There's a root, and a single child. This binary search tree is as dense as possible. We know this because we couldn't rearrange any nodes to make the tree shorter. The first layer can only have a single node, the second can have a maximum of two. If we moved the 2 node around, it wouldn't change anything.

What if we inserted another node?

Now there's a problem. Our tree is just a straight line of nodes. A linked list. We have 3 slots for a height two tree, but we didn't fill all of them. Instead we increased the height of the tree, which is costly for performance.

We could take those same nodes and re-arrange them like so:

All we've done is move the 1 down and we have a fundamentally better tree. Not a whole lot of work for a nice performance boost.

This is the key insight for creating a self balancing binary search tree. If a tree is optimal, you will not be able to find any of these runs of three nodes. So if we want to create a clever insertion algorithm, it should be looking to fix these runs of three nodes.

And we already know how to fix those runs. Well, almost. We have one more case to cover.

Fixing this one isn't so easy. Before, we just dropped the 1 down. Now we need a slightly more complicated fix: a two step process.

First, we switch the 3 and the 2.

And look! It's the same case as before! So we can do the same thing, and just move the 1 down.

Hopefully those ideas make the problem clear. If we ever have a tree that isn't as dense as possible, it has a run of three nodes like that. As an exercise to sharpen the idea, see if you can spot the problem in the following tree:

Did you spot the trick? The 2 -> 1 -> 0 run isn't a problem at all. It is completely fine. Let's pick just that part out.

Because that 3 node is there, this is actually a perfectly fine tree. We can't arrange just these nodes to create a tree that is any denser than this one. The real problem was the 3 -> 4 -> 5 run.

Because that example was tricky to spot, it's worth creating a stronger way to identify which runs of three nodes are problems and which aren't.

Runs of three nodes are just a proxy for the real problem, which is that the tree is unbalanced. One branch is much taller than the other, which means that we've wasted some space. If all the branches are the same height, or pretty close to it (within 1 level), we know that the tree is perfectly dense.

For example, take another look at the tricky tree. The left branch has only two nodes, and the right branch has three. If we want a maximally dense tree, we would rather have shorter branches, so we should break up the longer branch.

How do we know where to make the change? We consider which node begins our three node run. And we can figure that out because the node that begins a three node run has one branch that's taller than the other.

This is the problematic three node run from the tricky tree. Because the right branch of the 3 node has two nodes, and the left branch has 0, we know that the tree is unbalanced. To express this mathematically, we subtract the height of the left branch from the height of the right branch. In this case, 3 has a balance factor of 2, because 2 - 0 = 2.

If the balance factor of a tree is 0, we are quite happy with that. That means that both branches have the same height. If the balance factor is 1, we're okay with it, because there is a two node run somewhere, but no three node runs. Let's consider the trick branch from earlier.

The left branch has height two, the right branch has height 1, so our balance factor is -1 (1 - 2 = -1), and because it's a 1, we're okay. The negative sign means that the tree is heavier on the left. This is an arbitrary decision, but it's important to make a decision one way or the other so that you can use that extra information in your insertion algorithm later.

If that 3 node wasn't there, we would have a similar case as earlier, and would have a balance factor of -2, a problematic three node run, and we would need to fix it.

So now we know how to algorithmically spot these three node runs. If any node has a balance factor of 2 or -2, we know it begins a three node run, and we have work to do.
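
To make that concrete, here's a minimal sketch of the balance factor bookkeeping in Python. The Node, height, and balance_factor names are just ones I've picked for this sketch, and it recomputes heights from scratch every time instead of caching them, so treat it as an illustration of the idea rather than a real implementation.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def height(node):
    # An empty branch has height 0, a single node has height 1.
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    # Right height minus left height: positive means right-heavy,
    # negative means left-heavy, matching the convention above.
    return height(node.right) - height(node.left)

# The problematic run from the tricky tree: 3 -> 4 -> 5.
run = Node(3, right=Node(4, right=Node(5)))
print(balance_factor(run))     # 2, so this node begins a three node run

# The branch that only looked like a problem: 1 -> 0 on the left, 3 on the right.
branch = Node(2, left=Node(1, left=Node(0)), right=Node(3))
print(balance_factor(branch))  # -1, heavier on the left, but still fine
```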

So once we've spotted them, how do we fix them? There are four cases for a three node run. Let's look at all of them.

In each of these, the root's balance factor is either 2 or -2. And we actually want all of them to end up in the same configuration:

Each of them should end up like that. The middle number is the root, largest number to the right, smallest to the left. This is the only way to arrange these 3 nodes in the densest way. So now the question is: how do we wrangle each of those three node runs into that shape?

Well, one idea is to just build 3 new nodes in that arrangement and slap them into the tree. We find a node with a balance factor of 2 or -2, take its value and the other two values in the run, make a new root node from the middle value, and give it two children built from the lowest and highest values.

Creating new nodes turns out not to be very helpful, though. Let's look at yet another tricky tree.

Spot the problem? This tree is unbalanced: the right branch has a height of 3 and the left has a height of 1. The 1 node has a balance factor of 2, so we need to fix it, but this isn't one of the traditional three node runs we covered.

But, there is a sneaky three node run in there. Can you spot it? Here's a hint: if we were to make the densest possible tree out of these nodes, which node has to be the root?

There are two options. You could make 2 the root, and the tree looks like this:

Or you could make 3 the root, and it would look like:

Take a second and think about how you could transform the initial tree into either of those two. Which is easier?

If we make 3 the root, we save ourselves a whole lot of steps. We made two changes: we made 3 the parent of 1, and made 1 the parent of 2. 1 was already the parent of 0, and 4 was already the parent of 5, so we made very few changes.

If we made 2 the root, it would require us to do extra shuffling, so let's make 3 the root.

If we make 3 the root, it becomes nice and clear what happened. We had a three node run of 1 -> 3 -> 4, and we contorted it just right so that the tree turned out nicely. (And we chose 1 -> 3 -> 4 by following where the tree was unbalanced. 1 has a positive balance factor, and 3 has a positive balance factor, so we went right twice from the root.)

The three node run we contorted was in the shape of one of the cases we considered earlier.

And to do that, we contorted this by dropping the 1 down. Take another look at the original tree:

Following the example, our goal is to move the 3 up to become the root, and move the 1 down. But we can't move the 1 down, because there is a 2 sitting in the spot we are trying to move the 1 into. In this case, we perform a handoff. We give the left child of 3 (which is the 2) to the 1 node, and it becomes 1's right child. In other words, we inserted that 2 node into the 1 node, which is an important generalization in case the 1 node already has children of its own.
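
Here's that handoff written out as a small sketch, reusing the Node class from the earlier sketch. The rotate_left name is mine; the interesting part is the single line where the 2 gets handed across.

```python
def rotate_left(old_root):
    new_root = old_root.right     # 3 moves up to become the new root
    orphan = new_root.left        # 2, the node sitting where 1 needs to go
    new_root.left = old_root      # 1 drops down to become 3's left child
    old_root.right = orphan       # the handoff: 2 becomes 1's right child
    return new_root

# The latest tricky tree: 1 at the root, with the 1 -> 3 -> 4 run heading down the right side.
tree = Node(1, left=Node(0),
               right=Node(3, left=Node(2),
                             right=Node(4, right=Node(5))))
tree = rotate_left(tree)
print(tree.value, tree.left.value, tree.right.value)  # 3 1 4
```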

This reflects a general pattern. There are 4 possible cases for a tree being unbalanced, which we enumerated earlier, and there is one way to fix each of them. In this case we moved the middle node up and the left node down. Each of those 4 cases has its own special operation that will fix it, and we call those operations rotations. And we know that whenever we insert a node into an already balanced tree, it can only change any balance factor by at most 1. A balanced tree only has balance factors of 0, 1, or -1, so there's no way an insertion can create any case other than one of the 4 we've already figured out how to solve.
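
If you want to see all four cases in one place, here's a rough sketch of how that dispatch might look, reusing the Node, height, balance_factor, and rotate_left helpers from the earlier sketches. rotate_right is just the mirror image, and the names are still mine, not from any particular library.

```python
def rotate_right(old_root):
    # The mirror image of rotate_left: the left child moves up,
    # the old root drops down, and the orphan is handed across.
    new_root = old_root.left
    orphan = new_root.right
    new_root.right = old_root
    old_root.left = orphan
    return new_root

def rebalance(node):
    bf = balance_factor(node)
    if bf == 2:                                    # right side is too tall
        if balance_factor(node.right) < 0:         # the run has a kink in it,
            node.right = rotate_right(node.right)  # so straighten it out first
        return rotate_left(node)                   # then one left rotation fixes it
    if bf == -2:                                   # left side is too tall, mirror image
        if balance_factor(node.left) > 0:
            node.left = rotate_left(node.left)
        return rotate_right(node)
    return node                                    # balance factor 0, 1, or -1: leave it alone
```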

So how do we know that this is actually going to work? When we initially insert an element, the tree can unbalance. If we have only 3 nodes, and they are unbalanced, we know for certain that the rotations will balance the tree, because we defined those operations to be whatever fixes them.

If we have more than 3 nodes, we know that unbalancing a tree can only happen in a three node run, so if those nodes are nice and don't have any children, then it's basically the same as one of those four base cases, and we're good.

If the nodes in the run have children, like in the tricky case we just went over, then we have some problems, but those can be solved too! If we just pop the problem children off, and insert them back into the tree, it has the same effect as handing it off, just like we did earlier.
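
And to close the loop, here's a sketch of how an insertion might rebalance on the way back up, built from the helpers in the earlier sketches. Same caveats as before: the names and structure are made up for this post, and heights are recomputed from scratch instead of cached, so it's an illustration of the idea rather than a production AVL tree.

```python
def insert(node, value):
    # Walk down to find the spot, then rebalance every node on the way back up.
    if node is None:
        return Node(value)
    if value < node.value:
        node.left = insert(node.left, value)
    else:
        node.right = insert(node.right, value)
    return rebalance(node)

# Inserting in sorted order is the worst case for a plain binary search tree,
# but with rebalancing the result never degrades into a straight line.
root = None
for value in [0, 1, 2, 3, 4, 5]:
    root = insert(root, value)
print(root.value)  # 3, not 0: the tree stayed short
```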