You might think of it this way - let's take the third case as an example:
f(n) = O(n^(log_b(a) + e)) for e < 0
(The exponent is (log base b of a) + e, not log base b of (a + e).)
What does this mean?
Let's establish one thing first: the whole expression on the right, O(n^(log_b(a))), is in essence the asymptotic behaviour of the T(n) function without taking the f(n) part into account.
This part is not perfectly intuitive, but think about it and you'll see it's correct. (Just calculate some values for f(n) = O(1) and you'll see I'm right, since a nonexistent f(n) is, for all intents and purposes, O(1).)
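To make that f(n) = O(1) remark concrete, here's a quick numerical check (the values a = 4, b = 2 are my own arbitrary choice, not from the theorem): with constant work per call, T(n)/n^(log_b a) should level off at a constant.

```python
# Empirical check: with f(n) = O(1), T(n) = a*T(n/b) + 1 grows like n^(log_b a).
# Here a = 4, b = 2, so log_b(a) = 2 and T(n) / n^2 should flatten out.
import math

def T(n, a=4, b=2):
    if n <= 1:
        return 1
    return a * T(n // b) + 1   # the f(n) part is just 1, i.e. O(1)

for n in [2 ** k for k in range(4, 11)]:
    print(n, T(n) / n ** math.log(4, 2))   # ratio approaches a constant (4/3)
```

The ratio settles near 4/3, so T(n) really is Theta(n^2) = Theta(n^(log_2 4)) here, even though f(n) contributes nothing asymptotically.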
So, given that, we look at the e. What does the e do? We know it's lower than zero thanks to the constraint we put on it, so adding it lowers the asymptotic "class" (as in n^2, log n, etc.) of the expression. In other words, if you can lower the asymptotic class of the aT(n/b) part and make it equal to f(n), that means aT(n/b) is asymptotically bigger than f(n), and we act accordingly.
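Here is one worked instance of that "lower the class by e" idea (the recurrence T(n) = 4T(n/2) + n is my own example, not from the question): log_2(4) = 2, and f(n) = n = n^(2 + e) with e = -1 < 0, so the recursion part should dominate and T(n) should be Theta(n^2).

```python
# T(n) = 4*T(n/2) + n.  Since n = n^(2 + e) with e = -1 < 0,
# the aT(n/b) part dominates and we expect T(n) = Theta(n^2).
def T(n):
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n   # f(n) = n this time

for n in [2 ** k for k in range(4, 11)]:
    print(n, T(n) / n ** 2)    # ratio levels off at a constant (approaches 2)
```

Even though f(n) = n does more work per call than a constant, the negative e tells us it's still asymptotically weaker than the n^2 coming from the recursion, and the numbers agree.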
What this means, and what the master method is all about, is comparing the growth of the recursion part, n^(log_b(a)), against the growth of f(n), and deciding which of the two dominates (or whether they are balanced).
Let's look at the generic form the master method is based upon:
T(n) = aT(n/b) + f(n)
The aT(n/b) part is, in essence, the loop: the thing that decides how many iterations we will have. The f(n) part is the body of the loop: the actual work done. If the right side is asymptotically weaker than the left side, then the right side matters less, and we have some lovely formulas to decide the asymptotic behaviour depending on whether it's weaker, equal, or stronger. We add or subtract the e as I explained above to find out if it's bigger, smaller, or equal.
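The three-way comparison can be sketched in a few lines for the special case f(n) = n^d (a simplifying assumption of mine; the actual theorem handles more general f, and its third case also needs a regularity condition that holds automatically for polynomials):

```python
# Sketch of the master-method comparison for f(n) = n^d:
# compute e = d - log_b(a) and read the answer off the sign of e.
import math

def master_case(a, b, d):
    e = d - math.log(a, b)   # exponent gap between f(n) and n^(log_b a)
    if e < -1e-9:            # e < 0: recursion part dominates
        return f"recursion dominates: T(n) = Theta(n^{math.log(a, b):g})"
    if e > 1e-9:             # e > 0: f(n) dominates
        return f"f dominates: T(n) = Theta(n^{d})"
    return f"balanced: T(n) = Theta(n^{d} log n)"   # e = 0

print(master_case(2, 2, 1))  # e = 0:  balanced, e.g. merge sort -> Theta(n log n)
print(master_case(4, 2, 1))  # e = -1: recursion dominates -> Theta(n^2)
print(master_case(2, 2, 2))  # e = 1:  f dominates -> Theta(n^2)
```

The sign of e is exactly the "lower the class and see if it matches" trick from above, just written out as arithmetic on the exponents.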
It's a bit hard for me to explain these kinds of things in text and not in my native language; I hope it made sense.