
Dynamic Programming, Memoization, Fibonacci and more





We will present some important concepts that can help you improve your code. But first, let's start with the basics.

---------------------------------------------------------------------------------------------------------------------

Dynamic Programming

The concept of dynamic programming was created by Richard Bellman in the 1950s, with the idea of breaking a complex problem down into small pieces that could be easily solved. This idea carries over to our world of algorithms, where we can break a computation into small pieces and, sometimes, use recursion to solve the problem.



Recursion

Recursion is a method of solving a computational problem where the solution depends on solutions to smaller instances of the same problem. With this approach, we call the function inside itself many times, with different parameters, and return the result to the original caller at the end. One example of a problem that can be tackled with recursion is the famous Fibonacci sequence.



Fibonacci

The Fibonacci sequence was first described around 200 B.C., and it is defined so that each number is the sum of the two preceding ones:

0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144,...

with F(0) = 0, F(1) = 1, and F(n) = F(n-1) + F(n-2) for n > 1.


Let's solve Fibonacci by applying the concepts of dynamic programming and recursion.
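The original post shows the code as an image, so here is a minimal Python sketch of the straightforward recursive solution it describes (the language and the function name fibonacci are our assumption):

def fibonacci(n):
    # Base cases: fibonacci(0) = 0 and fibonacci(1) = 1
    if n < 2:
        return n
    # Every other value is the sum of the two preceding ones
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(10))  # 55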




Pretty simple code, right? In principle, solving the problem in such an easy and straightforward way sounds like a good idea. However, let's remember where exponential algorithms sit on the Big-O complexity chart.


Each time we run code like the example above, where every call recursively splits the problem into two smaller ones and invokes the same method for each, the number of executions roughly doubles at every level of the call tree.



Long story short, this means the time complexity of our recursive Fibonacci code is exponential, roughly O(2^n). If you need to calculate the Fibonacci of 35 in our example, you will end up calling the fibonacci function more than 29 million times!
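If you want to verify that number, a small counter around the sketch above makes the blow-up visible (the instrumentation is ours, not from the original post, and running it takes a few seconds because of the sheer number of calls):

calls = 0

def fibonacci(n):
    global calls
    calls += 1  # count every invocation of the function
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(35))  # 9227465
print(calls)          # 29860703 calls for a single fibonacci(35)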





OK, so should we just forget about recursive methods and go back to a plain loop, carrying and summing values between a couple of variables? No: we can use the concept of memoization to solve this problem and keep using recursive methods.



Memoization

Memoization is the idea of saving the result of a function call, for a specific input, in an in-memory cache. Then, each time the function is called with the same input, it is just a matter of returning the value from the cache. Let's see an example using our fibonacci function.
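As before, the post's code is an image; here is a minimal Python sketch of a memoized version that uses a plain dictionary as the cache (the calls counter is only there to illustrate the numbers discussed below):

cache = {}
calls = 0

def fibonacci(n):
    global calls
    calls += 1
    # If this input was already computed, answer straight from the cache
    if n in cache:
        return cache[n]
    if n < 2:
        result = n
    else:
        result = fibonacci(n - 1) + fibonacci(n - 2)
    cache[n] = result  # save the result so later calls can reuse it
    return result

print(fibonacci(35))  # 9227465
print(calls)          # 69 calls in total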




With this technique, once fibonacci of a given input has been solved and saved in the cache, every later recursive call that reaches that same input returns straight from the cache instead of being recomputed. We end up with a time complexity of O(n) and, for fibonacci(35), only 69 calls to the method (much better, right?).




There is no silver bullet. So, next time you think about using recursion to solve a problem, take into consideration the number of calls your code will execute to solve it, and consider using memoization (caching) to reuse previously computed results. If, even with a cache, the data is too expensive to keep in memory, consider going back to the idea of using loops and variables (the old and effective way of solving problems) :-)
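For completeness, this is what that "loops and variables" version could look like; it needs no recursion and only keeps two values in memory (again, a sketch of our own, not code from the original post):

def fibonacci_iterative(n):
    # Walk up to n keeping only the last two values of the sequence
    previous, current = 0, 1
    for _ in range(n):
        previous, current = current, previous + current
    return previous

print(fibonacci_iterative(35))  # 9227465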

 

Observations:

There are libraries that you can use to keep working with recursion plus the memoization technique. But it is important to understand that, under the hood, they are doing the same thing we did in our example.
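In Python, for instance, the standard library already ships such a helper: functools.lru_cache caches a function's results keyed by its arguments, which is essentially the hand-rolled cache above (the post does not name a specific library; this is just one example):

from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache: every computed input is remembered
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(35))  # 9227465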

The code used in this blog can be downloaded here.


See you next time.



Fabio Ono
