Chapter 1: Introduction – 5. Applications in algorithm analysis

Tram Ho

From the discussion in the previous article (for all three notations: worst case, best case, and average case), we know that in every case, given a function f(n), we try to find a function g(n) that approximates f(n) for large values of n (above some threshold n0).
In other words, g(n) is a curve that approximates f(n) when n is large (above the threshold n0), as shown in the graphs in the previous article.
In mathematics, such a curve is called an asymptote.
=> For this reason, analyzing algorithms this way is called asymptotic analysis.

1.19 Asymptotic Analysis Guide

There are some general rules that help us determine the running time of an algorithm.

  1. Loops: The maximum running time of a loop is the running time of the statements inside the loop (including the loop test) multiplied by the number of iterations.

Total time = (constant c) * (n iterations) = c * n = O(n)
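A minimal sketch in C (the function and variable names are illustrative, not from the original article):

```c
// The loop body does constant work and the loop runs n times,
// so the total time is c * n = O(n).
int single_loop(int n) {
    int m = 0;
    for (int i = 0; i < n; i++) {
        m = m + 2;   // constant-time statement, executed n times
    }
    return m;
}
```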

  2. Nested loops: Analyze from the inside out. The total running time is the product of the sizes of all the loops.

Total time = c * n * n = c * n^2 = O(n^2)
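A minimal sketch (again with illustrative names) of two nested loops of n iterations each:

```c
// The innermost statement runs n * n times, so the total time is
// c * n * n = O(n^2).
int nested_loops(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            count++;   // constant-time statement, executed n * n times
        }
    }
    return count;
}
```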

  3. Consecutive statements: Add the time complexities of each statement.

Total time = c0 + c1 * n + c2 * n^2 = O(n^2)
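A minimal sketch of such a sequence (illustrative code, not from the article):

```c
// A constant-time statement, a single loop, and a nested loop run one
// after another; their costs add: c0 + c1 * n + c2 * n^2 = O(n^2),
// dominated by the nested loop.
int consecutive_statements(int n) {
    int x = 0;                        // c0: constant time
    for (int i = 0; i < n; i++) {     // c1 * n
        x += i;
    }
    for (int i = 0; i < n; i++) {     // c2 * n^2
        for (int j = 0; j < n; j++) {
            x += j;
        }
    }
    return x;
}
```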

  4. If-then-else statements: Use the worst-case running time: the time to evaluate the test condition plus the running time of the larger of the if and else branches.

Total time = c0 + c1 + (c2+c3) * n = O(n)
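A minimal sketch (illustrative only), where the else branch contains the loop and therefore dominates the worst case:

```c
// Worst case: evaluate the test (c0), then take the larger branch,
// whose loop runs n times => c0 + c1 + (c2 + c3) * n = O(n).
int if_then_else(int n, int flag) {
    if (flag == 0) {                    // c0: evaluate the condition
        return -1;                      // c1: constant-time branch
    } else {
        int s = 0;
        for (int i = 0; i < n; i++) {   // loop runs n times
            s += i;                     // (c2 + c3): body of the larger branch
        }
        return s;
    }
}
```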

  5. Logarithmic complexity: An algorithm is O(log n) if it takes a constant amount of time to cut the problem size by a constant fraction (usually ½). As an example, consider a program of the following form:
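The program itself is not reproduced in this copy of the article; a sketch of the kind of loop being described (i starts at 1 and doubles until it passes n) would be:

```c
// i takes the values 1, 2, 4, 8, ..., so the loop body executes only
// about log2(n) times => O(log n).
int doubling_loop(int n) {
    int steps = 0;
    for (int i = 1; i <= n; i = i * 2) {
        steps++;   // constant-time work per iteration
    }
    return steps;
}
```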

If we look closely, the value of i doubles every time: initially i = 1, in the next step i = 2, and in the following steps i = 4, 8, etc.
Assume the loop executes k times. At the k-th step 2^k = n, and at the (k + 1)-th step we exit the loop.
Taking the logarithm of both sides of the equation 2^k = n:
=> log(2^k) = log(n)
=> k * log(2) = log(n)
=> k = log(n)  (taking the logarithm to base 2, so log(2) = 1)

Similarly, for the case where i decreases (sketched below), the rate of growth in the worst case is also O(log n).
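A sketch of that decreasing case, under the same assumption (i starts at n and is halved each iteration):

```c
// i takes the values n, n/2, n/4, ..., 1, so the loop body again
// executes about log2(n) times => O(log n).
int halving_loop(int n) {
    int steps = 0;
    for (int i = n; i >= 1; i = i / 2) {
        steps++;   // constant-time work per iteration
    }
    return steps;
}
```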

Important Note:
When we talk about Big-O notation, the base of the log function is not important.
Just as O(n) can mean 2n, 10n, or 10^6 * n, O(log n) can mean log2(n), log10(n), or ln(n). It does not matter.
What matters is that, for sufficiently large n, O(log n) < O(n) < O(n log n) < O(n^2) ~ that is, we are only interested in the rate of growth of the function.
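This follows from the change-of-base identity log_a(n) = log_b(n) / log_b(a): since 1 / log_b(a) is just a constant, logarithms in different bases differ only by a constant factor, and constant factors are exactly what Big-O notation ignores.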

1.20 Simplifying properties of asymptotic notation

  • Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)). Valid for both O and Ω.
  • Reflexivity: f(n) = Θ(f(n)). Valid for both O and Ω.
  • Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
  • Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
  • If f(n) is in O(k * g(n)) for any constant k > 0, then f(n) is also in O(g(n))
  • If f1(n) is in O(g1(n)) and f2(n) is in O(g2(n)), then f1(n) * f2(n) is in O(g1(n) * g2(n)).
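For example (a worked instance, not from the original list): if f1(n) = O(n) and f2(n) = O(log n), then by the last rule f1(n) * f2(n) = O(n log n), which is exactly how we derive the cost of performing a logarithmic-time operation inside a loop of n iterations.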

1.21 Commonly used Logarithms and Summations

These are all formulas we learned at some point but probably rarely use, so we may have forgotten them (hopefully you are not like me).

Logarithms
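The formula images are not included in this copy; the standard logarithm identities are:

log(x * y) = log(x) + log(y)
log(x / y) = log(x) - log(y)
log(x^y) = y * log(x)
log_a(x) = log_b(x) / log_b(a)   (change of base)
a^(log_b(n)) = n^(log_b(a))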


Arithmetic series
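The standard arithmetic-series sum:

1 + 2 + 3 + ... + n = n * (n + 1) / 2 ≈ n^2 / 2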


Geometric series
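The standard geometric-series sum:

1 + x + x^2 + ... + x^n = (x^(n+1) - 1) / (x - 1),  for x ≠ 1
In particular, 1 + 2 + 4 + ... + 2^n = 2^(n+1) - 1.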


Harmonic series
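The standard estimate for the harmonic sum:

1 + 1/2 + 1/3 + ... + 1/n ≈ ln(n), so the harmonic sum grows as O(log n).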


And a few other important formulas
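The exact contents of this last image are unknown; two approximations that commonly appear alongside the ones above (and may be what was shown) are:

log(1) + log(2) + ... + log(n) = log(n!) ≈ n * log(n)
1^k + 2^k + ... + n^k ≈ n^(k+1) / (k + 1)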

Ending

OK, that's it for this article; it was a bit heavy on math, but I hope it helped everyone refresh the old knowledge.
In the next lesson, I will present the basic theorems for analyzing divide and conquer algorithms (Divide and Conquer).


Source: Viblo