In this article, I will show how to represent the three cases (best, average, and worst) with mathematical functions, using notation and graphs to illustrate.

### 1.11 Asymptotic Notation

We already know the expressions for the best, average, and worst cases; for each of the three we need to define upper and lower bounds.

To represent these upper and lower bounds, we need some sort of syntax, and that’s the topic we’ll discuss next.

Assume that the given algorithm is represented as a function f(n).

### 1.12 Big-O Notation

This notation provides the **upper bound** of the given function.

It is written as **f(n) = O(g(n))**.

This means that, for large values of n, g(n) is an asymptotic upper bound on f(n).


As in the example from the previous post, if f(n) = n^4 + (2 * n^2) + 100n + 500 is the function representing the given algorithm, then g(n) = n^4.

That means g(n) gives the **maximum rate of growth** for f(n) at larger values of n.

To put it simply, g(n) is a function whose rate of growth matches (or exceeds) that of f(n) as n grows large.
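To see the dominant term at work, here is a small Python sketch (my own illustration, not from the original post) that computes the ratio f(n) / g(n) for the function above and watches it approach 1 as n grows:

```python
# f(n) = n^4 + 2n^2 + 100n + 500 from the text; its dominant term is n^4.
def f(n):
    return n**4 + 2 * n**2 + 100 * n + 500

def g(n):
    return n**4

# The ratio f(n) / g(n) approaches 1 as n grows: the lower-order
# terms 2n^2 + 100n + 500 become negligible next to n^4.
for n in (10, 100, 1000):
    print(n, f(n) / g(n))
```

At n = 10 the lower-order terms still contribute noticeably, but by n = 1000 the ratio is essentially 1.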

OK, let's look at O-notation in a bit more detail.

O–notation is defined as follows:

**O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c * g(n) for all n ≥ n0 }.**

=> g(n) is an asymptotic upper bound (simply understood as the upper asymptote) of the function f(n).

In general, we discard small values of n; the rate of growth at small n is not important. *n0* is the point from which we start considering the rate of growth of the given algorithm. Below *n0*, the rate of growth may be different.

=> *n0* is called the threshold of the given function.

This part is a bit abstract, so let me explain further. As in the earlier sorting-algorithm example: if the input array has only a handful of elements (say 10 or 20), the amount of computation is tiny and the processing time is almost negligible, so it can be ignored.

Things only start to get difficult when the size of the array exceeds a certain number (that is, *n0*, the threshold of the function). This is why we mostly only care about the case n ≥ *n0*.
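The threshold idea can be made concrete with a small sketch (my own illustration; the two cost functions and their constants are invented for the example): a "slower" algorithm can beat a "faster" one below the threshold.

```python
import math

# Two hypothetical cost models (constants invented for illustration):
# quad(n) mimics a simple O(n^2) algorithm with tiny per-step overhead,
# nlogn(n) mimics an O(n log n) algorithm with a larger constant factor.
def quad(n):
    return n * n

def nlogn(n):
    return 50 * n * math.log2(n)

# For small n the "worse" quadratic algorithm is actually cheaper;
# only past some threshold n0 does the asymptotic order win out.
for n in (16, 256, 1024, 4096):
    print(n, quad(n), round(nlogn(n)), "quad wins" if quad(n) < nlogn(n) else "nlogn wins")
```

With these made-up constants the crossover happens between n = 256 and n = 1024; that crossover point plays the role of *n0*.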

**Big-O Visualization**

**O(g(n)) is the set of functions with a smaller or the same order of growth as g(n)**, that is, the set of functions whose order of growth is at most that of g(n).

For example, O(n^2) includes every function in O(1), O(n), O(n log n), …
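A quick numeric sketch of this set view (my own illustration, not from the original post): each of the slower-growing functions below is eventually bounded by n^2, so each belongs to O(n^2).

```python
import math

# Each function below has order of growth at most n^2, so it belongs
# to the set O(n^2). Here the single constant c = 1 already works
# once n >= 2; in general c may need to be larger.
members = {
    "1":       lambda n: 1,
    "n":       lambda n: n,
    "n log n": lambda n: n * math.log2(n),
}

for name, h in members.items():
    ok = all(h(n) <= 1 * n * n for n in range(2, 10_000))
    print(name, "in O(n^2):", ok)
```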

**Note** : Only analyze the algorithm at larger values of n. This means that, below *n0* , we don’t care about the rate of growth.

**Big-O Examples**

Example-1: Find the upper bound of the function f(n) = 3n+8

Solution: We already have the definition: **O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c * g(n) for all n ≥ n0 }.**

=> To find the upper bound, we have to find a positive constant c and a threshold *n0* satisfying the condition 0 ≤ f(n) ≤ c * g(n) for all n ≥ *n0*.

For the given problem: 3n + 8 <= 4n for all n >= 8

=> We have the upper bound O(n) with c = 4 and *n0* = 8.
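The constants from this example can be spot-checked numerically. The sketch below (my own, not from the original post) scans a finite range of n; that is only evidence, not a proof, but it quickly catches wrong constants.

```python
# Numerically spot-check the bound from Example 1: 3n + 8 <= 4n for all n >= 8.
def f(n):
    return 3 * n + 8

def bound_holds(c, n0, g, f, n_max=100_000):
    """Check 0 <= f(n) <= c * g(n) for every n in [n0, n_max)."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max))

print(bound_holds(c=4, n0=8, g=lambda n: n, f=f))  # True
print(bound_holds(c=4, n0=7, g=lambda n: n, f=f))  # False: 3*7 + 8 = 29 > 28
```

Note that the check fails if we start one step earlier, at n0 = 7, which is exactly why the threshold matters.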

Example-2: Find the upper bound of the function f(n) = n^2 + 1

Solution: Similar to example 1, we look for c and *n0*

We have n^2 + 1 <= 2 * n^2 for all n >= 1

=> upper bound O(n^2) with c = 2 and *n0* = 1

Example-3: Find the upper bound of the function f(n) = n^4 + 100 * n^2 + 50

Solution: We have n^4 + 100 * n^2 + 50 <= 2 * n^4 for all n >= 11

=> upper bound O(n^4) with c = 2 and *n0* = 11

(Deriving these inequalities only takes some simple additions, subtractions, and solving for the root, so I won't present the details.)

Example-4: Find the upper bound of the function f(n) = 2n^3 – 2n^2

Solution: We have 2n^3 – 2n^2 <= 2n^3 for all n >= 1

=> upper bound O(n^3) with c = 2 and *n0* = 1

Example-5: Find the upper bound of the function f(n) = n

Solution: We have n <= n for all n >= 1

=> upper bound O(n) with c = 1 and *n0* = 1

Example-6: Find the upper bound of the function f(n) = 410

Solution: We have 410 <= 410 * 1 for all n >= 1

=> upper bound O(1) with c = 410 and *n0* = 1

**No Uniqueness?**

There is no unique set of values for *n0* and c in proving asymptotic limits.

Let’s consider 100n + 5 = O(n). For this function, there are many satisfying c and n0.

Solution 1: 100n + 5 ≤ 100n + n = 101n for all n ≥ 5, so *n0* = 5 and c = 101 work.

Solution 2: 100n + 5 ≤ 100n + 5n = 105n for all n ≥ 1, so *n0* = 1 and c = 105 also work.
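Both certificate pairs can be checked numerically with a short sketch (my own illustration, scanning a finite range rather than proving the bound):

```python
# Both (c, n0) pairs from the text certify 100n + 5 = O(n), confirming
# that the constants in a Big-O proof are not unique.
def f(n):
    return 100 * n + 5

def certifies(c, n0, n_max=100_000):
    """Check 0 <= f(n) <= c * n for every n in [n0, n_max)."""
    return all(0 <= f(n) <= c * n for n in range(n0, n_max))

print(certifies(c=101, n0=5))   # True  (Solution 1)
print(certifies(c=105, n0=1))   # True  (Solution 2)
print(certifies(c=100, n0=1))   # False (100n + 5 > 100n for every n)
```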

### 1.13 Omega-Ω Notation

The notation Ω (read as Omega) represents the lower bound of an algorithm and we express it as **f(n) = Ω(g(n))**

This means that for large numbers n, the lower bound of f(n) is g(n). For example, f(n) = 100 * n^2 + 10n + 50, the lower bound is Ω(n^2).

The notation Ω can be defined as **Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }**

g(n) is an asymptotic lower bound on f(n).

Our goal here is to have a function g(n) that has the largest rate of growth but is still less than or equal to the rate of growth of the function f(n).

**Ω Examples**

Example-1: Find the lower bound of the function f(n) = 5 * n^2 .

Solution: As in the Big-O examples, we look for c and *n0* satisfying 0 ≤ c * g(n) ≤ f(n) for all n ≥ *n0*. We have 0 ≤ c * n^2 ≤ 5n^2 with c = 5 and *n0* = 1, so the lower bound is Ω(n^2).
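The same finite spot-check idea works for Ω, with the inequality flipped (again my own sketch, evidence rather than proof):

```python
# Spot-check the Omega bound from Example 1: 0 <= c * n^2 <= 5n^2
# with c = 5 and n0 = 1.
def f(n):
    return 5 * n**2

def lower_bound_holds(c, n0, g, n_max=10_000):
    """Check 0 <= c * g(n) <= f(n) for every n in [n0, n_max)."""
    return all(0 <= c * g(n) <= f(n) for n in range(n0, n_max))

print(lower_bound_holds(c=5, n0=1, g=lambda n: n * n))  # True
print(lower_bound_holds(c=6, n0=1, g=lambda n: n * n))  # False: 6n^2 > 5n^2
```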

Example-2: Prove that f(n) = 100n + 5 ≠ Ω(n^2 ).

Solution: Suppose there exist positive constants c and *n0* such that 0 ≤ c * n^2 ≤ 100n + 5 for all n ≥ *n0*.

We have: 100n + 5 ≤ 100n + 5n = 105n (∀n ≥ 1)

So c * n^2 ≤ 105n, i.e. n(c * n – 105) ≤ 0

Because n is positive, this gives c * n – 105 ≤ 0 ⇒ n ≤ 105/c

⇒ Contradiction: n cannot be bounded above by a constant, since there is no limit on the value of n for the original function f(n). Therefore f(n) ≠ Ω(n^2).

### 1.14 Theta-Θ Notation

This notation captures the case where the upper and lower bounds of a given function (algorithm) coincide.

The average running time of an algorithm is always between the lower and upper bounds.

If the upper bound (O) and the lower bound (Ω) give the same result, then Θ has that same rate of growth.

For a given function (algorithm), if the rates of growth (bounds) for O and Ω are not the same, then a Θ bound may not exist.

In this case, we need to consider all possible time complexities and average them (for example, for the Quick Sort average case, when going to the chapter on Sorting I will talk more about this part).

Now we will take a closer look at the notation Θ. It is defined as **Θ(g(n)) = {f(n): there exist c1 , c2 and n0 such that 0 ≤ c1 * g(n) ≤ f(n) ≤ c2 * g(n) for all n ≥ n0 }.** Θ(g(n)) is a set of functions with the same order of growth as g(n).

**Θ Examples**

Example 1: Find the bound for the function f(n) = (n^2)/2 – n/2

Solution: (n^2)/5 <= (n^2)/2 – n/2 <= n^2 for all n >= 2 ⇒ We have Θ(n^2) where c1 = 1/5, c2 = 1 and n0 = 2.
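The two-sided Θ bound can also be spot-checked over a finite range (my own sketch; a scan is evidence, not a proof):

```python
# Spot-check the two-sided bound from Theta Example 1:
# n^2/5 <= n^2/2 - n/2 <= n^2 for all n >= 2 (c1 = 1/5, c2 = 1, n0 = 2).
def f(n):
    return n * n / 2 - n / 2

def theta_holds(c1, c2, n0, g, n_max=10_000):
    """Check 0 <= c1*g(n) <= f(n) <= c2*g(n) for every n in [n0, n_max)."""
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max))

print(theta_holds(c1=1/5, c2=1, n0=2, g=lambda n: n * n))  # True
print(theta_holds(c1=1/5, c2=1, n0=1, g=lambda n: n * n))  # False: fails at n = 1
```

Starting the check at n0 = 1 fails (f(1) = 0 while c1 * g(1) = 0.2), which is why the example needs n0 = 2.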

Example 2: Prove that n ≠ Θ(n^2)

Solution: Suppose c1 * n^2 <= n <= c2 * n^2 for all n >= n0.

From the left inequality, c1 * n^2 <= n <=> n <= 1/c1, which is absurd: n is unbounded, so it cannot stay below a fixed constant.

Example 3: Prove that 6 * n^3 ≠ Θ(n^2). Solution: Suppose c1 * n^2 ≤ 6 * n^3 ≤ c2 * n^2. The right inequality gives n ≤ c2/6, which is absurd because n is unbounded and cannot stay below a constant.

Example 4: Prove that n ≠ Θ(log n). Solution: Suppose c1 * log(n) <= n <= c2 * log(n). The right-hand side requires c2 >= n/log(n) for all n >= n0, which is impossible: n/log(n) grows without bound, so no constant c2 can stay above it.

### Closing – With 1 important note

For our analysis (best case, worst case and average), we’ve got upper bound (O), lower bound (Ω) and average running time (Θ).

From the examples above, it should also be clear that, for a given function (algorithm), obtaining the upper bound (O), lower bound (Ω), and average running time (Θ) may not always be feasible.

In the remaining chapters, we will **focus on the upper bound (O)**, because knowing the lower bound (Ω) of an algorithm is not important in practice, and **we will use the Θ notation when the upper bound (O) and lower bound (Ω) are the same.**