In computer science, we use algorithms to solve computational problems, but different implementations of the same task can differ greatly in efficiency. We measure an algorithm's performance in terms of time complexity, meaning how fast it runs, and space complexity, meaning how much memory it uses.
Since these factors depend on the input size n, we need a way to express how algorithms behave as n grows arbitrarily large. The Big O, Big Θ, and Big Ω notations are mathematical tools that describe the growth of functions representing an algorithm's runtime or resource consumption. They provide a precise, standardized language for comparing the performance of algorithms, allowing us to find the most suitable one for a given problem.
The Big O: Upper Bound (Worst Case)
By definition, the Big O notation provides an upper bound on the growth rate of an algorithm. It describes the worst-case scenario of an algorithm's performance.
Formal Definition: A function f(n) is in O(g(n)) if there exist positive constants c and n₀ such that:
f(n) ≤ c ⋅ g(n) for all n ≥ n₀
Interpretation: Big O ensures that f(n) does not grow faster than g(n) asymptotically. It provides an upper bound on the growth rate, suppressing constant factors and lower-order terms.
Example: For f(n) = 3n² + 5n + 7, we have f(n) ∈ O(n²) because:
f(n) ≤ 4n² for all sufficiently large n (for instance, c = 4 and n₀ = 7 work).
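We can spot-check this bound numerically. The sketch below (an illustration, not a proof) verifies that f(n) ≤ c ⋅ g(n) holds for every n from n₀ up to a large cutoff, using the constants c = 4 and n₀ = 7 suggested above:

```python
def f(n):
    # f(n) = 3n^2 + 5n + 7, the example function
    return 3 * n**2 + 5 * n + 7

def g(n):
    # g(n) = n^2, the candidate upper-bound function
    return n**2

c, n0 = 4, 7

# Check f(n) <= c * g(n) for a large range of n >= n0
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

A finite check like this cannot replace the algebraic argument (3n² + 5n + 7 ≤ 4n² whenever n² ≥ 5n + 7), but it is a quick way to sanity-check candidate constants.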
The Big Ω: Lower Bound (Best Case)
Definition: Big Ω notation describes the asymptotic lower bound of a function, characterizing the best-case performance of an algorithm.
Formal Definition: A function f(n) is in Ω(g(n)) if there exist positive constants c and n₀ such that:
f(n) ≥ c ⋅ g(n) for all n ≥ n₀
Interpretation: Big Ω ensures that 𝑓(𝑛) does not grow slower than 𝑔(𝑛) asymptotically. It provides a lower bound on the growth rate, again suppressing constant factors and lower-order terms.
Example: For f(n) = 3n² + 5n + 7, we have f(n) ∈ Ω(n²) because:
f(n) ≥ 2n² for all n ≥ 1 (so c = 2 and n₀ = 1 work).
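The same style of numerical spot-check works for the lower bound. This sketch confirms f(n) ≥ c ⋅ g(n) over a large range, with the constants c = 2 and n₀ = 1 from the example:

```python
def f(n):
    # f(n) = 3n^2 + 5n + 7, the example function
    return 3 * n**2 + 5 * n + 7

def g(n):
    # g(n) = n^2, the candidate lower-bound function
    return n**2

c, n0 = 2, 1

# Check f(n) >= c * g(n) for a large range of n >= n0
assert all(f(n) >= c * g(n) for n in range(n0, 10_000))
```

Here the bound is easy to see directly: 3n² already dominates 2n², and the remaining terms 5n + 7 are nonnegative.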
Big Θ: Asymptotically Tight Bound
By definition, the Big Θ notation describes the asymptotically tight bound of a function, characterizing the exact growth rate of an algorithm.
Formal Definition: A function f(n) is in Θ(g(n)) if there exist positive constants c₁, c₂, and n₀ such that:
c₁ ⋅ g(n) ≤ f(n) ≤ c₂ ⋅ g(n) for all n ≥ n₀
Interpretation: Big Θ ensures that 𝑓(𝑛) grows asymptotically at the same rate as 𝑔(𝑛). It combines both upper and lower bounds, providing a tight bound.
Example: For f(n) = 3n² + 5n + 7, we have f(n) ∈ Θ(n²) because:
2n² ≤ f(n) ≤ 4n² for all sufficiently large n (c₁ = 2, c₂ = 4, and n₀ = 7 work).
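Since Θ combines both bounds, a single check can verify the sandwich inequality c₁ ⋅ g(n) ≤ f(n) ≤ c₂ ⋅ g(n) with the constants from the example (again an illustration over a finite range, not a proof):

```python
def f(n):
    # f(n) = 3n^2 + 5n + 7, the example function
    return 3 * n**2 + 5 * n + 7

c1, c2, n0 = 2, 4, 7

# Check the two-sided bound c1*n^2 <= f(n) <= c2*n^2 for n >= n0
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
```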
Key Takeaways
- Big 𝑂: Worst-case scenario (upper bound).
- Big Ω: Best-case scenario (lower bound).
- Big Θ: Describes the exact growth, combining both 𝑂 and Ω.
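To tie these ideas to a concrete algorithm (linear search is our own illustrative choice, not an example from the text above): searching an unsorted list takes one comparison in the best case and n comparisons in the worst case, so its running time is Ω(1) and O(n), and no single Θ bound describes all inputs.

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent.

    Best case: target is the first element -> 1 comparison, Omega(1).
    Worst case: target is absent -> n comparisons, O(n).
    """
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

# Best case: found immediately at index 0
print(linear_search([4, 8, 15], 4))   # 0
# Worst case: every element examined, not found
print(linear_search([4, 8, 15], 16))  # -1
```

By contrast, a loop that always runs exactly n times (such as summing a list) is Θ(n), because its best and worst cases coincide.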