Some Definitions for Time Complexity.

    1 Mar, 2024

    This note gives five symbolic, mathematical definitions related to time complexity.

    Symbols and definitions

    1. Big O notation ($O$): If $\exists\, c, n_0 \in \mathbb{R}^+$ such that $\forall n > n_0$, $f(n) \leq c\,g(n)$, then $f(n) = O(g(n))$.
    2. Little o notation ($o$): If $\forall c > 0$, $\exists\, n_0$ such that $\forall n > n_0$, $f(n) < c\,g(n)$, then $f(n) = o(g(n))$.
    3. Big Omega notation ($\Omega$): If $\exists\, c, n_0 \in \mathbb{R}^+$ such that $\forall n > n_0$, $f(n) \geq c\,g(n)$, then $f(n) = \Omega(g(n))$.
    4. Little omega notation ($\omega$): If $\forall c > 0$, $\exists\, n_0$ such that $\forall n > n_0$, $f(n) > c\,g(n)$, then $f(n) = \omega(g(n))$.
    5. Theta notation ($\Theta$): If $\exists\, c_1, c_2, n_0 \in \mathbb{R}^+$ such that $\forall n > n_0$, $c_1 g(n) \leq f(n) \leq c_2 g(n)$, then $f(n) = \Theta(g(n))$. In other words, when $g(n)$ is both an upper and a lower bound on $f(n)$ up to constant factors, it is called a tight bound.
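
    The definitions above can be spot-checked numerically. Below is a small sketch (the function $f$, the bound $g$, and the witness constants $c$ and $n_0$ are illustrative choices, not from the text) showing that $f(n) = 3n^2 + 5n$ satisfies both the $O(n^2)$ and $\Omega(n^2)$ definitions, and is therefore $\Theta(n^2)$:

```python
# Numeric spot-check of the Big O and Big Omega definitions for
# f(n) = 3n^2 + 5n against g(n) = n^2 (illustrative example).
def f(n):
    return 3 * n**2 + 5 * n

def g(n):
    return n**2

# Witnesses for f(n) = O(g(n)): c = 4, n_0 = 5,
# since 3n^2 + 5n <= 4n^2 exactly when 5n <= n^2, i.e. n >= 5.
c_upper, n0 = 4, 5
assert all(f(n) <= c_upper * g(n) for n in range(n0 + 1, 1000))

# Witness for f(n) = Omega(g(n)): c = 3 works for every n >= 1.
c_lower = 3
assert all(f(n) >= c_lower * g(n) for n in range(1, 1000))

# Both bounds hold with the same g(n), so f(n) = Theta(n^2).
print("f(n) = Theta(n^2), witnessed by c1=3, c2=4, n0=5")
```

    A finite check like this cannot prove an asymptotic claim, but it is a quick way to catch a wrong choice of constants.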

    These notations describe the asymptotic behavior of algorithms' time complexity, aiding in understanding their performance characteristics.

    | Notation | Meaning | Intuition |
    | --- | --- | --- |
    | $\Theta$ | Tight bound | Equivalent to "=" |
    | $O$ | Upper bound | Less than or equal to |
    | $o$ | Loose (strict) upper bound | Less than |
    | $\Omega$ | Lower bound | Greater than or equal to |
    | $\omega$ | Loose (strict) lower bound | Greater than |

    Average, Best, Worst Cases

    Time complexity is generally used to compare different algorithms, such as bubble sort, insertion sort, and heap sort. However, for the same algorithm, the running time can depend on the input, so we also distinguish average-, best-, and worst-case time complexities. For example, search in a binary search tree takes $O(n)$ time when the tree has degenerated into a singly linked list, but only $O(\log n)$ time when the tree is balanced.

    Average Time Complexity: A measure of the weighted average of an algorithm's running time over all possible input cases. Let $T(n)$ be the running time of the algorithm for an input of size $n$. The average time complexity $\bar{T}(n)$ is defined as

    $$\bar{T}(n) = \sum_{i=1}^{k} p_i \cdot T_i(n)$$

    where $T_i(n)$ is the running time for the $i$th input case, $p_i$ is the probability of the $i$th case, and $k$ is the total number of possible input cases.
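
    As a concrete instance of this formula, consider a successful linear search over $n$ items, assuming (hypothetically) that each target position is equally likely, so $p_i = 1/n$, and finding the item at position $i$ costs $T_i(n) = i$ comparisons. The weighted sum then works out to $(n+1)/2$:

```python
from fractions import Fraction

# Average-case cost of a successful linear search on n items,
# under the assumed uniform distribution p_i = 1/n.
def average_comparisons(n):
    p = Fraction(1, n)                          # p_i = 1/n for every case
    return sum(p * i for i in range(1, n + 1))  # sum of p_i * T_i(n)

print(average_comparisons(10))  # 11/2, i.e. (n + 1) / 2 for n = 10
```

    Using `Fraction` keeps the arithmetic exact, which makes it easy to verify the closed form $(n+1)/2$ for any small $n$.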

    Best Case Time Complexity: The minimum running time required by an algorithm in the most ideal scenario.

    Worst Case Time Complexity: The maximum running time required by an algorithm in the least ideal scenario.
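
    The best and worst cases are easy to observe directly by instrumenting an algorithm. The sketch below (linear search chosen as an illustrative example, not from the text) counts comparisons explicitly: the best case is the target in the first position, the worst case is a target that is absent:

```python
# Best vs. worst case for linear search, with comparisons counted explicitly.
def linear_search(items, target):
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = list(range(100))

# Best case: target is the first element -> 1 comparison, O(1).
_, best = linear_search(data, 0)

# Worst case: target is absent -> n comparisons, O(n).
_, worst = linear_search(data, -1)

print(best, worst)  # 1 100
```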