### 2. Linear Order $O(n)$ {data-toc-label="2. Linear Order"}
Linear order is common in arrays, linked lists, stacks, queues, etc., where the number of elements is proportional to $n$:
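A minimal sketch (the name `linear_space` and the particular containers are illustrative assumptions, not the book's exact listing) in which every allocated structure grows with $n$:

```python
def linear_space(n: int):
    """Linear order space O(n): each structure's size is proportional to n"""
    # A list of length n
    nums = [0] * n
    # A hash map with n key-value pairs
    mapping = {i: str(i) for i in range(n)}
    return nums, mapping
```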
As shown below, this function's recursive depth is $n$, meaning there are $n$ unreturned function instances on the call stack at the same time, occupying $O(n)$ stack frame space:
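A minimal sketch of such a recursive function (the name `linear_recur` is an assumption):

```python
def linear_recur(n: int):
    """Linear order space O(n) produced by recursion depth"""
    if n <= 1:
        return
    # Each unreturned call keeps one stack frame alive,
    # so n nested calls occupy O(n) stack frame space
    linear_recur(n - 1)
```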
<p align="center"> Figure 2-17 Recursive Function Generating Linear Order Space Complexity </p>
### 3. Quadratic Order $O(n^2)$ {data-toc-label="3. Quadratic Order"}
Quadratic order is common in matrices and graphs, where the number of elements grows quadratically with $n$:
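A minimal sketch, assuming a function named `quadratic_space` that allocates an $n \times n$ matrix:

```python
def quadratic_space(n: int):
    """Quadratic order space O(n^2): an n x n matrix stores n^2 elements"""
    num_matrix = [[0] * n for _ in range(n)]
    return num_matrix
```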
As shown below, the recursive depth of this function is $n$, and an array is initialized in each recursive call, with lengths $n$, $n - 1$, $\dots$, $2$, $1$, averaging $n / 2$ and thus occupying $O(n^2)$ space overall:
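A hedged sketch of such a function (the name `quadratic_recur` and its return value are assumptions):

```python
def quadratic_recur(n: int) -> int:
    """Quadratic order space O(n^2) produced by recursion"""
    if n <= 0:
        return 0
    # Each of the n recursion levels allocates a list of length n, n-1, ..., 1;
    # the total size is n + (n-1) + ... + 1 = O(n^2)
    nums = [0] * n
    return len(nums) + quadratic_recur(n - 1)
```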
<p align="center"> Figure 2-18 Recursive Function Generating Quadratic Order Space Complexity </p>
### 4. Exponential Order $O(2^n)$ {data-toc-label="4. Exponential Order"}
Exponential order is common in binary trees. As shown in the figure below, a "full binary tree" with $n$ levels has $2^n - 1$ nodes, occupying $O(2^n)$ space:
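A hedged sketch that builds such a tree (the `TreeNode` class and the `build_tree` name are assumptions introduced here):

```python
class TreeNode:
    """Binary tree node"""
    def __init__(self, val: int = 0):
        self.val = val
        self.left = None
        self.right = None

def build_tree(n: int):
    """Exponential order space O(2^n): build a full binary tree with n levels"""
    if n == 0:
        return None
    root = TreeNode(0)
    # Each level doubles the node count, giving 2^n - 1 nodes in total
    root.left = build_tree(n - 1)
    root.right = build_tree(n - 1)
    return root
```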
<p align="center"> Figure 2-19 Full Binary Tree Generating Exponential Order Space Complexity </p>
### 5. Logarithmic Order $O(\log n)$ {data-toc-label="5. Logarithmic Order"}
Logarithmic order is common in divide-and-conquer algorithms. For example, in merge sort, an array of length $n$ is recursively divided in half each round, forming a recursion tree of height $\log n$, using $O(\log n)$ stack frame space.
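A minimal sketch of the divide step only (the name `split_depth` is an assumption; the merge step is omitted because the point here is the $O(\log n)$ recursion depth):

```python
def split_depth(left: int, right: int) -> int:
    """Recursion depth of halving the interval [left, right), as in merge sort's divide step"""
    if right - left <= 1:
        return 1
    mid = (left + right) // 2
    # Each round halves the interval, so the recursion tree has height about log2(n)
    # and at most O(log n) stack frames exist at any one time
    return 1 + max(split_depth(left, mid), split_depth(mid, right))
```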
<p align="center"> Figure 2-9 Common Types of Time Complexity </p>
### 1. Constant Order $O(1)$ {data-toc-label="1. Constant Order"}
Constant order means the number of operations is independent of the input data size $n$. In the following function, although the number of operations `size` might be large, the time complexity remains $O(1)$ as it's unrelated to $n$:
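A minimal sketch, assuming a function named `constant` whose loop bound `size` is fixed:

```python
def constant(n: int) -> int:
    """Constant order O(1): the operation count is fixed and unrelated to n"""
    count = 0
    size = 100000  # large, but independent of the input size n
    for _ in range(size):
        count += 1
    return count
```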
### 2. Linear Order $O(n)$ {data-toc-label="2. Linear Order"}
Linear order indicates the number of operations grows linearly with the input data size $n$. Linear order commonly appears in single-loop structures:
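A minimal sketch, assuming a function named `linear`:

```python
def linear(n: int) -> int:
    """Linear order O(n): the loop body executes n times"""
    count = 0
    for _ in range(n):
        count += 1
    return count
```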
Operations like array traversal and linked list traversal have a time complexity of $O(n)$, where $n$ is the length of the array or list:
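A hedged sketch of array traversal (the name `array_traversal` is an assumption):

```python
def array_traversal(nums: list[int]) -> int:
    """Linear order O(n): here n is the length of the array nums"""
    count = 0
    # The number of loop iterations is proportional to the array length
    for _ in nums:
        count += 1
    return count
```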
It's important to note that **the input data size $n$ should be determined based on the type of input data**: in the first example, $n$ is the input number itself, while in the second example, $n$ is the length of the array.
### 3. Quadratic Order $O(n^2)$ {data-toc-label="3. Quadratic Order"}
Quadratic order means the number of operations grows quadratically with the input data size $n$. Quadratic order typically appears in nested loops, where both the outer and inner loops have a time complexity of $O(n)$, resulting in an overall complexity of $O(n^2)$:
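A minimal sketch, assuming a function named `quadratic`:

```python
def quadratic(n: int) -> int:
    """Quadratic order O(n^2): two nested loops, each running n times"""
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count
```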
For instance, in bubble sort, the outer loop runs $n - 1$ times, and the inner loop runs $n - 1$, $n - 2$, $\dots$, $2$, $1$ times, averaging $n / 2$ times, resulting in a time complexity of $O((n - 1) n / 2) = O(n^2)$.
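A hedged sketch of bubble sort that counts swaps (the name `bubble_sort` and the swap counter are assumptions):

```python
def bubble_sort(nums: list[int]) -> int:
    """Quadratic order O(n^2): bubble sort"""
    count = 0  # number of element swaps
    # Outer loop: the unsorted range is [0, i]
    for i in range(len(nums) - 1, 0, -1):
        # Inner loop: bubble the largest element of [0, i] to position i
        for j in range(i):
            if nums[j] > nums[j + 1]:
                nums[j], nums[j + 1] = nums[j + 1], nums[j]
                count += 1
    return count
```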
### 4. Exponential Order $O(2^n)$ {data-toc-label="4. Exponential Order"}
Biological "cell division" is a classic example of exponential order growth: starting with one cell, it becomes two after one division, four after two divisions, and so on, resulting in $2^n$ cells after $n$ divisions.
Biological "cell division" is a classic example of exponential order growth: starting with one cell, it becomes two after one division, four after two divisions, and so on, resulting in $2^n$ cells after $n$ divisions.
In practice, exponential order often appears in recursive functions. For example, in the code below, the problem recursively splits into two halves, stopping after $n$ divisions:
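A minimal sketch of such a recursion (the name `exp_recur` is an assumption):

```python
def exp_recur(n: int) -> int:
    """Exponential order O(2^n) via recursion: each call splits into two sub-calls"""
    if n == 1:
        return 1
    return exp_recur(n - 1) + exp_recur(n - 1) + 1
```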
Exponential order growth is extremely rapid and is commonly seen in exhaustive search methods (brute force, backtracking, etc.). For large-scale problems, exponential order is unacceptable, often requiring dynamic programming or greedy algorithms as solutions.
### 5. Logarithmic Order $O(\log n)$ {data-toc-label="5. Logarithmic Order"}
In contrast to exponential order, logarithmic order reflects situations where "the size is halved each round." Given an input data size $n$, since the size is halved each round, the number of iterations is $\log_2 n$, the inverse function of $2^n$.
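A minimal sketch, assuming a function named `logarithmic`:

```python
def logarithmic(n: int) -> int:
    """Logarithmic order O(log n): n is halved each round"""
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count
```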
Logarithmic order is typical in algorithms based on the divide-and-conquer strategy, which repeatedly splits a problem into smaller subproblems. Strictly speaking, "splitting into $m$ parts each round" corresponds to a time complexity of $O(\log_m n)$, and by the change-of-base formula, $O(\log_m n) = O(\log_k n / \log_k m) = O(\log_k n)$.
This means the base $m$ can be changed without affecting the complexity. Therefore, we often omit the base $m$ and simply denote logarithmic order as $O(\log n)$.
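For completeness, a hedged recursive sketch with halving (the name `log_recur` is an assumption); its recursion depth, and hence its time complexity, is $O(\log n)$:

```python
def log_recur(n: int) -> int:
    """Logarithmic order O(log n) via recursion: the problem size is halved each call"""
    if n <= 1:
        return 0
    return log_recur(n // 2) + 1
```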
### 6. Linear-Logarithmic Order $O(n \log n)$ {data-toc-label="6. Linear-Logarithmic Order"}
Linear-logarithmic order often appears in nested loops, with the complexities of the two loops being $O(\log n)$ and $O(n)$ respectively. The related code is as follows:
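A minimal sketch of such nested loops (the name `linear_log` is an assumption): the outer loop halves `m` and thus runs $O(\log n)$ times, while the inner loop runs $n$ times per round:

```python
def linear_log(n: int) -> int:
    """Linear-logarithmic order O(n log n): an O(log n) outer loop times an O(n) inner loop"""
    count = 0
    m = n
    # Outer loop: m is halved each round, so it runs O(log n) times
    while m > 1:
        m //= 2
        # Inner loop: runs n times per outer round
        for _ in range(n):
            count += 1
    return count
```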
The image below demonstrates how linear-logarithmic order is generated: each level of the binary tree contains $n$ operations, and the tree has $\log_2 n + 1$ levels, resulting in a time complexity of $O(n \log n)$.
Mainstream sorting algorithms typically have a time complexity of $O(n \log n)$, such as quicksort, mergesort, and heapsort.
### 7. Factorial Order $O(n!)$ {data-toc-label="7. Factorial Order"}
Factorial order corresponds to the mathematical problem of "full permutation." Given $n$ distinct elements, the total number of possible permutations is:
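$$
n! = n \times (n - 1) \times (n - 2) \times \dots \times 2 \times 1
$$

A hedged recursive sketch in which a call of size $n$ branches into $n$ sub-calls (the name `factorial_recur` is an assumption), so the total number of calls grows as $O(n!)$:

```python
def factorial_recur(n: int) -> int:
    """Factorial order O(n!): the first level branches n ways, the next n - 1, and so on"""
    if n == 0:
        return 1
    count = 0
    # Each branch recursively enumerates the permutations of the remaining n - 1 elements
    for _ in range(n):
        count += factorial_recur(n - 1)
    return count
```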