Fix toc for the webpage of the chapter of computational complexity (#1107)

* fix the math formula in TOC

* Update space_complexity.md

* Update time_complexity.md

* Update space_complexity.md

* Update time_complexity.md

---------

Co-authored-by: Yudong Jin <krahets@163.com>
parent 6069cb89a7
commit 739ee24751

@@ -717,7 +717,7 @@ $$
![Common Types of Space Complexity](space_complexity.assets/space_complexity_common_types.png)
-### Constant Order $O(1)$
+### Constant Order $O(1)$ {data-toc-label="Constant Order"}
Constant order is common in constants, variables, and objects that are independent of the input data size $n$.
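For readers skimming the diff, a minimal Python sketch of what a constant-order function can look like; the name `constant` echoes the `[func]{constant}` placeholder below, but the body is an illustrative assumption rather than the repository's actual listing:

```python
def constant(n: int):
    """Constant-order space: allocations do not scale with n."""
    a = 0               # a fixed number of scalar variables
    nums = [0] * 10000  # a list of fixed size, independent of n
    for _ in range(n):
        c = 0           # variables created inside the loop are released each iteration
```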
@@ -727,7 +727,7 @@ Note that memory occupied by initializing variables or calling functions in a lo
[file]{space_complexity}-[class]{}-[func]{constant}
```
-### Linear Order $O(n)$
+### Linear Order $O(n)$ {data-toc-label="Linear Order"}
Linear order is common in arrays, linked lists, stacks, queues, etc., where the number of elements is proportional to $n$:
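For instance, a minimal sketch of linear-order space (illustrative; the name `linear` is an assumption):

```python
def linear(n: int):
    """Linear-order space: container sizes grow in proportion to n."""
    nums = [0] * n                           # list of length n -> O(n)
    mapping = {i: str(i) for i in range(n)}  # dict with n entries -> O(n)
    return nums, mapping
```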
@@ -743,7 +743,7 @@ As shown below, this function's recursive depth is $n$, meaning there are $n$ in
![Recursive Function Generating Linear Order Space Complexity](space_complexity.assets/space_complexity_recursive_linear.png)
-### Quadratic Order $O(n^2)$
+### Quadratic Order $O(n^2)$ {data-toc-label="Quadratic Order"}
Quadratic order is common in matrices and graphs, where the number of elements grows quadratically with $n$:
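A sketch of quadratic-order space under the same assumptions, with an $n \times n$ matrix as the illustrative case:

```python
def quadratic(n: int):
    """Quadratic-order space: an n x n matrix holds n^2 elements -> O(n^2)."""
    num_matrix = [[0] * n for _ in range(n)]
    return num_matrix
```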
@@ -759,7 +759,7 @@ As shown below, the recursive depth of this function is $n$, and in each recursi
![Recursive Function Generating Quadratic Order Space Complexity](space_complexity.assets/space_complexity_recursive_quadratic.png)
-### Exponential Order $O(2^n)$
+### Exponential Order $O(2^n)$ {data-toc-label="Exponential Order"}
Exponential order is common in binary trees. As shown in the image below, a "full binary tree" with $n$ levels has $2^n - 1$ nodes, occupying $O(2^n)$ space:
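A sketch of how such a full binary tree could be built recursively; `TreeNode` and `build_tree` are assumed names used only for illustration:

```python
class TreeNode:
    """Minimal binary tree node (illustrative)."""
    def __init__(self, val: int = 0):
        self.val = val
        self.left = None
        self.right = None

def build_tree(n: int) -> TreeNode | None:
    """Builds a full binary tree of depth n: 2^n - 1 nodes -> O(2^n) space."""
    if n == 0:
        return None
    root = TreeNode(0)
    root.left = build_tree(n - 1)
    root.right = build_tree(n - 1)
    return root
```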
@@ -769,7 +769,7 @@ Exponential order is common in binary trees. Observe the below image, a "full bi
![Full Binary Tree Generating Exponential Order Space Complexity](space_complexity.assets/space_complexity_exponential.png)
-### Logarithmic Order $O(\log n)$
+### Logarithmic Order $O(\log n)$ {data-toc-label="Logarithmic Order"}
Logarithmic order is common in divide-and-conquer algorithms. For example, in merge sort, an array of length $n$ is recursively divided in half each round, forming a recursion tree of height $\log n$, using $O(\log n)$ stack frame space.
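To make the $\log n$ recursion depth concrete, a hedged sketch that halves the problem size per call (the name `log_recur` is an assumption):

```python
def log_recur(n: int) -> int:
    """The input is halved each call, so at most ~log2(n) stack frames
    are live at once -> O(log n) space."""
    if n <= 1:
        return 0
    return log_recur(n // 2) + 1
```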

@@ -938,7 +938,7 @@ $$
![Common Types of Time Complexity](time_complexity.assets/time_complexity_common_types.png)
-### Constant Order $O(1)$
+### Constant Order $O(1)$ {data-toc-label="Constant Order"}
Constant order means the number of operations is independent of the input data size $n$. In the following function, although the number of operations `size` might be large, the time complexity remains $O(1)$ as it's unrelated to $n$:
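A minimal sketch matching this description; `size` is fixed, so the work done does not depend on `n` (illustrative only, not the repository's listing):

```python
def constant(n: int) -> int:
    """Constant-order time: the loop bound 'size' does not depend on n."""
    count = 0
    size = 100000           # may be large, but it is fixed
    for _ in range(size):
        count += 1
    return count
```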
@@ -946,7 +946,7 @@ Constant order means the number of operations is independent of the input data s
[file]{time_complexity}-[class]{}-[func]{constant}
```
-### Linear Order $O(n)$
+### Linear Order $O(n)$ {data-toc-label="Linear Order"}
Linear order indicates the number of operations grows linearly with the input data size $n$. Linear order commonly appears in single-loop structures:
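For instance, a single-loop sketch of linear order (illustrative; the name `linear` is assumed):

```python
def linear(n: int) -> int:
    """Linear-order time: one loop that runs n times."""
    count = 0
    for _ in range(n):
        count += 1
    return count
```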
@@ -962,7 +962,7 @@ Operations like array traversal and linked list traversal have a time complexity
It's important to note that **the input data size $n$ should be determined based on the type of input data**. For example, in the first example, the variable $n$ is the input data size, while in the second example, the array length $n$ is the data size.
-### Quadratic Order $O(n^2)$
+### Quadratic Order $O(n^2)$ {data-toc-label="Quadratic Order"}
Quadratic order means the number of operations grows quadratically with the input data size $n$. Quadratic order typically appears in nested loops, where both the outer and inner loops have a time complexity of $O(n)$, resulting in an overall complexity of $O(n^2)$:
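A nested-loop sketch of quadratic order; bubble sort (referenced by the `[func]{bubble_sort}` placeholder below) follows the same pattern, though this body is only an illustration:

```python
def quadratic(n: int) -> int:
    """Quadratic-order time: two nested loops, each O(n), give O(n^2) operations."""
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count
```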
@@ -980,7 +980,7 @@ For instance, in bubble sort, the outer loop runs $n - 1$ times, and the inner l
[file]{time_complexity}-[class]{}-[func]{bubble_sort}
```
-### Exponential Order $O(2^n)$
+### Exponential Order $O(2^n)$ {data-toc-label="Exponential Order"}
Biological "cell division" is a classic example of exponential order growth: starting with one cell, it becomes two after one division, four after two divisions, and so on, resulting in $2^n$ cells after $n$ divisions.
@@ -1000,7 +1000,7 @@ In practice, exponential order often appears in recursive functions. For example
Exponential order growth is extremely rapid and is commonly seen in exhaustive search methods (brute force, backtracking, etc.). For large-scale problems, exponential order is unacceptable, often requiring dynamic programming or greedy algorithms as solutions.
-### Logarithmic Order $O(\log n)$
+### Logarithmic Order $O(\log n)$ {data-toc-label="Logarithmic Order"}
In contrast to exponential order, logarithmic order reflects situations where "the size is halved each round." Given an input data size $n$, since the size is halved each round, the number of iterations is $\log_2 n$, the inverse function of $2^n$.
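A halving-loop sketch of logarithmic order (illustrative; the name `logarithmic` is assumed):

```python
def logarithmic(n: int) -> int:
    """Logarithmic-order time: n is halved each round, so the loop runs ~log2(n) times."""
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count
```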
@@ -1030,7 +1030,7 @@ Logarithmic order is typical in algorithms based on the divide-and-conquer strat
This means the base $m$ can be changed without affecting the complexity. Therefore, we often omit the base $m$ and simply denote logarithmic order as $O(\log n)$.
-### Linear-Logarithmic Order $O(n \log n)$
+### Linear-Logarithmic Order $O(n \log n)$ {data-toc-label="Linear-Logarithmic Order"}
Linear-logarithmic order often appears in nested loops, with the complexities of the two loops being $O(\log n)$ and $O(n)$ respectively. The related code is as follows:
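A sketch combining the two: the problem is halved at each of roughly $\log n$ levels, and each level does $O(n)$ work (the name `linear_log_recur` is an assumption):

```python
def linear_log_recur(n: int) -> int:
    """O(n log n): ~log n levels of recursion, O(n) work per level."""
    if n <= 1:
        return 1
    # two half-sized subproblems -> ~log n levels in total
    count = linear_log_recur(n // 2) + linear_log_recur(n // 2)
    # linear pass at the current level
    for _ in range(n):
        count += 1
    return count
```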
@@ -1044,7 +1044,7 @@ The image below demonstrates how linear-logarithmic order is generated. Each lev
Mainstream sorting algorithms typically have a time complexity of $O(n \log n)$, such as quicksort, mergesort, and heapsort.
-### Factorial Order $O(n!)$
+### Factorial Order $O(n!)$ {data-toc-label="Factorial Order"}
Factorial order corresponds to the mathematical problem of "full permutation." Given $n$ distinct elements, the total number of possible permutations is:
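That count works out to $n! = n \times (n - 1) \times \dots \times 2 \times 1$. A recursive sketch of factorial order: the first level branches $n$ ways, the next $n - 1$ ways, and so on, giving $n!$ leaves (illustrative; `factorial_recur` is an assumed name):

```python
def factorial_recur(n: int) -> int:
    """O(n!): the call tree branches n, n-1, ..., 1 ways per level, so it has n! leaves."""
    if n == 0:
        return 1
    count = 0
    for _ in range(n):
        count += factorial_recur(n - 1)
    return count
```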

@@ -716,7 +716,7 @@ $$
![常见的空间复杂度类型](space_complexity.assets/space_complexity_common_types.png)
-### 常数阶 $O(1)$
+### 常数阶 $O(1)$ {data-toc-label="常数阶"}
常数阶常见于数量与输入数据大小 $n$ 无关的常量、变量、对象。
@@ -726,7 +726,7 @@ $$
[file]{space_complexity}-[class]{}-[func]{constant}
```
-### 线性阶 $O(n)$
+### 线性阶 $O(n)$ {data-toc-label="线性阶"}
线性阶常见于元素数量与 $n$ 成正比的数组、链表、栈、队列等:
@@ -742,7 +742,7 @@ $$
![递归函数产生的线性阶空间复杂度](space_complexity.assets/space_complexity_recursive_linear.png)
-### 平方阶 $O(n^2)$
+### 平方阶 $O(n^2)$ {data-toc-label="平方阶"}
平方阶常见于矩阵和图,元素数量与 $n$ 成平方关系:
@@ -758,7 +758,7 @@ $$
![递归函数产生的平方阶空间复杂度](space_complexity.assets/space_complexity_recursive_quadratic.png)
-### 指数阶 $O(2^n)$
+### 指数阶 $O(2^n)$ {data-toc-label="指数阶"}
指数阶常见于二叉树。观察下图,层数为 $n$ 的“满二叉树”的节点数量为 $2^n - 1$ ,占用 $O(2^n)$ 空间:
@@ -768,7 +768,7 @@ $$
![满二叉树产生的指数阶空间复杂度](space_complexity.assets/space_complexity_exponential.png)
-### 对数阶 $O(\log n)$
+### 对数阶 $O(\log n)$ {data-toc-label="对数阶"}
对数阶常见于分治算法。例如归并排序,输入长度为 $n$ 的数组,每轮递归将数组从中点处划分为两半,形成高度为 $\log n$ 的递归树,使用 $O(\log n)$ 栈帧空间。

@@ -940,7 +940,7 @@ $$
![常见的时间复杂度类型](time_complexity.assets/time_complexity_common_types.png)
-### 常数阶 $O(1)$
+### 常数阶 $O(1)$ {data-toc-label="常数阶"}
常数阶的操作数量与输入数据大小 $n$ 无关,即不随着 $n$ 的变化而变化。
@@ -950,7 +950,7 @@ $$
[file]{time_complexity}-[class]{}-[func]{constant}
```
-### 线性阶 $O(n)$
+### 线性阶 $O(n)$ {data-toc-label="线性阶"}
线性阶的操作数量相对于输入数据大小 $n$ 以线性级别增长。线性阶通常出现在单层循环中:
@@ -966,7 +966,7 @@ $$
值得注意的是,**输入数据大小 $n$ 需根据输入数据的类型来具体确定**。比如在第一个示例中,变量 $n$ 为输入数据大小;在第二个示例中,数组长度 $n$ 为数据大小。
-### 平方阶 $O(n^2)$
+### 平方阶 $O(n^2)$ {data-toc-label="平方阶"}
平方阶的操作数量相对于输入数据大小 $n$ 以平方级别增长。平方阶通常出现在嵌套循环中,外层循环和内层循环的时间复杂度都为 $O(n)$ ,因此总体的时间复杂度为 $O(n^2)$ :
@@ -984,7 +984,7 @@ $$
[file]{time_complexity}-[class]{}-[func]{bubble_sort}
```
-### 指数阶 $O(2^n)$
+### 指数阶 $O(2^n)$ {data-toc-label="指数阶"}
生物学的“细胞分裂”是指数阶增长的典型例子:初始状态为 $1$ 个细胞,分裂一轮后变为 $2$ 个,分裂两轮后变为 $4$ 个,以此类推,分裂 $n$ 轮后有 $2^n$ 个细胞。
@@ -1004,7 +1004,7 @@ $$
指数阶增长非常迅速,在穷举法(暴力搜索、回溯等)中比较常见。对于数据规模较大的问题,指数阶是不可接受的,通常需要使用动态规划或贪心算法等来解决。
-### 对数阶 $O(\log n)$
+### 对数阶 $O(\log n)$ {data-toc-label="对数阶"}
与指数阶相反,对数阶反映了“每轮缩减到一半”的情况。设输入数据大小为 $n$ ,由于每轮缩减到一半,因此循环次数是 $\log_2 n$ ,即 $2^n$ 的反函数。
@@ -1034,7 +1034,7 @@ $$
也就是说,底数 $m$ 可以在不影响复杂度的前提下转换。因此我们通常会省略底数 $m$ ,将对数阶直接记为 $O(\log n)$ 。
-### 线性对数阶 $O(n \log n)$
+### 线性对数阶 $O(n \log n)$ {data-toc-label="线性对数阶"}
线性对数阶常出现于嵌套循环中,两层循环的时间复杂度分别为 $O(\log n)$ 和 $O(n)$ 。相关代码如下:
@@ -1048,7 +1048,7 @@ $$
主流排序算法的时间复杂度通常为 $O(n \log n)$ ,例如快速排序、归并排序、堆排序等。
-### 阶乘阶 $O(n!)$
+### 阶乘阶 $O(n!)$ {data-toc-label="阶乘阶"}
阶乘阶对应数学上的“全排列”问题。给定 $n$ 个互不重复的元素,求其所有可能的排列方案,方案数量为:
