72. Linear Programming and Two-Person Games - 2
September 23, 2023 · 1,091 views
Alan Edelman and Julia
University Courses / Foreign Languages
https://ocw.mit.edu/18-065S18 MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018. Professor Strang describes the four topics of the course: Linear Algebra, Deep Learning, Optimization, and Statistics.
102 episodes
115,000 views
1. Course Introduction of 18.065 by Professor Strang · 07:03
2. The Column Space of A Contains All Vectors Ax - 1 · 17:27
3. The Column Space of A Contains All Vectors Ax - 2 · 17:28
4. The Column Space of A Contains All Vectors Ax - 3 · 17:23
5. Multiplying and Factoring Matrices - 1 · 16:11
6. Multiplying and Factoring Matrices - 2 · 16:14
7. Multiplying and Factoring Matrices - 3 · 16:03
8. Orthonormal Columns in Q Give Q'Q = I - 1 · 16:31
9. Orthonormal Columns in Q Give Q'Q = I - 2 · 16:38
10. Orthonormal Columns in Q Give Q'Q = I - 3 · 16:28
11. Eigenvalues and Eigenvectors - 1 · 16:21
12. Eigenvalues and Eigenvectors - 2 · 16:22
13. Eigenvalues and Eigenvectors - 3 · 16:21
14. Positive Definite and Semidefinite Matrices - 1 · 15:12
15. Positive Definite and Semidefinite Matrices - 2 · 15:19
16. Positive Definite and Semidefinite Matrices - 3 · 15:03
17. Singular Value Decomposition (SVD) - 1 · 17:54
18. Singular Value Decomposition (SVD) - 2 · 17:59
19. Singular Value Decomposition (SVD) - 3 · 17:51
20. Eckart-Young - The Closest Rank k Matrix to A - 1 · 15:48
21. Eckart-Young - The Closest Rank k Matrix to A - 2 · 15:49
22. Eckart-Young - The Closest Rank k Matrix to A - 3 · 15:46
23. Norms of Vectors and Matrices - 1 · 16:30
24. Norms of Vectors and Matrices - 2 · 16:30
25. Norms of Vectors and Matrices - 3 · 16:26
26. Four Ways to Solve Least Squares Problems - 1 · 16:40
27. Four Ways to Solve Least Squares Problems - 2 · 16:41
28. Four Ways to Solve Least Squares Problems - 3 · 16:32
29. Survey of Difficulties with Ax = b - 1 · 16:35
30. Survey of Difficulties with Ax = b - 2 · 16:39
31. Survey of Difficulties with Ax = b - 3 · 16:27
32. Minimizing ‖x‖ Subject to Ax = b - 1 · 16:50
33. Minimizing ‖x‖ Subject to Ax = b - 2 · 16:52
34. Minimizing ‖x‖ Subject to Ax = b - 3 · 16:46
35. Computing Eigenvalues and Singular Values - 1 · 16:32
36. Computing Eigenvalues and Singular Values - 2 · 16:38
37. Computing Eigenvalues and Singular Values - 3 · 16:29
38. Randomized Matrix Multiplication - 1 · 17:31
39. Randomized Matrix Multiplication - 2 · 17:36
40. Randomized Matrix Multiplication - 3 · 17:29
41. Low Rank Changes in A and Its Inverse - 1 · 16:54
42. Low Rank Changes in A and Its Inverse - 2 · 16:55
43. Low Rank Changes in A and Its Inverse - 3 · 16:49
44. Matrices A(t) Depending on t, Derivative = dA/dt - 1 · 17:00
45. Matrices A(t) Depending on t, Derivative = dA/dt - 2 · 17:01
46. Matrices A(t) Depending on t, Derivative = dA/dt - 3 · 16:54
47. Derivatives of Inverse and Singular Values - 1 · 14:25
48. Derivatives of Inverse and Singular Values - 2 · 14:32
49. Derivatives of Inverse and Singular Values - 3 · 14:25
50. Rapidly Decreasing Singular Values - 1 · 16:54
51. Rapidly Decreasing Singular Values - 2 · 16:56
52. Rapidly Decreasing Singular Values - 3 · 16:52
53. Counting Parameters in SVD, LU, QR, Saddle Points - 1 · 16:23
54. Counting Parameters in SVD, LU, QR, Saddle Points - 2 · 16:24
55. Counting Parameters in SVD, LU, QR, Saddle Points - 3 · 16:16
56. Saddle Points Continued, Maxmin Principle - 1 · 17:27
57. Saddle Points Continued, Maxmin Principle - 2 · 17:32
58. Saddle Points Continued, Maxmin Principle - 3 · 17:27
59. Definitions and Inequalities - 1 · 18:23
60. Definitions and Inequalities - 2 · 18:30
61. Definitions and Inequalities - 3 · 18:19
62. Minimizing a Function Step by Step - 1 · 17:57
63. Minimizing a Function Step by Step - 2 · 18:02
64. Minimizing a Function Step by Step - 3 · 17:50
65. Gradient Descent - Downhill to a Minimum - 1 · 17:37
66. Gradient Descent - Downhill to a Minimum - 2 · 17:39
67. Gradient Descent - Downhill to a Minimum - 3 · 17:36
68. Accelerating Gradient Descent (Use Momentum) - 1 · 16:23
69. Accelerating Gradient Descent (Use Momentum) - 2 · 16:23
70. Accelerating Gradient Descent (Use Momentum) - 3 · 16:23
71. Linear Programming and Two-Person Games - 1 · 17:54
72. Linear Programming and Two-Person Games - 2 · 18:00
73. Linear Programming and Two-Person Games - 3 · 17:52
74. Stochastic Gradient Descent - 1 · 17:43
75. Stochastic Gradient Descent - 2 · 17:49
76. Stochastic Gradient Descent - 3 · 17:37
77. Structure of Neural Nets for Deep Learning - 1 · 17:48
78. Structure of Neural Nets for Deep Learning - 2 · 17:54
79. Structure of Neural Nets for Deep Learning - 3 · 17:47
80. Backpropagation - Find Partial Derivatives - 1 · 17:35
81. Backpropagation - Find Partial Derivatives - 2 · 17:35
82. Backpropagation - Find Partial Derivatives - 3 · 17:36
83. Completing a Rank-One Matrix, Circulants! - 1 · 16:40
84. Completing a Rank-One Matrix, Circulants! - 2 · 16:44
85. Completing a Rank-One Matrix, Circulants! - 3 · 16:34
86. Eigenvectors of Circulant Matrices - Fourier Matrix - 1 · 17:35
87. Eigenvectors of Circulant Matrices - Fourier Matrix - 2 · 17:36
88. Eigenvectors of Circulant Matrices - Fourier Matrix - 3 · 17:28
89. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule - 1 · 15:49
90. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule - 2 · 15:50
91. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule - 3 · 15:43
92. Neural Nets and the Learning Function - 1 · 18:45
93. Neural Nets and the Learning Function - 2 · 18:48
94. Neural Nets and the Learning Function - 3 · 18:44
95. Distance Matrices, Procrustes Problem - 1 · 14:40
96. Distance Matrices, Procrustes Problem - 3 · 14:37
97. Finding Clusters in Graphs - 1 · 11:39
98. Finding Clusters in Graphs - 2 · 11:40
99. Finding Clusters in Graphs - 3 · 11:35
100. Alan Edelman and Julia Language - 1 · 12:46
101. Alan Edelman and Julia Language - 2 · 12:50
102. Alan Edelman and Julia Language - 3 · 12:45