66. Gradient Descent - Downhill to a Minimum - 2
September 23, 2023 · 577 views
Alan Edelman and Julia
University Courses / Foreign Language
MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018 (https://ocw.mit.edu/18-065S18). Professor Strang describes the four topics of the course: Linear Algebra, Deep Learning, Optimization, and Statistics.
102 episodes
115,000 views
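The featured lecture sits in the Optimization part of the course: gradient descent moves downhill along the negative gradient toward a minimum. As a rough illustration of that idea only, here is a minimal Python sketch of plain gradient descent on a quadratic f(x) = 1/2 x'Sx - b'x; the matrix S, the vector b, the step size, and the iteration count are illustrative assumptions, not values taken from the lecture.

import numpy as np

# Plain gradient descent on f(x) = 1/2 x'Sx - b'x, whose gradient is Sx - b.
# S, b, the step size, and the iteration count are assumptions for illustration.
def gradient_descent(S, b, x0, step=0.1, iters=100):
    x = x0.astype(float)
    for _ in range(iters):
        x = x - step * (S @ x - b)   # x_{k+1} = x_k - step * grad f(x_k)
    return x

S = np.array([[2.0, 0.0], [0.0, 1.0]])   # positive definite, so a unique minimum exists
b = np.array([1.0, 1.0])
x = gradient_descent(S, b, x0=np.zeros(2))
print(x)                       # approaches the exact minimizer
print(np.linalg.solve(S, b))   # exact minimizer x* = S^{-1} b = [0.5, 1.0]

With this S the largest eigenvalue is 2, so any constant step size below 2/2 = 1 keeps the iteration convergent; the choice 0.1 here is arbitrary.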
1. Course Introduction of 18.065 by Professor Strang (07:03)
2. The Column Space of A Contains All Vectors Ax - 1 (17:27)
3. The Column Space of A Contains All Vectors Ax - 2 (17:28)
4. The Column Space of A Contains All Vectors Ax - 3 (17:23)
5. Multiplying and Factoring Matrices - 1 (16:11)
6. Multiplying and Factoring Matrices - 2 (16:14)
7. Multiplying and Factoring Matrices - 3 (16:03)
8. Orthonormal Columns in Q Give Q'Q = I - 1 (16:31)
9. Orthonormal Columns in Q Give Q'Q = I - 2 (16:38)
10. Orthonormal Columns in Q Give Q'Q = I - 3 (16:28)
11. Eigenvalues and Eigenvectors - 1 (16:21)
12. Eigenvalues and Eigenvectors - 2 (16:22)
13. Eigenvalues and Eigenvectors - 3 (16:21)
14. Positive Definite and Semidefinite Matrices - 1 (15:12)
15. Positive Definite and Semidefinite Matrices - 2 (15:19)
16. Positive Definite and Semidefinite Matrices - 3 (15:03)
17. Singular Value Decomposition (SVD) - 1 (17:54)
18. Singular Value Decomposition (SVD) - 2 (17:59)
19. Singular Value Decomposition (SVD) - 3 (17:51)
20. Eckart-Young - The Closest Rank k Matrix to A - 1 (15:48)
21. Eckart-Young - The Closest Rank k Matrix to A - 2 (15:49)
22. Eckart-Young - The Closest Rank k Matrix to A - 3 (15:46)
23. Norms of Vectors and Matrices - 1 (16:30)
24. Norms of Vectors and Matrices - 2 (16:30)
25. Norms of Vectors and Matrices - 3 (16:26)
26. Four Ways to Solve Least Squares Problems - 1 (16:40)
27. Four Ways to Solve Least Squares Problems - 2 (16:41)
28. Four Ways to Solve Least Squares Problems - 3 (16:32)
29. Survey of Difficulties with Ax = b - 1 (16:35)
30. Survey of Difficulties with Ax = b - 2 (16:39)
31. Survey of Difficulties with Ax = b - 3 (16:27)
32. Minimizing ‖x‖ Subject to Ax = b - 1 (16:50)
33. Minimizing ‖x‖ Subject to Ax = b - 2 (16:52)
34. Minimizing ‖x‖ Subject to Ax = b - 3 (16:46)
35. Computing Eigenvalues and Singular Values - 1 (16:32)
36. Computing Eigenvalues and Singular Values - 2 (16:38)
37. Computing Eigenvalues and Singular Values - 3 (16:29)
38. Randomized Matrix Multiplication - 1 (17:31)
39. Randomized Matrix Multiplication - 2 (17:36)
40. Randomized Matrix Multiplication - 3 (17:29)
41. Low Rank Changes in A and Its Inverse - 1 (16:54)
42. Low Rank Changes in A and Its Inverse - 2 (16:55)
43. Low Rank Changes in A and Its Inverse - 3 (16:49)
44. Matrices A(t) Depending on t, Derivative = dA/dt - 1 (17:00)
45. Matrices A(t) Depending on t, Derivative = dA/dt - 2 (17:01)
46. Matrices A(t) Depending on t, Derivative = dA/dt - 3 (16:54)
47. Derivatives of Inverse and Singular Values - 1 (14:25)
48. Derivatives of Inverse and Singular Values - 2 (14:32)
49. Derivatives of Inverse and Singular Values - 3 (14:25)
50. Rapidly Decreasing Singular Values - 1 (16:54)
51. Rapidly Decreasing Singular Values - 2 (16:56)
52. Rapidly Decreasing Singular Values - 3 (16:52)
53. Counting Parameters in SVD, LU, QR, Saddle Points - 1 (16:23)
54. Counting Parameters in SVD, LU, QR, Saddle Points - 2 (16:24)
55. Counting Parameters in SVD, LU, QR, Saddle Points - 3 (16:16)
56. Saddle Points Continued, Maxmin Principle - 1 (17:27)
57. Saddle Points Continued, Maxmin Principle - 2 (17:32)
58. Saddle Points Continued, Maxmin Principle - 3 (17:27)
59. Definitions and Inequalities - 1 (18:23)
60. Definitions and Inequalities - 2 (18:30)
61. Definitions and Inequalities - 3 (18:19)
62. Minimizing a Function Step by Step - 1 (17:57)
63. Minimizing a Function Step by Step - 2 (18:02)
64. Minimizing a Function Step by Step - 3 (17:50)
65. Gradient Descent - Downhill to a Minimum - 1 (17:37)
66. Gradient Descent - Downhill to a Minimum - 2 (17:39)
67. Gradient Descent - Downhill to a Minimum - 3 (17:36)
68. Accelerating Gradient Descent (Use Momentum) - 1 (16:23)
69. Accelerating Gradient Descent (Use Momentum) - 2 (16:23)
70. Accelerating Gradient Descent (Use Momentum) - 3 (16:23)
71. Linear Programming and Two-Person Games - 1 (17:54)
72. Linear Programming and Two-Person Games - 2 (18:00)
73. Linear Programming and Two-Person Games - 3 (17:52)
74. Stochastic Gradient Descent - 1 (17:43)
75. Stochastic Gradient Descent - 2 (17:49)
76. Stochastic Gradient Descent - 3 (17:37)
77. Structure of Neural Nets for Deep Learning - 1 (17:48)
78. Structure of Neural Nets for Deep Learning - 2 (17:54)
79. Structure of Neural Nets for Deep Learning - 3 (17:47)
80. Backpropagation - Find Partial Derivatives - 1 (17:35)
81. Backpropagation - Find Partial Derivatives - 2 (17:35)
82. Backpropagation - Find Partial Derivatives - 3 (17:36)
83. Completing a Rank-One Matrix, Circulants! - 1 (16:40)
84. Completing a Rank-One Matrix, Circulants! - 2 (16:44)
85. Completing a Rank-One Matrix, Circulants! - 3 (16:34)
86. Eigenvectors of Circulant Matrices - Fourier Matrix - 1 (17:35)
87. Eigenvectors of Circulant Matrices - Fourier Matrix - 2 (17:36)
88. Eigenvectors of Circulant Matrices - Fourier Matrix - 3 (17:28)
89. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule - 1 (15:49)
90. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule - 2 (15:50)
91. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule - 3 (15:43)
92. Neural Nets and the Learning Function - 1 (18:45)
93. Neural Nets and the Learning Function - 2 (18:48)
94. Neural Nets and the Learning Function - 3 (18:44)
95. Distance Matrices, Procrustes Problem - 1 (14:40)
96. Distance Matrices, Procrustes Problem - 3 (14:37)
97. Finding Clusters in Graphs - 1 (11:39)
98. Finding Clusters in Graphs - 2 (11:40)
99. Finding Clusters in Graphs - 3 (11:35)
100. Alan Edelman and Julia Language - 1 (12:46)
101. Alan Edelman and Julia Language - 2 (12:50)
102. Alan Edelman and Julia Language - 3 (12:45)