38. Randomized Matrix Multiplication - 1
1 year ago · 821 views
Alan Edelman and Julia
University Courses / Foreign Language
MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018 (https://ocw.mit.edu/18-065S18). Professor Strang describes the four topics of the course: Linear Algebra, Deep Learning, Optimization, and Statistics.
102 episodes
108,000 views
1. Course Introduction of 18.065 by Professor Strang (07:03)
2. The Column Space of A Contains All Vectors Ax - 1 (17:27)
3. The Column Space of A Contains All Vectors Ax - 2 (17:28)
4. The Column Space of A Contains All Vectors Ax - 3 (17:23)
5. Multiplying and Factoring Matrices - 1 (16:11)
6. Multiplying and Factoring Matrices - 2 (16:14)
7. Multiplying and Factoring Matrices - 3 (16:03)
8. Orthonormal Columns in Q Give Q'Q = I - 1 (16:31)
9. Orthonormal Columns in Q Give Q'Q = I - 2 (16:38)
10. Orthonormal Columns in Q Give Q'Q = I - 3 (16:28)
11. Eigenvalues and Eigenvectors - 1 (16:21)
12. Eigenvalues and Eigenvectors - 2 (16:22)
13. Eigenvalues and Eigenvectors - 3 (16:21)
14. Positive Definite and Semidefinite Matrices - 1 (15:12)
15. Positive Definite and Semidefinite Matrices - 2 (15:19)
16. Positive Definite and Semidefinite Matrices - 3 (15:03)
17. Singular Value Decomposition (SVD) - 1 (17:54)
18. Singular Value Decomposition (SVD) - 2 (17:59)
19. Singular Value Decomposition (SVD) - 3 (17:51)
20. Eckart-Young - The Closest Rank k Matrix to A - 1 (15:48)
21. Eckart-Young - The Closest Rank k Matrix to A - 2 (15:49)
22. Eckart-Young - The Closest Rank k Matrix to A - 3 (15:46)
23. Norms of Vectors and Matrices - 1 (16:30)
24. Norms of Vectors and Matrices - 2 (16:30)
25. Norms of Vectors and Matrices - 3 (16:26)
26. Four Ways to Solve Least Squares Problems - 1 (16:40)
27. Four Ways to Solve Least Squares Problems - 2 (16:41)
28. Four Ways to Solve Least Squares Problems - 3 (16:32)
29. Survey of Difficulties with Ax = b - 1 (16:35)
30. Survey of Difficulties with Ax = b - 2 (16:39)
31. Survey of Difficulties with Ax = b - 3 (16:27)
32. Minimizing ‖x‖ Subject to Ax = b - 1 (16:50)
33. Minimizing ‖x‖ Subject to Ax = b - 2 (16:52)
34. Minimizing ‖x‖ Subject to Ax = b - 3 (16:46)
35. Computing Eigenvalues and Singular Values - 1 (16:32)
36. Computing Eigenvalues and Singular Values - 2 (16:38)
37. Computing Eigenvalues and Singular Values - 3 (16:29)
38. Randomized Matrix Multiplication - 1 (17:31)
39. Randomized Matrix Multiplication - 2 (17:36)
40. Randomized Matrix Multiplication - 3 (17:29)
41. Low Rank Changes in A and Its Inverse - 1 (16:54)
42. Low Rank Changes in A and Its Inverse - 2 (16:55)
43. Low Rank Changes in A and Its Inverse - 3 (16:49)
44. Matrices A(t) Depending on t, Derivative = dA/dt - 1 (17:00)
45. Matrices A(t) Depending on t, Derivative = dA/dt - 2 (17:01)
46. Matrices A(t) Depending on t, Derivative = dA/dt - 3 (16:54)
47. Derivatives of Inverse and Singular Values - 1 (14:25)
48. Derivatives of Inverse and Singular Values - 2 (14:32)
49. Derivatives of Inverse and Singular Values - 3 (14:25)
50. Rapidly Decreasing Singular Values - 1 (16:54)
51. Rapidly Decreasing Singular Values - 2 (16:56)
52. Rapidly Decreasing Singular Values - 3 (16:52)
53. Counting Parameters in SVD, LU, QR, Saddle Points - 1 (16:23)
54. Counting Parameters in SVD, LU, QR, Saddle Points - 2 (16:24)
55. Counting Parameters in SVD, LU, QR, Saddle Points - 3 (16:16)
56. Saddle Points Continued, Maxmin Principle - 1 (17:27)
57. Saddle Points Continued, Maxmin Principle - 2 (17:32)
58. Saddle Points Continued, Maxmin Principle - 3 (17:27)
59. Definitions and Inequalities - 1 (18:23)
60. Definitions and Inequalities - 2 (18:30)
61. Definitions and Inequalities - 3 (18:19)
62. Minimizing a Function Step by Step - 1 (17:57)
63. Minimizing a Function Step by Step - 2 (18:02)
64. Minimizing a Function Step by Step - 3 (17:50)
65. Gradient Descent - Downhill to a Minimum - 1 (17:37)
66. Gradient Descent - Downhill to a Minimum - 2 (17:39)
67. Gradient Descent - Downhill to a Minimum - 3 (17:36)
68. Accelerating Gradient Descent (Use Momentum) - 1 (16:23)
69. Accelerating Gradient Descent (Use Momentum) - 2 (16:23)
70. Accelerating Gradient Descent (Use Momentum) - 3 (16:23)
71. Linear Programming and Two-Person Games - 1 (17:54)
72. Linear Programming and Two-Person Games - 2 (18:00)
73. Linear Programming and Two-Person Games - 3 (17:52)
74. Stochastic Gradient Descent - 1 (17:43)
75. Stochastic Gradient Descent - 2 (17:49)
76. Stochastic Gradient Descent - 3 (17:37)
77. Structure of Neural Nets for Deep Learning - 1 (17:48)
78. Structure of Neural Nets for Deep Learning - 2 (17:54)
79. Structure of Neural Nets for Deep Learning - 3 (17:47)
80. Backpropagation - Find Partial Derivatives - 1 (17:35)
81. Backpropagation - Find Partial Derivatives - 2 (17:35)
82. Backpropagation - Find Partial Derivatives - 3 (17:36)
83. Completing a Rank-One Matrix, Circulants! - 1 (16:40)
84. Completing a Rank-One Matrix, Circulants! - 2 (16:44)
85. Completing a Rank-One Matrix, Circulants! - 3 (16:34)
86. Eigenvectors of Circulant Matrices - Fourier Matrix - 1 (17:35)
87. Eigenvectors of Circulant Matrices - Fourier Matrix - 2 (17:36)
88. Eigenvectors of Circulant Matrices - Fourier Matrix - 3 (17:28)
89. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule - 1 (15:49)
90. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule - 2 (15:50)
91. ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule - 3 (15:43)
92. Neural Nets and the Learning Function - 1 (18:45)
93. Neural Nets and the Learning Function - 2 (18:48)
94. Neural Nets and the Learning Function - 3 (18:44)
95. Distance Matrices, Procrustes Problem - 1 (14:40)
96. Distance Matrices, Procrustes Problem - 3 (14:37)
97. Finding Clusters in Graphs - 1 (11:39)
98. Finding Clusters in Graphs - 2 (11:40)
99. Finding Clusters in Graphs - 3 (11:35)
100. Alan Edelman and Julia Language - 1 (12:46)
101. Alan Edelman and Julia Language - 2 (12:50)
102. Alan Edelman and Julia Language - 3 (12:45)