The Biomedical Engineering department next door runs a PhD program in Imaging Science. Its incoming students come from a mix of undergraduate majors, including medicine, biology, physics, math, and computer science, so their math preparation varies widely.
To close that gap and get the research workhorses producing sooner, the department added a math crash course on top of the regular curriculum.
The plan gives newcomers two months to speed through, and everyone else to review, linear algebra and calculus: five weeks for linear algebra and four for calculus.
The material is divided into modules. By skipping subsets of them, the course defines six study plans across two difficulty levels (beginner/advanced) and three intensities (5, 10, or 15 hours per week).
The study plans are the most valuable part. Many "folk scientists" either fall into the trap of thinking without studying, or chase legends of geniuses who never had to study. They are not necessarily lazy; they may simply lack a workable method.
An all-nighter sounds impressive, but a steady hour every workday is far more valuable, and far harder, even though the latter does not even touch weekends or holidays.
Modules
| Module | Beginner | Advanced |
|---|---|---|
| M1 | Systems of linear equations | Systems of linear equations |
| M2 | Row reduction, echelon forms | Linear dependence, linear transformations |
| M3 | Vector equations | Matrix operations, transpose, inverse |
| M4 | The matrix equation Ax=b | Vector spaces and subspaces |
| M5 | Solution sets of linear systems | Linear independence, rank |
| M6 | Linear independence | Eigenvectors, eigenvalues |
| M7 | Linear transformations (introduction) | Diagonalization |
| M8 | The matrix of a linear transformation | Eigenvectors and linear transformations |
| M9 | Matrix operations | Diagonalization of symmetric matrices |
| M10 | The inverse of a matrix | Singular value decomposition (SVD) |
| M11 | Characterizations of invertible matrices | Principal component analysis (PCA) |
| M12 | Determinants | Coding the PCA algorithm |
| M13 | Vector spaces and subspaces | Using built-in PCA functions |
| M14 | Null space, column space, linear transformations | Visualization of singular values and principal components |
| M15 | Linearly independent sets, bases | PCA journal discussion |
| M16 | Eigenvectors and eigenvalues 1 | Orthogonality |
| M17 | Eigenvectors and eigenvalues 2 | Orthogonal projections, inner product spaces |
| M18 | The characteristic equation | Quadratic forms, constrained optimization |
| M19 | Diagonalization 1 | Subspaces, hyperplanes |
| M20 | Diagonalization 2 | Linear separability, support vector machines (SVM) |
| M21 | Eigenvectors and linear transformations 1 | Coding the SVM algorithm |
| M22 | Eigenvectors and linear transformations 2 | Using built-in SVM functions |
| M23 | Inner product, length, orthogonality | SVM feature selection |
| M24 | Orthogonal sets | Expanding SVM to additional datasets |
| M25 | MATLAB practice | SVM journal discussion |
Study Plans
| Difficulty | Intensity | Week 1 | Week 2 | Week 3 | Week 4 | Week 5 |
|---|---|---|---|---|---|---|
| Beginner | 5 h/week | M1, M2 | M4, M6 | M7, M9 | M10, M13 | M14, M15 |
| | 10 h/week | M1, M2, M3 | M4, M5, M6 | M7, M8, M9 | M10, M13, M14 | M15, M16, M17 |
| | 15 h/week | M1, M2, M3, M4 | M5, M6, M7, M8 | M9, M10, M11, M12 | M13, M14, M15, M16 | M17, M21, M22 |
| | Can skip | M17, M19, M20, M23, M24, M25 | | | | |
| Advanced | 5 h/week | M2, M3 | M4, M5 | M6, M7 | M10, M11 | M16, M17 |
| | 10 h/week | M1, M2, M3 | M4, M5, M6 | M7, M8, M9 | M10, M11, M16 | M17, M19, M20 |
| | 15 h/week | M1, M2, M3, M4 | M5, M6, M7, M8 | M9, M10, M11, M12 | M13, M14, M15, M16 | M17, M19, M20 |
| Applied | PCA & SVM | M6, M8, M10 | M11, M12, M13, M15 | M16, M17, M19 | M20, M21, M22 | M23, M24, M25 |
Notes
The textbook used by the course:
Lay, D. C., Lay, S. R., & McDonald, J. J. (2016). Linear Algebra and Its Applications (5th ed.). Pearson.
The organizers also recommend this video course series on YouTube:
https://youtube.com/playlist?list=PLl-gb0E4MII03hiCrZa7YqxUMEeEPmZqK&si=TFnQCipewVN5WCdM
Below are the notes I took.
Beginner Level
- Solving Linear Equations
- [LA-1] M1: Systems of Linear Equations
- [LA-1] M2: Row Reduction and Echelon Forms
- [LA-1] M3: Vector Equations
- [LA-1] M4: The Matrix Equation Ax = b
- [LA-1] M5: Solution Sets of Linear Systems
- Transforms and Inversion
- [LA-1] M6: Linear Independence
- [LA-1] M7: Introduction to Linear Transformations
- [LA-1] M8: The Matrix of a Linear Transformation
- [LA-1] M9: Matrix Operations
- [LA-1] M10: The Inverse of a Matrix
- Inverse, Vector Spaces, Linear Independence
- [LA-1] M11: Characterizations of Invertible Matrices
- [LA-1] M12: Introduction to Determinants
- [LA-1] M13: Vector Spaces and Subspaces
- [LA-1] M14: Null Spaces, Column Spaces, and Linear Transformations
- [LA-1] M15: Linearly Independent Sets; Bases
- Fundamentals of Eigenvectors, Eigenvalues, and Diagonalization
- [LA-1] M16: Eigenvectors and Eigenvalues 1
- [LA-1] M17: Eigenvectors and Eigenvalues 2
- [LA-1] M18: The Characteristic Equation
- [LA-1] M19: Diagonalization 1
- [LA-1] M20: Diagonalization 2
- Eigenvectors, Orthogonality and MATLAB Practice
- [LA-1] M21: Eigenvectors and Linear Transformations 1
- [LA-1] M22: Eigenvectors and Linear Transformations 2
- [LA-1] M23: Inner Product, Length, and Orthogonality
- [LA-1] M24: Orthogonal Sets
- [LA-1] M25: MATLAB Practice
Advanced Level
Solving a nonhomogeneous linear system Ax = b:
- Write the system as an augmented matrix [A|b]; b is the column of constants from the right-hand side of the equations, with signs kept as-is (do not negate them).
- Row-reduce the augmented matrix to (reduced) row echelon form.
- Read off x from the last nonzero row upward, back-substituting as you go.
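A minimal sketch of this recipe in Python with SymPy; the 3×3 system is a toy example of my own, not from the course:

```python
# Solve Ax = b by row-reducing the augmented matrix [A|b].
import sympy as sp

A = sp.Matrix([[1, 2, 1],
               [2, 5, 0],
               [3, 1, 4]])
b = sp.Matrix([4, 3, 11])

aug = A.row_join(b)          # form the augmented matrix [A|b]
rref, pivots = aug.rref()    # reduce to reduced row echelon form
print(rref)                  # the last column now holds the solution x
print(A.solve(b))            # cross-check with a direct solver
```

When the solution is unique, the last column of the RREF is x; a row of the form [0 0 0 | c] with c ≠ 0 would instead signal an inconsistent system.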
Fundamentals and Key Concepts
- [LA-2] Key Terms and Definitions before you start
- Rectangular Matrix & Echelon Form & Pivot: Read p. 13-14 and go over the definitions of the terms.
- [LA-2] M1: Systems of Linear Equations
- TOPICS:
- Existence of Solutions
- Computation of Ax
- Properties of the Matrix
- Homogeneous Linear Systems
- Parametric Vector Form
- Solutions of Nonhomogeneous Systems
- EXERCISES:
- The Matrix Equation: Read Section 1.4 (p. 35-40) and do problems 5, 13 & 24 (p. 40-41).
- Solution Sets Of Linear Systems: Read Section 1.5 (p. 43-47) and do problems 5, 23 & 24 (p. 48).
- [LA-2] M2: Linear Dependence and Linear Transformations
- TOPICS:
- Linear Independence of Matrix Columns
- Sets of One or Two Vectors
- Matrix Transformations
- Linear Transformations
- Geometric Linear Transformations of R2
- Existence and Uniqueness Questions
- EXERCISES:
- Linear Dependence: Read Section 1.7 (p. 56-61) and do problems 22, 34 & 36 (p. 62).
- Linear Transformations: Read Section 1.8 (p. 65-69) and do problems 18 & 28 (p. 70).
- The Matrix of Linear Transformations: Read Section 1.9 (p. 71-78) and do problem 24 (p. 79-80).
- [LA-2] M3: Matrix Operations, the Transpose and Inverse of a Matrix, and Determinants
- TOPICS:
- Matrix Operations
- The Transpose of a Matrix
- Elementary Matrices
- An Algorithm for Finding the Inverse
- Another View of Matrix Inversion
- Determinants
- EXERCISES:
- Matrix Operations: Read Section 2.1 (p. 94-102) and do problems 11 & 16 (p. 102-103).
- The Inverse of a Matrix: Read Section 2.3 (p. 114-116) and do problems 12 & 27 (p. 117-118).
- Determinants: Read Section 3.1 (p. 166-169) and do problems 37 & 39. (p. 170-171).
- [LA-2] M4: Vector Spaces and Subspaces
- TOPICS:
- Subspaces
- A Subspace Spanned by a Set
- The Null Space of a Matrix
- An Explicit Description of Nul A
- The Column Space of a Matrix
- The Contrast Between Nul A and Col A
- Kernel and Range of a Linear Transformation
- EXERCISES:
- Vector Spaces: Read Section 4.1 (p. 192-197) and do problems 1, 24 & 31 (p. 199-200).
- Null Spaces, Column Spaces, And Linear Transformations: Read Section 4.2 (p. 200-207) and do problems 4, 25 & 38 (p. 208-209).
- [LA-2] M5: Linear Independence and Rank
- TOPICS:
- The Spanning Set Theorem
- Bases for Nul A and Col A
- Two Views of a Basis
- The Row Space
- The Rank Theorem
- Rank and the Invertible Matrix Theorem
- EXERCISES:
- Linearly Independent Sets; Bases: Read Section 4.3 (p. 210-215) and do problems 7, 8, 22 & 31 (p. 215-216).
- Rank: Read Section 4.6 (p. 232-238) and do problems 12 & 18 (p. 238-240).
Fundamentals for Principal Component Analysis
- [LA-2] M6: Eigenvectors and Eigenvalues
- TOPICS: Eigenvectors and Difference Equations
- EXERCISES: Read Section 5.1 (p. 268-273) and do problems 8, 12, 20, 22, 27 & 35 (p. 273-275).
- COMPLEMENTARY MATERIALS: YouTube Video: Eigenvectors and eigenvalues (3Blue1Brown)
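A quick numeric check of the section's defining equation Av = λv; the matrix is a small example of my own:

```python
# Verify Av = lambda*v for each eigenpair returned by NumPy.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals, vecs = np.linalg.eig(A)            # eigenvectors are the columns of vecs
for lam, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, lam * v))   # True for each pair
```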
- [LA-2] M7: Diagonalization
- TOPICS:
- Diagonalizing Matrices
- Matrices Whose Eigenvalues Are Not Distinct
- EXERCISES: Read Section 5.3 (p. 283-288) and do problems 8, 10, 18, 21, 23, 24, 27, 28 & 31 (p. 288-289).
- COMPLEMENTARY MATERIALS: YouTube Video: Visualizing Diagonalization (QualityMathVisuals)
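A numeric companion to Section 5.3: when A has n linearly independent eigenvectors, A = PDP⁻¹, and matrix powers become cheap. The matrix is a small example of my own:

```python
# Diagonalization check: A = P D P^-1, and A^k = P D^k P^-1.
import numpy as np

A = np.array([[7.0, 2.0],
              [-4.0, 1.0]])                    # eigenvalues 5 and 3
vals, P = np.linalg.eig(A)                     # columns of P are eigenvectors
D = np.diag(vals)
P_inv = np.linalg.inv(P)
print(np.allclose(A, P @ D @ P_inv))           # True: A = P D P^-1
print(np.allclose(np.linalg.matrix_power(A, 5),
                  P @ D**5 @ P_inv))           # True: powers via D^k
```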
- [LA-2] M8: Eigenvectors and Linear Transformations
- TOPICS:
- The Matrix of a Linear Transformation
- Linear Transformations from V into V
- Similarity of Matrix Representations
- EXERCISES: Read Section 5.4 (p. 290-295) and do problems 6, 7, 8, 17, 18, 19, 20, 25 & 28 (p. 295-296)
- COMPLEMENTARY MATERIALS: YouTube Video: Linear transformations and matrices (3Blue1Brown)
- [LA-2] M9: Diagonalization of symmetric matrices
- TOPICS:
- The Spectral Theorem
- Spectral Decomposition
- EXERCISES: Read Section 7.1 (p. 397-401) and do problems 4, 8, 16, 26, 28 & 37 (p. 401-402).
- COMPLEMENTARY MATERIALS: YouTube Video: Can you diagonalize every matrix? (QualityMathVisuals)
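The numeric version of the section's main result: a symmetric matrix is orthogonally diagonalizable, A = PDPᵀ with P orthogonal. NumPy's eigh is built for the symmetric case; the matrix is made up:

```python
# Orthogonal diagonalization of a symmetric matrix with np.linalg.eigh.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                       # symmetric
vals, P = np.linalg.eigh(A)                      # orthonormal eigenvectors
D = np.diag(vals)
print(np.allclose(A, P @ D @ P.T))               # True: A = P D P^T
print(np.allclose(P.T @ P, np.eye(2)))           # True: P is orthogonal
```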
- [LA-2] M10: The Singular Value Decomposition
- TOPICS:
- Singular vectors and singular values
- Singular Value Decomposition (SVD)
- EXERCISES: Read Section 7.4 (p. 416-425) and do problems 3, 13, 18, 19, 22 & 29 (p. 425-426).
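A quick look at the decomposition itself with NumPy; the 2×3 matrix is a small example of my own:

```python
# SVD: A = U Sigma V^T, with orthonormal U, V and descending singular values.
import numpy as np

A = np.array([[4.0, 11.0, 14.0],
              [8.0, 7.0, -2.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s)                                    # singular values, largest first
print(np.allclose(A, U @ np.diag(s) @ Vt))  # True: A is reconstructed
```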
Principal Component Analysis
- [LA-2] M11: Principal Component Analysis [Applied]
- TOPICS: Principal Component Analysis
- EXERCISES:
- Read Section 7.5 (p. 426-431) and do problems 1-4, 7 & 9 (p. 432-433).
- Read the Nature Methods article: https://doi.org/10.1038/nmeth.4346
- [LA-2] M12: Coding the PCA Algorithm [Applied]
- TOPICS:
- Step-by-Step Guide for Coding Principal Component Analysis
- Write Your Own PCA Function (a minimal sketch follows this module)
- Apply Your PCA Function on the Iris Dataset
- [OPTIONAL] Part 4: Apply Your PCA Function on Your Own Dataset
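Since the module's own guide is not reproduced here, this is my sketch of the usual steps: center the data, take the SVD, and project. The function shape and names are mine; scikit-learn is used only to load the iris data:

```python
# A minimal PCA via the SVD of the centered data matrix.
import numpy as np
from sklearn.datasets import load_iris

def pca(X, n_components):
    """Project X (samples x features) onto its top principal components."""
    Xc = X - X.mean(axis=0)                    # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]             # principal directions (rows)
    scores = Xc @ components.T                 # coordinates in the new basis
    explained_var = S[:n_components] ** 2 / (len(X) - 1)
    return scores, components, explained_var

scores, components, var = pca(load_iris().data, n_components=2)
print(scores.shape)    # (150, 2)
print(var)             # variance captured by each component
```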
- [LA-2] M13: Using Built-In PCA Functions [Applied]
- TOPICS: Compare your own PCA function with the built-in PCA of Python's scikit-learn package
- COMPLEMENTARY: Utilizing PCA (StackOverflow)
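For the comparison this module asks for, here is the minimal built-in route. Component signs may differ from a hand-rolled SVD version; principal directions are only determined up to sign:

```python
# PCA with scikit-learn on the iris data.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

pca = PCA(n_components=2)
scores = pca.fit_transform(load_iris().data)
print(pca.explained_variance_ratio_)   # share of variance per component
print(scores.shape)                    # (150, 2)
```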
- [LA-2] M14: Visualization of Singular Values and Principal Components [Applied]
- TOPICS:
- Compress a single grayscale image using PCA
- How many principal components are needed to compress the image efficiently?
- Singular Vectors and Principal Components
- [OPTIONAL] Part 4: Image Compression Using PCA of an image data set
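A sketch of the compression experiment on a synthetic stand-in image: keep the top k singular values and track how much of the energy (squared Frobenius norm) survives, plus the relative reconstruction error:

```python
# Rank-k image approximation via truncated SVD.
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 64))                  # stand-in for a grayscale image
U, S, Vt = np.linalg.svd(img, full_matrices=False)

for k in (1, 5, 20):
    approx = U[:, :k] @ np.diag(S[:k]) @ Vt[:k]
    energy = (S[:k] ** 2).sum() / (S ** 2).sum()
    err = np.linalg.norm(img - approx) / np.linalg.norm(img)
    print(f"rank {k:2d}: energy kept {energy:.3f}, relative error {err:.3f}")
```

A real photo has far more structure than random noise, so far fewer components suffice there than this synthetic example suggests.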
- [LA-2] M15: PCA Journal Discussion [Applied]
- PAPERS:
- S. Baronti, A. Casini, F. Lotti, and S. Porcinai, "Multispectral imaging system for the mapping of pigments in works of art by use of principal-component analysis," Appl. Opt. 37, 1299-1309 (1998)
- Yeung KY, Ruzzo WL. Principal component analysis for clustering gene expression data. Bioinformatics. 2001 Sep;17(9):763-74. doi: 10.1093/bioinformatics/17.9.763. PMID: 11590094
- Choose your own paper related to PCA!
- QUESTIONS:
- What was the application of PCA in this journal article?
- How was PCA applied to this problem? Did the authors expand upon the PCA methods we discussed last week?
- What are some other applications you can think of for PCA after reading this paper? How might you use PCA in your research, or related to your research interests?
Fundamentals for SVM
- [LA-2] M16: Orthogonality
- TOPICS: Inner Product, Length, and Orthogonality; Orthogonal Sets
- EXERCISES:
- Inner Product, Length, and Orthogonality: Read Section 6.1 (p. 332-338) and do problems 19, 20, 24 & 29 (p. 338-339).
- Orthogonal Sets: Read Section 6.2 (p. 340-346) and do problems 21, 24, 26, 27 & 29 (p. 346-348).
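A small numeric illustration of the two sections' core computations; the vectors are made up:

```python
# Dot products, lengths, and orthogonality (Sections 6.1-6.2).
import numpy as np

u = np.array([3.0, 1.0, 1.0])
v = np.array([-1.0, 2.0, 1.0])
print(u @ v)                    # 0.0: u and v are orthogonal
print(np.linalg.norm(u))        # length ||u|| = sqrt(11)
print(u / np.linalg.norm(u))    # unit vector in the direction of u
```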
- [LA-2] M17: Orthogonal Projections and Inner Product Spaces
- TOPICS:
- Orthogonal Projections
- Inner Product Spaces
- Applications of Inner Product Spaces (optional)
- EXERCISES:
- Orthogonal Projections: Read Section 6.3 (p. 349-353) and do problems 8, 16, 21 & 22 (p. 354-355).
- Inner Product Spaces: Read Section 6.7 (p. 378-385) and do problems 1, 14, 16, 18 & 25 (p. 384-385).
- Applications of Inner Product Spaces (Optional): Read Section 6.8 (p. 385-390) and do problems 5-8 (p. 391).
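A numeric illustration of the projection formula from Section 6.3, proj_u(y) = ((y·u)/(u·u)) u; the vectors are a small example of my own:

```python
# Orthogonal projection of y onto the line spanned by u.
import numpy as np

u = np.array([4.0, 2.0])
y = np.array([7.0, 6.0])
proj = (y @ u) / (u @ u) * u
print(proj)             # [8. 4.]: the component of y along u
print((y - proj) @ u)   # 0.0: the residual is orthogonal to u
```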
- [LA-2] M18: Quadratic Forms and Constrained Optimization
- TOPICS:
- Quadratic Forms
- Constrained Optimization
- EXERCISES:
- Quadratic Forms: Read Section 7.2 (p. 403-408) and do problems 2, 6, 16, 22 & 23 (p. 408-409).
- Constrained Optimization: Read Section 7.3 (p. 408-415) and do problems 1, 2, 11, 12 & 13 (p. 415-416). A numeric sanity check follows this module.
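The promised sanity check: Section 7.3's central fact is that the maximum of the quadratic form xᵀAx over unit vectors equals the largest eigenvalue of the symmetric matrix A, attained at a corresponding unit eigenvector. The matrix is made up:

```python
# Constrained optimization of a quadratic form via eigenvalues.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])              # symmetric
vals, vecs = np.linalg.eigh(A)          # eigenvalues in ascending order
x = vecs[:, -1]                         # unit eigenvector for the largest
print(vals[-1], x @ A @ x)              # the two values agree
```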
- [LA-2] M19: Subspaces and Hyperplanes
- TOPICS:
- Vector Spaces and Subspaces
- Null Spaces, Column Spaces, and Linear Transformations
- Linearly Independent Sets; Bases
- Hyperplanes
- EXERCISES:
- Vector Spaces and Subspaces: Read Section 4.1 (p. 192-197) and do problem 33 (p. 197-200).
- Null Spaces, Column Spaces, and Linear Transformations: Read Section 4.2 (p. 200-207) and do problem 2 (p. 207-209).
- Linearly Independent Sets; Bases: Read Section 4.3 (p. 210-215) and do problems 3 & 21 (p. 215-217).
- Hyperplanes: Read Section 8.4 (p. 463-469) and do problems 3, 11, 18, 21 & 26 (p. 469-471).
- [LA-2] M20: Linear Separability and Support Vector Machines
- TOPICS: Linearly Separable Support Vector Machines
- EXERCISES: Linearly Separable Support Vector Machines
- Support Vector Machines - Part 1: Introduction
- Support Vector Machines - Part 2a: Linearly Separable Case
- Support Vector Machines - Part 2b: Linearly Separable Case (cont'd)
- COMPLEMENTARY
- Support Vector Machine. (Wikipedia)
- Support Vector Machine(SVM): A Complete guide for beginners. (Data Analytics Article, by Saini, 2021)
- Support vector machines: The linearly separable case. (Book Chapter, Support Vector Machines and Machine Learning on Documents, 2023)
- Support Vector Machines. (Lecture Notes, Zhu, X., 2010)
- REFERENCES
- Chen, L. (2019, January 7). Support Vector Machine — Simply Explained - Towards Data Science. Medium; Towards Data Science. https://towardsdatascience.com/support-vector-machine-simply-explained-fee28eba5496
- Introduction to Information Retrieval. (2013). Stanford.edu. https://nlp.stanford.edu/IR-book/
- Saini, A. (2021, October 12). Support Vector Machine(SVM): A Complete guide for beginners. Analytics Vidhya. https://www.analyticsvidhya.com/blog/2021/10/support-vector-machinessvm-a-complete-guide-for-beginners/
- Zhu, X. (2010). Support Vector Machines. https://pages.cs.wisc.edu/~jerryzhu/cs769/svm.pdf
Support Vector Machines
- [LA-2] M21: Coding the SVM Algorithm [Applied]
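The materials for this module are not public, so here is my own minimal sketch of a linear soft-margin SVM trained by batch sub-gradient descent on the regularized hinge loss; the data and hyperparameters are made up, and this is an illustration, not the course's solution:

```python
# Linear soft-margin SVM via sub-gradient descent on the hinge loss.
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """X: (n, d) features; y: labels in {-1, +1}. Returns (w, b)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        mask = y * (X @ w + b) < 1                 # margin violators
        # subgradient of (lam/2)*||w||^2 + mean hinge loss
        w -= lr * (lam * w - (y[mask, None] * X[mask]).sum(axis=0) / n)
        b -= lr * (-y[mask].sum() / n)             # sums over no violators are 0
    return w, b

# Toy linearly separable data: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w, b = train_linear_svm(X, y)
print((np.sign(X @ w + b) == y).mean())            # training accuracy
```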
- [LA-2] M22: Using Built-In SVM Functions [Applied]
- PRACTICE: https://colab.research.google.com/drive/1vfWNKsp5l5q2HrhaZ0sx4SfsB1vJFb2s?usp=sharing
- COMPLEMENTARY: SVM - scikit-learn
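For reference, a minimal example of the built-in route this module covers: scikit-learn's SVC with a linear kernel. The binary iris subset is my own choice; the module's actual practice is in the Colab notebook linked above:

```python
# Linear SVM with scikit-learn on a two-class subset of iris.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X, y = X[y != 2], y[y != 2]                  # keep two classes
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="linear", C=1.0).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))                 # held-out accuracy
```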
- [LA-2] M23: SVM Feature Selection [Applied]
- [LA-2] M24: Expanding SVM to Additional Datasets [Applied]
- [LA-2] M25: SVM Journal Discussion [Applied]
- PAPERS:
- Wang, S. et al. Abnormal regional homogeneity as potential imaging biomarker for psychosis risk syndrome: a resting-state fMRI study and support vector machine analysis. Sci. Rep. 6, 27619; doi: 10.1038/srep27619
- F. Melgani and L. Bruzzone, "Classification of hyperspectral remote sensing images with support vector machines," in IEEE Transactions on Geoscience and Remote Sensing, vol. 42, no. 8, pp. 1778-1790, Aug. 2004, doi: 10.1109/TGRS.2004.831865.
- Wu, L.C., Kuo, C., Loza, J. et al. Detection of American Football Head Impacts Using Biomechanical Features and Support Vector Machine Classification. Sci Rep 8, 855 (2018). https://doi.org/10.1038/s41598-017-17864-3
- Choose your own paper related to SVM!
- QUESTIONS:
- What was the application of SVM in this journal article?
- How was SVM applied to this problem? Did the authors expand upon the SVM methods we discussed last week?
- What are some other applications you can think of for SVM after reading this paper? How might you use SVM in your research, or related to your research interests?