
The biomedical engineering department next door runs a PhD program in imaging science. Its incoming students come from a mixed bag of undergraduate majors (medicine, biology, physics, mathematics, computer science...), so their math levels vary widely.

To fix this and get the research workhorses producing results as soon as possible, the department added a math crash course alongside the regular curriculum.

The plan is to spend about two months letting those who never learned the material speed-run it, and those who have to review it, covering linear algebra and calculus: five weeks for linear algebra and four weeks for calculus.

The content is split into modules, and by skipping some of them you can lay out six study plans across two difficulty levels (beginner/advanced) and three intensities (5, 10, or 15 hours per week).

The study plans themselves are the most valuable part. Many "folk scientists" either ponder endlessly without ever studying, or idolize legends of geniuses who never had to study at all; that is not necessarily laziness, they may simply never have found a workable method.

A single all-nighter sounds impressive, but keeping up one hour every weekday is far more valuable, and far harder, even though the latter does not even cost you weekends or holidays.

Modules

| Module | Beginner | Advanced |
| --- | --- | --- |
| M1 | Systems of Linear Equations | Systems of Linear Equations |
| M2 | Row Reduction and Echelon Forms | Linear Independence, Linear Transformations |
| M3 | Vector Equations | Matrix Operations, Transpose, Inverse |
| M4 | The Matrix Equation Ax = b | Vector Spaces and Subspaces |
| M5 | Solution Sets of Linear Systems | Linear Independence, Rank |
| M6 | Linear Independence | Eigenvectors and Eigenvalues |
| M7 | Introduction to Linear Transformations | Diagonalization |
| M8 | The Matrix of a Linear Transformation | Eigenvectors and Linear Transformations |
| M9 | Matrix Operations | Diagonalization of Symmetric Matrices |
| M10 | The Inverse of a Matrix | Singular Value Decomposition (SVD) |
| M11 | Characterizations of Invertible Matrices | Principal Component Analysis (PCA) |
| M12 | Determinants | Coding the PCA Algorithm |
| M13 | Vector Spaces and Subspaces | Using Built-In PCA Functions |
| M14 | Null Spaces, Column Spaces, and Linear Transformations | Visualization of Singular Values and Principal Components |
| M15 | Linearly Independent Sets and Bases | PCA Journal Discussion |
| M16 | Eigenvectors and Eigenvalues 1 | Orthogonality |
| M17 | Eigenvectors and Eigenvalues 2 | Orthogonal Projections, Inner Product Spaces |
| M18 | The Characteristic Equation | Quadratic Forms, Constrained Optimization |
| M19 | Diagonalization 1 | Subspaces, Hyperplanes |
| M20 | Diagonalization 2 | Linear Separability, Support Vector Machines (SVM) |
| M21 | Eigenvectors and Linear Transformations 1 | Coding the SVM Algorithm |
| M22 | Eigenvectors and Linear Transformations 2 | Using Built-In SVM Functions |
| M23 | Inner Product, Length, and Orthogonality | Tuning SVM Parameters |
| M24 | Orthogonal Sets | Extending SVM to Additional Data |
| M25 | MATLAB Practice | SVM Journal Discussion |

Study Plans

| Difficulty | Weekly effort | Week 1 | Week 2 | Week 3 | Week 4 | Week 5 |
| --- | --- | --- | --- | --- | --- | --- |
| Beginner | 5 hours | M1, M2 | M4, M6 | M7, M9 | M10, M13 | M14, M15 |
| Beginner | 10 hours | M1, M2, M3 | M4, M5, M6 | M7, M8, M9 | M10, M13, M14 | M15, M16, M17 |
| Beginner | 15 hours | M1, M2, M3, M4 | M5, M6, M7, M8 | M9, M10, M11, M12 | M13, M14, M15, M16 | M17, M21, M22 |
| Advanced | 5 hours | M2, M3 | M4, M5 | M6, M7 | M10, M11 | M16, M17 |
| Advanced | 10 hours | M1, M2, M3 | M4, M5, M6 | M7, M8, M9 | M10, M11, M16 | M17, M19, M20 |
| Advanced | 15 hours | M1, M2, M3, M4 | M5, M6, M7, M8 | M9, M10, M11, M12 | M13, M14, M15, M16 | M17, M19, M20 |
| Practical | PCA & SVM | M6, M8, M10 | M11, M12, M13, M15 | M16, M17, M19 | M20, M21, M22 | M23, M24, M25 |

In the beginner track, M17, M19, M20, M23, M24, and M25 may be skipped.

Notes

The textbook used in the course is:

Lay, D. C., Lay, S. R., & McDonald, J. J. (2016). Linear Algebra and Its Applications (5th ed.). Pearson.

The organizers also recommended a video course series on YouTube:

https://youtube.com/playlist?list=PLl-gb0E4MII03hiCrZa7YqxUMEeEPmZqK&si=TFnQCipewVN5WCdM

Below are a few notes I took.

Beginner Level

  • Solving Linear Equations
    • [LA-1] M1: Systems of Linear Equations
    • [LA-1] M2: Row Reduction and Echelon Forms
    • [LA-1] M3: Vector Equations
    • [LA-1] M4: The Matrix Equation Ax = b
    • [LA-1] M5: Solution Sets of Linear Systems
  • Transforms and Inversion
    • [LA-1] M6: Linear Independence
    • [LA-1] M7: Introduction to Linear Transformations
    • [LA-1] M8: The Matrix of a Linear Transformation
    • [LA-1] M9: Matrix Operations
    • [LA-1] M10: The Inverse of a Matrix
  • Inverse, Vector Spaces, Linear Independence
    • [LA-1] M11: Characterizations of Invertible Matrices
    • [LA-1] M12: Introduction to Determinants
    • [LA-1] M13: Vector Spaces and Subspaces
    • [LA-1] M14: Null Spaces, Column Spaces, and Linear Transformations
    • [LA-1] M15: Linearly Independent Sets; Bases
  • Fundamentals of Eigenvectors, Eigenvalues, and Diagonalization
    • [LA-1] M16: Eigenvectors and Eigenvalues 1
    • [LA-1] M17: Eigenvectors and Eigenvalues 2
    • [LA-1] M18: The Characteristic Equation
    • [LA-1] M19: Diagonalization 1
    • [LA-1] M20: Diagonalization 2
  • Eigenvectors, Orthogonality and MATLAB Practice
    • [LA-1] M21: Eigenvectors and Linear Transformations 1
    • [LA-1] M22: Eigenvectors and Linear Transformations 2
    • [LA-1] M23: Inner Product, Length, and Orthogonality
    • [LA-1] M24: Orthogonal Sets
    • [LA-1] M25: MATLAB Practice

Advanced Level

Solving a nonhomogeneous linear system Ax = b (a code sketch follows the steps below):

  • Write the system as the augmented matrix [A|b]; b is the column of coefficients on the right-hand side of the equals signs, with no sign change.
  • Row-reduce the augmented matrix to (reduced) row echelon form.
  • Read off x starting from the last row and working upward.
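
To make the recipe concrete, here is a minimal sketch; the course itself practices in MATLAB, where `rref` plays the same role, so the Python/SymPy version below and the particular A and b are illustrative assumptions rather than course material:

```python
from sympy import Matrix

# Made-up 3x3 example system A x = b (any small invertible A works the same way).
A = Matrix([[1, 2, -1],
            [0, 1,  3],
            [2, -1, 1]])
b = Matrix([4, 7, 1])

# Step 1: build the augmented matrix [A | b]; b keeps the signs it has on the
# right-hand side of the equations.
augmented = A.row_join(b)

# Step 2: reduce to reduced row echelon form.
R, pivot_cols = augmented.rref()
print(R)

# Step 3: A has a pivot in every column here, so the last column of R is x,
# read off row by row from the bottom up.
x = R[:, 3]
print(x)

# Cross-check with a direct solver.
print(A.LUsolve(b))
```

If A were singular, the `pivot_cols` returned by `rref()` would identify the basic variables, and the non-pivot columns would correspond to free variables in the solution set.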

Fundamentals and Key Concepts

Fundamentals for Principal Component Analysis

Principal Component Analysis

  • [LA-2] M11: Principal Component Analysis [Applied]
  • [LA-2] M12: Coding the PCA Algorithm [Applied]
  • [LA-2] M13: Using Built-In PCA Functions [Applied]
  • [LA-2] M14: Visualization of Singular Values and Principal Components [Applied]
    • TOPICS:
      • Compress a single grayscale image using PCA (see the code sketch after this list)
      • How many principal components are needed to compress the image efficiently?
      • Singular Vectors and Principal Components
      • [OPTIONAL] Part 4: Image Compression Using PCA of an image data set
  • [LA-2] M15: PCA Journal Discussion [Applied]
    • TOPICS:
      • S. Baronti, A. Casini, F. Lotti, and S. Porcinai, "Multispectral imaging system for the mapping of pigments in works of art by use of principal-component analysis," Appl. Opt. 37, 1299-1309 (1998)
      • Yeung KY, Ruzzo WL. Principal component analysis for clustering gene expression data. Bioinformatics. 2001 Sep;17(9):763-74. doi: 10.1093/bioinformatics/17.9.763. PMID: 11590094
      • Choose your own paper related to PCA!
    • QUESTIONS:
      • What was the application of PCA in this journal article?
      • How was PCA applied to this problem? Did the authors expand upon the PCA methods we discussed last week?
      • What are some other applications you can think of for PCA after reading this paper? How might you use PCA in your research, or related to your research interests?
    • COMPLEMENTARY
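
As a rough companion to the image-compression topics in M12–M14 above, here is a hedged Python/NumPy sketch of PCA-based compression; the synthetic array below stands in for a real grayscale image, and the course exercises would do the equivalent in MATLAB:

```python
import numpy as np

# Stand-in "grayscale image": a smooth synthetic 2-D array (rows = observations,
# columns = variables), used instead of a real photo to stay self-contained.
rng = np.random.default_rng(0)
r, c = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 96), indexing="ij")
image = np.sin(6 * r) * np.cos(4 * c) + 0.05 * rng.standard_normal((128, 96))

# "Coding the PCA algorithm" by hand: center the columns, take the SVD of the
# centered data, keep the first k principal components, reconstruct, un-center.
col_mean = image.mean(axis=0)
U, s, Vt = np.linalg.svd(image - col_mean, full_matrices=False)
k = 10
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :] + col_mean

# Squared singular values of the centered data tell how much variance
# the first k principal components capture.
energy = np.cumsum(s**2) / np.sum(s**2)
rel_err = np.linalg.norm(image - approx) / np.linalg.norm(image)
print(f"k={k}: captured variance {energy[k-1]:.3f}, relative error {rel_err:.3f}")
```

The "using built-in PCA functions" module (M13) would hand the centered data to a library routine instead, for example MATLAB's `pca` or scikit-learn's `PCA`, which return the same principal components up to sign.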

Fundamentals for SVM

Support Vector Machines
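
As a minimal, hedged illustration of the SVM modules (M20–M23: linear separability, coding an SVM, using built-in SVM functions, tuning parameters), the sketch below fits scikit-learn's `SVC` with a linear kernel on made-up 2-D data; the course itself would presumably work in MATLAB (for example `fitcsvm`):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Made-up, (nearly) linearly separable 2-D data: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=-2.0, scale=1.0, size=(100, 2)),
               rng.normal(loc=+2.0, scale=1.0, size=(100, 2))])
y = np.array([0] * 100 + [1] * 100)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Linear-kernel SVM; C is the soft-margin regularization parameter.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
print("separating hyperplane: w =", clf.coef_[0], ", b =", clf.intercept_[0])
```

The regularization parameter `C` is the main knob that "tuning SVM parameters" refers to: smaller values allow a wider margin at the cost of more misclassified training points.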
