Understand and apply basic concepts of Linear Algebra
You understand
- vectors in 2D, 3D, and higher dimensions
- vector arithmetic
- distances between vectors
- dot product and orthogonality
- matrix multiplication and its geometric interpretation
- the notion of distance and nearest neighbors
- the curse of dimensionality
- the effect of increasing dimensionality on the notion of nearness
- principal component analysis (PCA) using SVD
- linear discriminant analysis
- notion of purity and splitting criteria
- hyperplanes and linear separability
- optimization using gradient descent
- ensemble methods: random forest, gradient boosted trees (short sketches of several of these concepts follow this list)
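A minimal NumPy sketch of the first few items above: vector arithmetic, Euclidean distance between vectors, the dot product, and an orthogonality check. The specific vectors are only illustrative assumptions.

```python
import numpy as np

# Two vectors in 3D
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -2.0, 0.0])

# Vector arithmetic: addition and scalar multiplication
print(u + v)                      # -> [5. 0. 3.]
print(2.5 * u)                    # -> [2.5 5. 7.5]

# Euclidean distance between the two vectors
print(np.linalg.norm(u - v))

# Dot product; vectors are orthogonal when it is (numerically) zero
print(u @ v)                      # 1*4 + 2*(-2) + 3*0 = 0.0
print(np.isclose(u @ v, 0.0))     # True -> u and v are orthogonal
```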
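Matrix multiplication can be read geometrically as applying a linear map to every point at once; a sketch, assuming a 2D rotation by 90 degrees as the example map:

```python
import numpy as np

# A 2x2 rotation matrix rotates every 2D vector by the angle theta
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

points = np.array([[1.0, 0.0],    # one point per row
                   [0.0, 1.0],
                   [1.0, 1.0]])

# Multiplying the row vectors by R^T rotates all of them in one product
rotated = points @ R.T
print(np.round(rotated, 3))       # -> [[0. 1.] [-1. 0.] [-1. 1.]]
```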
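One way to see the curse of dimensionality and its effect on nearness: for random points, the contrast between the nearest and the farthest neighbor shrinks as the dimension grows. A sketch, assuming uniformly random data in the unit cube:

```python
import numpy as np

rng = np.random.default_rng(0)

for d in (2, 10, 100, 1000):
    X = rng.random((500, d))                  # 500 random points in [0,1]^d
    q = rng.random(d)                         # a query point
    dists = np.linalg.norm(X - q, axis=1)     # distances to all points
    # Relative contrast: as d grows this ratio tends toward 1, i.e.
    # "nearest" and "farthest" become almost indistinguishable.
    print(d, dists.max() / dists.min())
```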
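A sketch of principal component analysis via the SVD of the centered data matrix; the random data and the choice of two components are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))             # 200 samples, 5 features

Xc = X - X.mean(axis=0)                   # center each column
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Singular values hint at the inherent dimensionality of the data:
# share of variance explained by each principal component.
explained = S**2 / np.sum(S**2)
print(np.round(explained, 3))

# Dimensionality reduction: project onto the first k principal components
k = 2
Z = Xc @ Vt[:k].T                         # 200 x 2 representation
print(Z.shape)
```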
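Gradient descent on a simple least-squares objective, as a minimal sketch; the learning rate and iteration count are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

# Minimize the mean squared error f(w) = ||Xw - y||^2 / n by gradient descent
w = np.zeros(3)
lr = 0.1
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of the MSE
    w -= lr * grad

print(np.round(w, 2))                       # close to [1. -2. 0.5]
```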
You are able to
- represent data as vectors
- use nearest neighbors for classification/regression
- use decision trees for classification/regression
- use SVD to examine the inherent dimensionality of the data
- perform dimensionality reduction using SVD or LDA
- use support vector machines for classification
- use perceptrons and neural networks for classification/regression (see the sketches after this list)
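A minimal nearest-neighbor classification sketch with scikit-learn; the iris dataset and k = 5 are assumptions chosen only for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Represent the data as vectors: each row of X is one flower as a 4D vector
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Classify each test vector by the majority label of its 5 nearest neighbors
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print(knn.score(X_test, y_test))     # test-set accuracy
```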
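A sketch of tree-based classification, contrasting a single decision tree (splits chosen by a purity criterion such as Gini impurity) with a random-forest ensemble; the dataset and hyperparameters are illustrative assumptions:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Single tree: each split maximizes the purity of the child nodes
tree = DecisionTreeClassifier(criterion="gini", max_depth=4, random_state=0)
tree.fit(X_train, y_train)

# Ensemble: average many randomized trees to reduce variance
forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X_train, y_train)

print(tree.score(X_test, y_test), forest.score(X_test, y_test))
```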
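Hyperplanes and linear separability in practice: a sketch that fits a linear support vector machine and a perceptron to a (roughly) linearly separable toy problem; the generated blobs and all parameters are assumptions:

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Two well-separated Gaussian blobs -> (almost) linearly separable classes
X, y = make_blobs(n_samples=300, centers=2, cluster_std=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Both models learn a separating hyperplane w^T x + b = 0
svm = LinearSVC(C=1.0).fit(X_train, y_train)
perc = Perceptron().fit(X_train, y_train)

print(svm.score(X_test, y_test), perc.score(X_test, y_test))
print(svm.coef_, svm.intercept_)   # the hyperplane's normal vector and offset
```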
Referencing modules:
Linear Algebra