Learning Objectives
- Use Advanced Matrix Notation and Operators
- Use SVD to Design Linear Hidden Unit Basis Functions
- Use SVD to Design Linear Learning Machines
The objective of Chapter 4 is to introduce advanced matrix operators and the singular value decomposition (SVD) method to support the analysis and design of linear machine learning algorithms. More specifically, Chapter 4 begins with a review of basic matrix operations and definitions and then introduces the more advanced matrix operators needed for the matrix calculus theory developed in Chapter 5. Definitions of the vec function, matrix multiplication, the trace operator, the Hadamard product, the duplication matrix, positive definite matrices, the matrix inverse, and the matrix pseudoinverse are provided. The chapter also includes a collection of useful Kronecker matrix product identities to support advanced matrix calculus operations. In addition, Chapter 4 introduces the singular value decomposition as a tool for the analysis and design of machine learning algorithms. Applications of eigenvector analysis and singular value decomposition to latent semantic indexing (LSI) and to solving linear regression problems are provided. The chapter ends with a discussion of how the Sherman-Morrison Theorem may be applied to design adaptive learning algorithms that incorporate matrix inversion. In particular, the Sherman-Morrison Theorem provides a computational shortcut for recursively computing the inverse of a matrix generated from n+1 data points from the inverse of a matrix generated from n of those data points.
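To give the Kronecker product identities mentioned above a concrete anchor, here is a minimal NumPy sketch (not from the chapter itself; the helper name `vec` is an illustrative assumption) that numerically checks one standard identity central to matrix calculus, vec(AXB) = (Bᵀ ⊗ A) vec(X), where vec stacks the columns of its argument.

```python
import numpy as np

def vec(M):
    """Column-stacking vec operator (column-major flatten)."""
    return M.flatten(order="F")

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
X = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))

# Identity: vec(A X B) = (B^T kron A) vec(X)
lhs = vec(A @ X @ B)
rhs = np.kron(B.T, A) @ vec(X)
assert np.allclose(lhs, rhs)
```

Note that NumPy flattens row-major by default, so `order="F"` is needed for the column-stacking convention the identity assumes.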
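As a minimal illustration of how the SVD supports solving linear regression problems, the following sketch (an assumed example, not the book's own code; the function name `svd_least_squares` and the tolerance are illustrative choices) computes the minimum-norm least squares solution via the SVD-based pseudoinverse.

```python
import numpy as np

def svd_least_squares(X, y, tol=1e-10):
    """Solve min_w ||X w - y||^2 via the SVD-based pseudoinverse.
    Singular values below tol are treated as zero, which also handles
    rank-deficient design matrices."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_inv = np.where(s > tol, 1.0 / s, 0.0)   # invert only nonzero singular values
    # w = V diag(s_inv) U^T y  (the minimum-norm least squares solution)
    return Vt.T @ (s_inv * (U.T @ y))

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 4))
y = rng.standard_normal(20)
w = svd_least_squares(X, y)
assert np.allclose(w, np.linalg.lstsq(X, y, rcond=None)[0])
```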
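The Sherman-Morrison computational shortcut can likewise be sketched in a few lines. This is a hedged illustration assuming the new data point enters as a rank-one update A + x xᵀ (as in recursive least squares), not a transcription of the chapter's own algorithm.

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Given A_inv = inv(A), return inv(A + outer(u, v))
    via the Sherman-Morrison identity (rank-one update)."""
    Au = A_inv @ u                      # A^{-1} u
    vA = v @ A_inv                      # v^T A^{-1}
    denom = 1.0 + v @ Au                # 1 + v^T A^{-1} u (must be nonzero)
    return A_inv - np.outer(Au, vA) / denom

# After seeing data points x_1, ..., x_n, maintain inv(sum_i x_i x_i^T)
# and fold in the (n+1)-th point without re-inverting from scratch.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
A_inv = np.linalg.inv(X.T @ X)          # inverse from n = 5 data points
x_new = rng.standard_normal(3)
A_inv_updated = sherman_morrison_update(A_inv, x_new, x_new)

# Check against recomputing the inverse directly from all n+1 points.
A_direct = np.linalg.inv(X.T @ X + np.outer(x_new, x_new))
assert np.allclose(A_inv_updated, A_direct)
```

The update costs O(d²) per new data point rather than the O(d³) of a fresh inversion, which is what makes it attractive for adaptive learning algorithms.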
The podcast LM101-082: Ch4: How to Analyze and Design Linear Machines provides an overview of the main ideas of this book chapter, some tips to help students read this chapter, and some guidance to instructors on teaching this chapter.