**Description:**

This thoroughly revised second edition provides an updated treatment of numerical linear algebra techniques for solving problems in data mining and pattern recognition. Adopting an application-oriented approach, the author introduces matrix theory and decompositions, describes how modern matrix methods can be applied in real-life scenarios, and provides a set of tools that students can modify for a particular application.

Building on material from the first edition, the author discusses basic graph concepts and their matrix counterparts. He introduces the graph Laplacian and the properties of its eigenvectors needed in spectral partitioning, and describes spectral graph partitioning applied to social networks and text classification. Examples are included to help readers visualize the results. This new edition also presents matrix-based methods that underlie many of the algorithms used for big data.

The book provides a solid foundation for further exploring related topics and presents applications such as classification of handwritten digits, text mining, text summarization, PageRank computations related to the Google search engine, and facial recognition.

**Contents:**

**Prefaces**

__Part I: Linear Algebra Concepts and Matrix
Decompositions__

**Chapter 1.** **Vectors and Matrices in Data Mining and Pattern
Recognition** • Data Mining and Pattern Recognition • Vectors and Matrices •
Purpose of the Book • Programming Environments • Floating Point Computations •
Notation and Conventions

**Chapter 2.** **Vectors and Matrices** • Matrix-Vector
Multiplication • Matrix-Matrix Multiplication • Inner Product and Vector Norms
• Matrix Norms • Linear Independence: Bases • The Rank of a Matrix

**Chapter 3.** **Linear Systems and Least Squares** • LU
Decomposition • Symmetric, Positive Definite Matrices • Perturbation Theory and
Condition Number • Rounding Errors in Gaussian Elimination • Banded Matrices •
The Least Squares Problem

**Chapter 4.** **Orthogonality** • Orthogonal Vectors and
Matrices • Elementary Orthogonal Matrices • Number of Floating Point Operations
• Orthogonal Transformations in Floating Point Arithmetic

**Chapter 5.** **QR Decomposition** • Orthogonal Transformation
to Triangular Form • Solving the Least Squares Problem • Computing or Not
Computing *Q* • Flop Count for QR Factorization • Error in the Solution of
the Least Squares Problem • Updating the Solution of a Least Squares Problem

**Chapter 6.** **Singular Value Decomposition** • The
Decomposition • Fundamental Subspaces • Matrix Approximation • Principal
Component Analysis • Solving Least Squares Problems • Condition Number and
Perturbation Theory for the Least Squares Problem • Rank-Deficient and
Underdetermined Systems • Computing the SVD • The Eigenvalue Decomposition of a
Symmetric Matrix • Complete Orthogonal Decomposition

**Chapter 7.** **Reduced-Rank Least Squares Models** • Truncated
SVD: Principal Component Regression • A Krylov Subspace Method

**Chapter 8.** **Tensor Decomposition** • Introduction • Basic
Tensor Concepts • A Tensor SVD • Approximating a Tensor by HOSVD

**Chapter 9.** **Clustering and Nonnegative Matrix Factorization**
• The *k*-Means Algorithm • Nonnegative Matrix Factorization

**Chapter 10.** **Graphs and Matrices** • Graphs and Adjacency Matrices • Connectedness and Reducibility • Graph Laplacians and Spectral Partitioning • Bipartite Graphs

__Part II: Data Mining Applications__

**Chapter 11.** **Classification of Handwritten Digits** •
Handwritten Digits and a Simple Algorithm • Classification Using SVD Bases •
Tangent Distance

**Chapter 12.** **Text Mining** • Preprocessing the Documents and
Queries • The Vector Space Model • Latent Semantic Indexing • Clustering •
Nonnegative Matrix Factorization • LGK Bidiagonalization • Average Performance

**Chapter 13.** **Page Ranking for a Web Search Engine** •
PageRank • Random Walk and Markov Chains • The Power Method for PageRank Computation
• HITS

**Chapter 14.** **Automatic Key Word and Key Sentence Extraction** •
Saliency Score • Key Sentence Extraction from a Rank-*k* Approximation

**Chapter 15.** **Face Recognition Using Tensor SVD** •
Tensor Representation • Face Recognition • Face Recognition with HOSVD
Compression

**Chapter 16.** **Spectral Graph Partitioning** • Large and Sparse
Laplacians • A Network of Political Blogs • Text Classification • Multiway
Partitioning

__Part III: Computing the Matrix Decompositions__

**Chapter 17.** **Computing Eigenvalues and Singular Values** •
Perturbation Theory • The Power Method and Inverse Iteration • Similarity
Reduction to Tridiagonal Form • The QR Algorithm for a Symmetric Tridiagonal
Matrix • Computing the SVD • The Nonsymmetric Eigenvalue Problem • Sparse
Matrices • The Arnoldi and Lanczos Methods • Software

**Bibliography**

**Index**

**About the Author:**

**Lars Eldén** is a retired professor of scientific computing at Linköping University in Sweden, where he was head of the mathematics department and director of the National Supercomputer Centre. He is the author, along with L. Wittmeyer-Koch and H. Bruun Nielsen, of *Introduction to Numerical Computation: Analysis and MATLAB Illustrations* (Studentlitteratur AB, 2004).

**Target Audience:**

This book is intended primarily for undergraduate students who have previously taken an introductory course in scientific computing or numerical analysis, and for graduate students in data mining and pattern recognition who need an introduction to linear algebra techniques.