Details
Introduction to Linear Algebra: Computation, Application, and Theory is designed for students who have never been exposed to the topics of a linear algebra course. The text is filled with interesting and diverse application sections but is also a theoretical text which aims to train students to do succinct computation in a knowledgeable way. After completing the course with this text, the student will not only know the best and shortest way to do linear algebraic computations but will also know why such computations are both effective and successful.
Features:
Includes cutting-edge applications in machine learning and data analytics
Suitable as a primary text for undergraduates studying linear algebra
Requires very little in the way of prerequisites
About the Author
Mark J. DeBonis received his PhD in Mathematics from the University of California, Irvine, USA. He began his career as a theoretical mathematician in the fields of group theory and model theory, but in later years switched to applied mathematics, in particular machine learning. He spent some time working for the US Department of Energy at Los Alamos National Lab, as well as for the US Department of Defense at the Defense Intelligence Agency, as an applied mathematician specializing in machine learning. He is an Associate Professor of Mathematics at Manhattan College in New York City and is also currently working for the US Department of Energy at Sandia National Lab as a Principal Data Analyst. His research interests include machine learning, statistics, and computational algebra.
Contents
1. Examples of Vector Spaces. 1.1. First Vector Space: Tuples. 1.2. Dot Product. 1.3. Application: Geometry. 1.4. Second Vector Space: Matrices. 1.5. Matrix Multiplication.
2. Matrices and Linear Systems. 2.1. Systems of Linear Equations. 2.2. Gaussian Elimination. 2.3. Application: Markov Chains. 2.4. Application: The Simplex Method. 2.5. Elementary Matrices and Matrix Equivalence. 2.6. Inverse of a Matrix. 2.7. Application: The Simplex Method Revisited. 2.8. Homogeneous/Nonhomogeneous Systems and Rank. 2.9. Determinant. 2.10. Applications of the Determinant. 2.11. Application: LU Factorization.
3. Vector Spaces. 3.1. Definition and Examples. 3.2. Subspace. 3.3. Linear Independence. 3.4. Span. 3.5. Basis and Dimension. 3.6. Subspaces Associated with a Matrix. 3.7. Application: Dimension Theorems.
4. Linear Transformations. 4.1. Definition and Examples. 4.2. Kernel and Image. 4.3. Matrix Representation. 4.4. Inverse and Isomorphism. 4.5. Similarity of Matrices. 4.6. Eigenvalues and Diagonalization. 4.7. Axiomatic Determinant. 4.8. Quotient Vector Space. 4.9. Dual Vector Space.
5. Inner Product Spaces. 5.1. Definition, Examples and Properties. 5.2. Orthogonal and Orthonormal. 5.3. Orthogonal Matrices. 5.4. Application: QR Factorization. 5.5. Schur Triangularization Theorem. 5.6. Orthogonal Projections and Best Approximation. 5.7. Real Symmetric Matrices. 5.8. Singular Value Decomposition. 5.9. Application: Least Squares Optimization.
6. Applications in Data Analytics. 6.1. Introduction. 6.2. Direction of Maximal Spread. 6.3. Principal Component Analysis. 6.4. Dimensionality Reduction. 6.5. Mahalanobis Distance. 6.6. Data Sphering. 6.7. Fisher Linear Discriminant Function. 6.8. Linear Discriminant Functions in Feature Space. 6.9. Minimal Square Error Linear Discriminant Function.
7. Quadratic Forms. 7.1. Introduction to Quadratic Forms. 7.2. Principal Minor Criterion. 7.3. Eigenvalue Criterion. 7.4. Application: Unconstrained Nonlinear Optimization. 7.5. General Quadratic Forms.
Appendix A. Regular Matrices.
Appendix B. Rotations and Reflections in Two Dimensions.
Appendix C. Answers to Selected Exercises.