  
System Number: 807060
Title: Linear algebra and optimization for machine learning : a textbook /
Main Author: Aggarwal, Charu C., author.
Publication: Cham, Switzerland : Springer Nature Switzerland AG, [2020]
Call Number: QA184.2.A44 2020
ISBN: 3030403432
Subjects: Algebras, Linear.
  Machine learning -- Mathematics.
  Algebra. (FAST) (OCoLC)fst00804885
  Computers. (FAST) (OCoLC)fst00872776
  Machine learning. (FAST) (OCoLC)fst01004795
   
    

Holdings
Material Type: Book
Status: On shelf
Holds: 0
Location: Main Library, Western Language Books Area (Shelf)
Call Number: QA184.2 .A44 2020
Barcode: W113342

Summary: "This textbook introduces linear algebra and optimization in the context of machine learning. Examples and exercises are provided throughout the book. A solution manual for the exercises at the end of each chapter is available to teaching instructors. This textbook targets graduate-level students and professors in computer science, mathematics, and data science. Advanced undergraduate students can also use this textbook. The chapters of this textbook are organized as follows:
1. Linear algebra and its applications: These chapters focus on the basics of linear algebra together with their common applications to singular value decomposition, matrix factorization, similarity matrices (kernel methods), and graph analysis. Numerous machine learning applications have been used as examples, such as spectral clustering, kernel-based classification, and outlier detection. The tight integration of linear algebra methods with examples from machine learning differentiates this book from generic volumes on linear algebra. The focus is clearly on the most relevant aspects of linear algebra for machine learning and on teaching readers how to apply these concepts.
2. Optimization and its applications: Much of machine learning is posed as an optimization problem in which we try to maximize the accuracy of regression and classification models. The 'parent problem' of optimization-centric machine learning is least-squares regression. Interestingly, this problem arises in both linear algebra and optimization and is one of the key connecting problems of the two fields. Least-squares regression is also the starting point for support vector machines, logistic regression, and recommender systems. Furthermore, the methods for dimensionality reduction and matrix factorization also require the development of optimization methods. A general view of optimization in computational graphs is discussed together with its applications to backpropagation in neural networks." -- Provided by publisher.
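The blurb's point that least-squares regression connects linear algebra and optimization can be illustrated with a short sketch. The Python snippet below is not from the book; the toy data and variable names are invented for illustration. It solves one least-squares problem two ways, via the normal equations (the linear-algebra route) and via gradient descent on the squared error (the optimization route), and checks that both reach essentially the same weights.

```python
import numpy as np

# Toy data (hypothetical, for illustration only; not taken from the textbook).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # feature matrix
w_true = np.array([2.0, -1.0, 0.5])     # weights used to generate the toy targets
y = X @ w_true + 0.1 * rng.normal(size=100)

# Linear-algebra route: solve the normal equations X^T X w = X^T y.
w_normal_eq = np.linalg.solve(X.T @ X, X.T @ y)

# Optimization route: minimize the squared error with plain gradient descent.
w_gd = np.zeros(3)
learning_rate = 0.01
for _ in range(2000):
    grad = 2 * X.T @ (X @ w_gd - y) / len(y)   # gradient of the mean squared error
    w_gd -= learning_rate * grad

print(np.allclose(w_normal_eq, w_gd, atol=1e-3))  # True: both routes agree
```

In practice a QR- or SVD-based solver such as np.linalg.lstsq is usually preferred over forming X.T @ X explicitly, since the normal equations square the condition number; the direct form is used here only to make the linear-algebra view explicit.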

Reader Reviews

No reviews yet.


  