Spring 2016

 

CSCE 689 -- Sparse Matrix Algorithms

Catalog description:  Many applications in computational science rely on algorithms for large-scale sparse matrices (circuit simulation, finite-element methods, 'big data', Google Street View, etc.).  This course equips students to understand and design methods that exploit sparsity in matrix computations.  The focus is on direct methods, which rely on combinatorics, graph theory and algorithms, and numerical methods. Prerequisites: undergraduate-level numerical linear algebra and data structures/algorithms.


Course overview:  Students in any discipline that uses scientific computing methods will benefit from this course. A wide range of problems in science and engineering require fundamental matrix computations for their solution, and these matrices are often mostly zero. The objective of this course is to equip the student with the background necessary to understand and design algorithms for solving sparse matrix problems.  The world's largest sparse matrix arises in Google's web page ranking problem. About once a month, Google finds an eigenvector of a sparse matrix that represents the connectivity of the web (of size billions-by-billions). Other sparse matrix problems arise in circuit and device simulation, finite element methods, linear programming and other optimization problems, acoustics, electromagnetics, computational fluid dynamics, 'big data', financial portfolio optimization, structural analysis, and quantum mechanics, to name a few.
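
To make the eigenvector computation above concrete, here is a minimal sketch (not part of the course materials or of CSparse) of power iteration applied to a sparse matrix stored in compressed sparse column (CSC) form.  The names spmv and power_iteration, and the arrays Ap, Ai, and Ax, are illustrative assumptions only.

    #include <stdlib.h>
    #include <math.h>

    /* y = A*x for an n-by-n sparse matrix A in CSC form: column j holds row
       indices Ai [Ap [j] .. Ap [j+1]-1] and the matching values in Ax */
    static void spmv (int n, const int *Ap, const int *Ai, const double *Ax,
                      const double *x, double *y)
    {
        for (int i = 0 ; i < n ; i++) y [i] = 0 ;
        for (int j = 0 ; j < n ; j++)
        {
            for (int p = Ap [j] ; p < Ap [j+1] ; p++)
            {
                y [Ai [p]] += Ax [p] * x [j] ;   /* only the nonzeros are touched */
            }
        }
    }

    /* power iteration: repeatedly apply A and renormalize; under the usual
       assumptions x converges to an eigenvector of the dominant eigenvalue */
    static void power_iteration (int n, const int *Ap, const int *Ai,
                                 const double *Ax, double *x, int iters)
    {
        double *y = malloc (n * sizeof (double)) ;
        for (int i = 0 ; i < n ; i++) x [i] = 1.0 / n ;   /* uniform start vector */
        for (int k = 0 ; k < iters ; k++)
        {
            spmv (n, Ap, Ai, Ax, x, y) ;
            double norm = 0 ;
            for (int i = 0 ; i < n ; i++) norm += y [i] * y [i] ;
            norm = sqrt (norm) ;
            for (int i = 0 ; i < n ; i++) x [i] = y [i] / norm ;
        }
        free (y) ;
    }

The per-iteration cost is dominated by the sparse matrix-vector product, which touches only the nonzero entries rather than all n^2 positions of the matrix.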


Taking effective advantage of the structure of a sparse matrix requires a combination of numerical and combinatorial methods (graph algorithms and related data structures). For example, finding the best ordering to factorize a matrix is an NP-complete graph problem. Topics focus on direct methods, but with some application to iterative methods: sparse matrix-vector multiply, matrix-matrix multiply and transpose, forward/backsolve, LU and Cholesky factorization, singular value decomposition, reordering methods (including the use of graph partitioning methods), and updating/downdating a sparse Cholesky factorization. Projects in the course include programming assignments in C and MATLAB.
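
As a small taste of the kernels listed above, here is a minimal sketch of a sparse forward solve (solving Lx = b for a lower-triangular L in CSC form), written in the spirit of CSparse's cs_lsolve but simplified for illustration; it is not the library routine itself, and it assumes the diagonal entry is stored first in each column (as CSparse requires).

    /* solve L*x = b, where L is n-by-n, sparse, lower triangular, and stored
       in CSC form (Lp = column pointers, Li = row indices, Lx = values).
       On input x holds b; on output it holds the solution. */
    static void lsolve (int n, const int *Lp, const int *Li, const double *Lx,
                        double *x)
    {
        for (int j = 0 ; j < n ; j++)
        {
            x [j] /= Lx [Lp [j]] ;                 /* divide by the diagonal L(j,j) */
            for (int p = Lp [j] + 1 ; p < Lp [j+1] ; p++)
            {
                x [Li [p]] -= Lx [p] * x [j] ;     /* eliminate x(j) from later rows */
            }
        }
    }

With a dense right-hand side this runs in time proportional to n plus the number of nonzeros in L; handling a sparse right-hand side efficiently, by first computing the nonzero pattern of x with a graph traversal, is an example of the combinatorial techniques the course covers.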


The class meets from 3:55 to 5:50 pm in HRBB 425A.


Links and other resources:

  1. Syllabus:  csce689_Spring2016_syllabus_Davis.pdf (PDF).

  2. About the instructor:

  3. College of Engineering faculty web page

  4. Computer Science and Engineering faculty web page 

  5. Parasol lab web page 

  6. Research

  7. Background and contact info

  8. All lectures are available to the right (top).  See also the SIAM Plenary talk (2006) and a talk given at Stanford (2013) for an overview of the topic.

  9. CSparse_318.zip: version 3.1.8.  Use this version as the starting point for all class projects (Project 2 and following).

  10. Piazza: for class discussions, announcements, and posting of resources.


About the instructor:


Tim Davis is a Professor in the Computer Science and Engineering Department at Texas A&M University.  His primary scholarly contribution is the creation of widely-used sparse matrix algorithms and software.  His software is relied upon by a vast array of commercial, government lab, and open source applications, including MATLAB (x=A\b when A is sparse), Mathematica, Google (Street View, Photo Tours, and 3D Earth), Octave, ANSYS, Cadence, MSC NASTRAN, Mentor Graphics, and many more.  As an NVIDIA Academic Partner, he is creating a new suite of highly-parallel sparse direct methods that can exploit the high computational throughput of recent GPUs.


He was elected in 2013 as a Fellow of the Society for Industrial and Applied Mathematics (SIAM), "for contributions to sparse matrix algorithms and software including the University of Florida Sparse Matrix Collection," in 2014 as a Fellow of the Association for Computing Machinery (ACM), and in 2016 as a Fellow of the Institute of Electrical and Electronics Engineers (IEEE).

The first lecture (above) was given in an earlier semester.  All 42 lectures are recorded and available on YouTube.  See sparse_lectures.zip for all lecture notes.

SIAM Plenary talk given in 2006.  This talk is a good overview of the topics covered in the course.

A talk given at Stanford in June 2013, where I presented my current research in sparse matrix algorithms.