Rodolphe Sepulchre, June 2013

From Murray Wiki
Revision as of 10:25, 28 May 2013

Rodolphe Sepulchre will visit Caltech on 3 June 2013 (Mon).

Agenda

9:30a   Richard Murray, 109 Steele Lab
9:45a   Meet with Richard's NCS group, 110 Steele
  • 9:45-10:45: Necmiye, Mumu, Eric, Ivan
  • 10:45-11:45: Enoch, Marcella, Anandh, Dan
11:45   Seminar setup
12:00p   Seminar, 213 ANB
1:00p   Lunch with Venkat, CMS faculty
2:15p   Open
3:00p   Open
3:45p   Open
4:30p   Open
5:15p   Done

Abstract

The geometry of (thin) SVD revisited for large-scale computations

Rodolphe Sepulchre
University of Liege, Belgium

The talk will introduce a Riemannian framework for large-scale computations over the set of low-rank matrices. The foundation is geometric and the motivation is algorithmic, with a bias towards efficient computations in large-scale problems. We will explore how classical matrix factorizations connect the Riemannian geometry of the set of fixed-rank matrices to two well-studied manifolds: the Grassmann manifold of linear subspaces and the cone of positive definite matrices. The theory will be illustrated on various applications, including low-rank Kalman filtering, linear regression with low-rank priors, matrix completion, and the choice of a suitable
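As a concrete illustration of the thin-SVD factorization the abstract builds on, the sketch below (not from the talk; names and dimensions are illustrative) factors a rank-r matrix as U S V^T, where the column space of U is a point on the Grassmann manifold Gr(m, r):

```python
import numpy as np

# Build an exactly rank-r matrix M = A @ B (m x n), a common large-scale setting.
rng = np.random.default_rng(0)
m, n, r = 100, 80, 5
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Thin SVD: full_matrices=False gives U (m x min(m,n)), s, Vt without the
# square orthogonal factors; truncate to the leading r components.
U, s, Vt = np.linalg.svd(M, full_matrices=False)
U_r, s_r, Vt_r = U[:, :r], s[:r], Vt[:r, :]

# U_r has orthonormal columns; span(U_r) is a point on Gr(m, r), and the
# rank-r reconstruction recovers M to numerical precision.
M_r = U_r @ np.diag(s_r) @ Vt_r
print(np.allclose(M, M_r))  # True
```

Large-scale Riemannian methods of the kind the talk describes optimize directly over such factors (U, s, V) rather than over the full m-by-n matrix.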