Fitting surface models to data: Accuracy, Speed, Robustness


Andrew Fitzgibbon (http://research.microsoft.com/en-us/um/people/awf/), Microsoft (http://research.microsoft.com)

Jonathan Taylor (http://www.cs.toronto.edu/~jtaylor), PerceptiveIO (http://www.perceptiveio.com)

Slides: PDF (16 MB), PPTX (385 MB), GitHub

In vision and machine learning, almost everything we do can be viewed as a form of model fitting. Whether we are estimating the parameters of a convolutional neural network, computing structure and motion from image collections, tracking objects in video, computing low-dimensional representations of datasets, estimating parameters for an inference model such as a Markov random field, or extracting shape spaces such as active appearance models, the task almost always boils down to minimizing an objective over some parameters of interest along with some latent or nuisance parameters. This tutorial describes tools and techniques for solving such optimization problems, with a focus on fitting 3D smooth-surface models, such as subdivision surfaces, to 2D and 3D data.

Agenda

Note: We will stop for questions at any time, may go slower or faster over some points, and reserve the right to slip some topics across session boundaries. If you want to attend just one specific session, allow a 15-30 minute buffer afterwards.

0900 Intro: Applications in vision and graphics.

  • Lots of exciting and inspirational examples of model fitting:
    • Kinetre (Siggraph 12)
    • Dolphins (PAMI 13)
    • Nonrigid tracking (Siggraph 14)
    • FlexSense (CHI 15)
    • Hand tracking (Siggraph 16)
  • Preview of the day

0920 Session I: Matrix and vector calculus, nonlinear optimization

  • vector functions and the Jacobian, generalized Jacobian
  • advanced matrix operations: block operations, Kronecker products, etc.
  • derivatives of matrix expressions
  • sparse matrices and sparse storage
  • finite-difference versus symbolic derivatives
  • nonlinear optimization: the Gauss-Newton and Levenberg-Marquardt algorithms (see the sketch after this list)
  • gradient descent vs Newton
  • linear vs quadratic convergence
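
To make this session concrete, the sketch below implements a minimal Levenberg-Marquardt loop in Python/NumPy; setting the damping lam to zero recovers plain Gauss-Newton. The exponential-decay model and synthetic data are assumptions chosen purely for illustration, not material from the tutorial.

    import numpy as np

    def levenberg_marquardt(r, J, x0, iters=50, lam=1e-3):
        """Minimize 0.5*||r(x)||^2 given residual r and Jacobian J."""
        x = x0.copy()
        for _ in range(iters):
            rx, Jx = r(x), J(x)
            # Damped normal equations: (J'J + lam*I) dx = -J'r
            dx = np.linalg.solve(Jx.T @ Jx + lam * np.eye(x.size), -Jx.T @ rx)
            if np.dot(r(x + dx), r(x + dx)) < np.dot(rx, rx):
                x, lam = x + dx, lam * 0.5   # step reduced the cost: accept
            else:
                lam *= 10.0                  # step failed: increase damping
        return x

    # Illustrative model y = a*exp(-b*t), fit to noisy samples (assumed data).
    t = np.linspace(0, 2, 40)
    y = 3.0 * np.exp(-1.5 * t) + 0.01 * np.random.randn(t.size)
    r = lambda x: x[0] * np.exp(-x[1] * t) - y
    J = lambda x: np.column_stack([np.exp(-x[1] * t),
                                   -x[0] * t * np.exp(-x[1] * t)])
    print(levenberg_marquardt(r, J, np.array([1.0, 1.0])))

The damping parameter interpolates between Gauss-Newton (small lam) and scaled gradient descent (large lam), which is where the linear-vs-quadratic convergence discussion comes in.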

1030 Coffee

1045 Session II: Curves and Correspondences

  • What is a curve? Parametric descriptions of curves and surfaces
  • Curves and data points: closest point operations
  • Fitting curves to data: correspondences
    • Iterated closest points
    • “Lifting” correspondences (sketched in code after this list)
  • Worked example: Gauss's Ceres problem (determining the orbit of the asteroid Ceres)
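
The contrast between alternation and lifting fits in a few lines. The sketch below shows the lifted formulation: the per-point correspondence parameters t_i join the curve coefficients as unknowns of a single nonlinear least-squares problem. The quadratic curve model and the five data points are illustrative assumptions.

    import numpy as np
    from scipy.optimize import least_squares

    # Data points near a parabola-like curve (assumed for illustration).
    P = np.array([[0.0, 0.1], [0.5, 0.3], [1.0, 1.1], [1.5, 2.2], [2.0, 4.1]])

    def curve(c, t):
        # Parametric curve: x(t) = t, y(t) = c0 + c1*t + c2*t^2
        return np.column_stack([t, c[0] + c[1] * t + c[2] * t ** 2])

    def residuals(z):
        # "Lifted" unknowns: 3 curve coefficients plus one t_i per data point.
        c, t = z[:3], z[3:]
        return (curve(c, t) - P).ravel()

    z0 = np.concatenate([np.zeros(3), P[:, 0]])  # init each t_i from its x-coordinate
    sol = least_squares(residuals, z0)           # joint step in (c, t)
    print(sol.x[:3])

An ICP-style alternative would alternate a closest-point computation for each t_i with a solve for c; the lifted version lets the solver move correspondences and curve parameters simultaneously.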

1140 Break and stretch

1145 Session III: Surfaces

  • Splines and subdivision surfaces in 3D (B-spline evaluation sketched after this list)
  • Optimizing with subdivision
  • Implementing for speed
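
The curve-level machinery is small enough to prototype directly. Below is a sketch, under an assumed example control polygon, of evaluating a uniform cubic B-spline and its tangent: the two ingredients a Gauss-Newton surface fit needs, with subdivision surfaces playing the analogous role over 2D parameter domains.

    import numpy as np

    # Standard basis matrix of the uniform cubic B-spline (rows weight u^3..1).
    M = np.array([[-1,  3, -3, 1],
                  [ 3, -6,  3, 0],
                  [-3,  0,  3, 0],
                  [ 1,  4,  1, 0]]) / 6.0

    def bspline_point(ctrl, t):
        """Evaluate the curve at global parameter t in [0, len(ctrl)-3]."""
        i = min(int(t), len(ctrl) - 4)       # active segment
        u = t - i                            # local parameter in [0, 1]
        return np.array([u**3, u**2, u, 1.0]) @ M @ ctrl[i:i + 4]

    def bspline_tangent(ctrl, t):
        """d(point)/dt: the Jacobian ingredient for data-to-curve residuals."""
        i = min(int(t), len(ctrl) - 4)
        u = t - i
        return np.array([3 * u**2, 2 * u, 1.0, 0.0]) @ M @ ctrl[i:i + 4]

    # Assumed control polygon; evaluate mid-curve.
    ctrl = np.array([[0, 0], [1, 2], [2, 2], [3, 0], [4, 1]], float)
    print(bspline_point(ctrl, 1.3), bspline_tangent(ctrl, 1.3))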

1230 Lunch

1400 Session IV: Robustness and speed

  • Models
    • linear blend skinning (LBS), blendshapes, NURBS, lower order…
  • Priors/smoothers/convergers
    • ARAP (as-rigid-as-possible)
    • Background term: a distance transform (DT) is OK for tracking, but not for personalization
    • Priors on correspondences, e.g. piecewise continuous contour generator
  • Exposing structure in sum-of-squares form
  • Error metric
    • Robust terms
    • square-root trick
    • A great example of where “lifting” really helps (sketched after this list)
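
As a preview of the robustness discussion, here is a sketch under stated assumptions: made-up line data with gross outliers, the Cauchy kernel for the square-root trick, and one simple lifted kernel (a per-residual confidence weight w_i with penalty (w_i^2 - 1)^2), rather than any specific kernel from the tutorial.

    import numpy as np
    from scipy.optimize import least_squares

    # Illustrative 1-D line-fit data with gross outliers (assumed).
    x = np.linspace(0, 1, 20)
    y = 2.0 * x + 0.5
    y[::5] += 4.0                         # every fifth point is an outlier

    def r(p):                             # plain residuals of the line fit
        return p[0] * x + p[1] - y

    # Square-root trick: hand sqrt(rho(r^2)) to a least-squares solver so
    # that summing the squares reproduces sum_i rho(r_i^2). Here rho is
    # the Cauchy kernel rho(s) = log(1 + s).
    def sqrt_rho_residuals(p):
        return np.sqrt(np.log1p(r(p) ** 2))

    # Lifting: one weight w_i per residual becomes an unknown, turning the
    # robust cost into a plain sum of squares; the effective kernel of this
    # particular penalty saturates for large residuals.
    def lifted_residuals(z):
        p, w = z[:2], z[2:]
        return np.concatenate([w * r(p), w ** 2 - 1.0])

    print(least_squares(sqrt_rho_residuals, np.zeros(2)).x)
    z0 = np.concatenate([np.zeros(2), np.ones(x.size)])
    print(least_squares(lifted_residuals, z0).x[:2])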

1500 Coffee/Stretch

1515 Session V: Software

  • OpenSubdiv
  • Eigen
  • Ceres
  • Opt (guest lecture from Matthias Niessner)
  • AD tools: Theano, etc. (see the sketch after this list)
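
As a taste of the AD tools, here is a minimal Theano sketch computing exact gradients of a least-squares cost by automatic differentiation; the exponential model and data are assumptions for illustration only.

    import numpy as np
    import theano
    import theano.tensor as T

    # Symbolic residual r(a, b) = a*exp(-b*t) - y for data vectors t, y.
    t, y = T.dvector('t'), T.dvector('y')
    a, b = T.dscalar('a'), T.dscalar('b')
    cost = 0.5 * T.sum((a * T.exp(-b * t) - y) ** 2)

    # Exact gradients by automatic differentiation: no finite differences.
    grads = T.grad(cost, [a, b])
    f = theano.function([a, b, t, y], [cost] + grads)

    ts = np.linspace(0, 2, 40)
    ys = 3.0 * np.exp(-1.5 * ts)
    print(f(1.0, 1.0, ts, ys))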

1615 More coffee, more stretching

1630 Session VI: Conclusions, open problems, misc…

  • Topology adaptation
  • Where are the local minima?
  • And where lifting really hurts: variable projection (VarPro) algorithms
  • Implementing rotations: quaternions vs infinitesimals with recentering (sketched after this list)
  • derivatives of minimization problems
  • Schur complement QR
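
A sketch of the infinitesimal-plus-recentering scheme for rotations, assuming a right-multiplied increment: optimize a small axis-angle vector w around zero, then "recenter" by folding exp([w]x) into the base rotation. The solver step w_step below is a placeholder value, not the output of a real solver.

    import numpy as np

    def skew(w):
        """Cross-product matrix [w]x of a 3-vector."""
        return np.array([[    0, -w[2],  w[1]],
                         [ w[2],     0, -w[0]],
                         [-w[1],  w[0],     0]])

    def expm_so3(w):
        """Rodrigues' formula: exp of the skew of an axis-angle vector."""
        th = np.linalg.norm(w)
        if th < 1e-12:
            return np.eye(3) + skew(w)   # first-order fallback near zero
        K = skew(w / th)
        return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

    # Keep a base rotation R and optimize a small increment w around w = 0;
    # after each accepted step, fold the increment into R and reset w. This
    # sidesteps Euler-angle gimbal problems and the quaternion unit-norm
    # constraint, at the cost of one recentering per iteration.
    R = np.eye(3)                          # current base rotation
    w_step = np.array([0.0, 0.0, 0.1])     # placeholder solver step
    R = R @ expm_so3(w_step)               # recenter: absorb step into R
    print(R)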

1715 Close
