CVPR 2012 Day 2: optimize, optimize, optimize



Due to popular request, here is my overview of some of the coolest stuff from Day 2 of CVPR 2012 in Providence, RI.  While the lobster dinner was the highlight for many of us, there were also some serious learning/optimization-based papers presented during Day 2 worth sharing.  Here are some of the papers which left me with a very positive impression.


Dennis Strelow of Google Research in Mountain View presented a general framework for Wiberg minimization.  This is a strategy for minimizing objective functions of two sets of variables -- objectives which are typically tackled in an EM-style, alternating fashion.  The idea is to solve for one set of variables in closed form given the other (a linear least-squares problem) and substitute it back, so that the objective effectively depends on only one set of variables.  The technique is quite general and has been shown to produce state-of-the-art results on a bundle adjustment problem.  I know Dennis from my second internship at Google, where we worked on some sparse-coding problems.  If you solve lots of matrix factorization problems, check out his paper!
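
To give a flavor of the variable-elimination idea in code, here is a tiny toy sketch on a low-rank matrix factorization problem.  This is my own simplification, not the paper's method: I use plain fixed-step gradient steps instead of the Gauss-Newton treatment, there is no missing data, and all function names are mine.

```python
import numpy as np

# Toy sketch of the Wiberg-style elimination idea:
# minimize ||Y - U V||_F^2 over U (m x r) and V (r x n).
# Instead of alternating between U and V (EM-style), solve for V in closed
# form given U, so the objective becomes a function of U alone.

def solve_v(Y, U):
    # Closed-form least-squares solution for V given U.
    return np.linalg.lstsq(U, Y, rcond=None)[0]

def wiberg_style_factorization(Y, rank, iters=200, step=1e-2):
    m, n = Y.shape
    U = np.random.randn(m, rank)
    for _ in range(iters):
        V = solve_v(Y, U)            # V is eliminated: V = V(U)
        R = Y - U @ V                # residual, now a function of U only
        grad_U = -R @ V.T            # gradient of 0.5*||R||_F^2 w.r.t. U
        U -= step * grad_U           # toy gradient step (the real method
                                     # takes Gauss-Newton steps in U)
    return U, solve_v(Y, U)

# Example: recover a rank-3 factorization of a random rank-3 matrix.
Y = np.random.randn(30, 3) @ np.random.randn(3, 20)
U, V = wiberg_style_factorization(Y, rank=3)
```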


Dennis Strelow
General and Nested Wiberg Minimization
CVPR 2012


Another cool paper which is all about learning is Hossein Mobahi's algorithm for optimizing objectives by smoothing them, so as to avoid getting stuck in local minima.  This paper is not about blurry images, but about applying Gaussians to objective functions.  In fact, for the problem of image alignment, Hossein provides closed-form versions of the corresponding image operators.  When you apply these operators to images, you efficiently smooth the underlying cross-correlation alignment objective.  You then decrease the blur while following the optimum path, and get much nicer answers than you would with naive image alignment.
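
If you have never played with this style of optimization, here is a toy continuation sketch: blur a bumpy 1D objective, find the optimum of the heavily smoothed version, and track it as the blur shrinks.  This is only meant to convey the coarse-to-fine idea; the paper's contribution is doing the smoothing in closed form via image operators, which this sketch does not reproduce, and the objective and schedule below are made up.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Toy continuation: global search on the heavily smoothed objective,
# then only local refinement around the tracked optimum at finer scales.
def smoothed_argmin(f_vals, sigmas, window=40):
    idx = None
    for sigma in sigmas:                           # largest blur first
        g = gaussian_filter1d(f_vals, sigma)       # smoothed objective
        if idx is None:
            idx = int(np.argmin(g))                # global search once
        else:
            lo, hi = max(0, idx - window), min(len(g), idx + window + 1)
            idx = lo + int(np.argmin(g[lo:hi]))    # refine near previous optimum
    return idx

x = np.linspace(-4, 4, 801)
f = (x - 1.5) ** 2 + 0.5 * np.sin(12 * x)          # many shallow local minima
best = smoothed_argmin(f, sigmas=[60, 30, 15, 5, 1])
print(x[best])                                     # lands near the global basin, x ~ 1.4
```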


Hossein Mobahi, C. Lawrence Zitnick, Yi Ma
Seeing through the Blur
CVPR 2012


Ira Kemelmacher-Shlizerman, of Photobios fame, showed a really cool algorithm for computing optical flow between two different faces based on learning a subspace (using a large database of faces).  The idea is quite simple and allows flowing between two very different faces, where the underlying operation produces a sequence of intermediate faces in an interpolation-like manner.  She shared this video with us during her presentation, but it is on YouTube, so now you can enjoy it for yourself.
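
My rough mental model of the subspace trick, sketched below, is to project a face onto a low-rank subspace learned from the collection and then compute flow against that projection rather than directly between two very different faces.  This is my own simplification, not the paper's exact pipeline, and I am using OpenCV's off-the-shelf Farneback flow as a stand-in for whatever flow the authors actually use.

```python
import numpy as np
import cv2

def lowrank_projection(collection, image, k=10):
    # collection: (N, H*W) matrix of vectorized, roughly aligned face images.
    mean = collection.mean(axis=0)
    U, S, Vt = np.linalg.svd(collection - mean, full_matrices=False)
    basis = Vt[:k]                                   # top-k PCA directions
    coeffs = (image.ravel() - mean) @ basis.T
    return (mean + coeffs @ basis).reshape(image.shape)

def flow_to_projection(image, collection, k=10):
    # Dense flow from the original grayscale face to its low-rank projection.
    proj = lowrank_projection(collection, image.astype(np.float32), k)
    return cv2.calcOpticalFlowFarneback(
        image.astype(np.uint8), proj.clip(0, 255).astype(np.uint8),
        None, 0.5, 3, 15, 3, 5, 1.2, 0)
```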


Ira Kemelmacher-Shlizerman, Steven M. Seitz
Collection Flow
CVPR 2012



Now talk about cool ideas!  Pyry, of CMU fame, presented a recommendation engine for classifiers.  The idea is to take techniques from collaborative filtering (think Netflix!) and apply them to the classifier selection problem.  Pyry has been working on action recognition, and the ideas presented in this work are not only quite general, but also quite intuitive and likely to benefit anybody working with large collections of classifiers.
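
Here is a hedged sketch of the analogy: treat tasks as users, classifiers as movies, and performance scores as ratings, then use a low-rank model to predict how unevaluated classifiers would perform on a new task.  The SVD-based factorization and the probe setup below are my own illustration, not the paper's actual recommendation machinery.

```python
import numpy as np

def recommend_classifier(perf, probe_scores, rank=3):
    """perf: (tasks x classifiers) matrix of past performance scores.
    probe_scores: {classifier_index: score} measured on the new task."""
    U, S, Vt = np.linalg.svd(perf, full_matrices=False)
    V = Vt[:rank]                                    # latent classifier factors (rank x C)
    idx = np.array(list(probe_scores.keys()))
    y = np.array(list(probe_scores.values()))
    # Fit the new task's latent factors from the few probe evaluations.
    w, *_ = np.linalg.lstsq(V[:, idx].T, y, rcond=None)
    predicted = w @ V                                # predicted score for every classifier
    return int(np.argmax(predicted)), predicted

# Example: 6 past tasks, 8 classifiers, 3 probe evaluations on a new task.
perf = np.random.rand(6, 8)
best, scores = recommend_classifier(perf, {0: 0.61, 3: 0.72, 5: 0.40})
```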

Pyry Matikainen, Rahul Sukthankar, Martial Hebert
Model Recommendation for Action Recognition
CVPR 2012


And finally, a super-easy algorithm for metric learning presented by Martin Köstinger had me intrigued!  This is a Mahalanobis distance metric learning paper which uses equivalence relationships.  This means that you are given pairs of similar items and pairs of dissimilar items.  The underlying algorithm is really not much more than fitting two covariance matrices, one to the positive equivalence relations and another to the non-equivalence relations.  They have lots of code online, and if you don't believe that such a simple algorithm can beat LMNN (Large-Margin Nearest Neighbor, from Kilian Weinberger), then get their code and hack away!
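
In case the simplicity claim sounds too good to be true, here is roughly what "fitting two covariance matrices" amounts to in code.  This is my reading of the idea, not the authors' released implementation; the regularization and the eigenvalue clipping at the end are standard tricks I added to keep the learned matrix a valid (pseudo-)metric.

```python
import numpy as np

def fit_metric(X, similar_pairs, dissimilar_pairs, eps=1e-6):
    # Covariances of pairwise feature differences for the two kinds of pairs.
    d_sim = np.array([X[i] - X[j] for i, j in similar_pairs])
    d_dis = np.array([X[i] - X[j] for i, j in dissimilar_pairs])
    cov_sim = d_sim.T @ d_sim / len(d_sim)
    cov_dis = d_dis.T @ d_dis / len(d_dis)
    I = np.eye(X.shape[1])
    # Mahalanobis matrix as the difference of the inverse covariances.
    M = np.linalg.inv(cov_sim + eps * I) - np.linalg.inv(cov_dis + eps * I)
    # Clip negative eigenvalues so M defines a valid (pseudo-)metric.
    w, V = np.linalg.eigh(M)
    return (V * np.clip(w, 0, None)) @ V.T

def mahalanobis(M, x, y):
    d = x - y
    return float(d @ M @ d)
```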

Martin Köstinger, Martin Hirzer, Paul Wohlhart, Peter M. Roth, Horst Bischof
Large Scale Metric Learning from Equivalence Constraints
CVPR 2012



CVPR 2012 gave us many very math-oriented papers, and while I cannot list all of them, I hope you found my short list useful.




