Sunday, December 12, 2010

NIPS 2010 highlights

A few "connection to other fields" papers I found interesting

  • Variational Inference over Combinatorial Spaces. Alexandre Bouchard-Cote, Michael Jordan:
    Extends the variational formulation to upper-bound the number of combinatorial structures with global constraints, such as perfect matchings and TSP tours. The trick is to represent the space of structures as an intersection of spaces that are each tractable to count. For perfect matchings, it is tractable to count edge sets that satisfy the perfect-matching constraint on one side only, so the total space can be represented as the intersection of edge sets that are "left-legal" and "right-legal" (see the first sketch after this list). In experiments, the algorithm gave the same accuracy for perfect matchings as an existing PTAS based on Jerrum's technique, but ran about 10 times faster.

  • Learning Efficient Markov Networks. Vibhav Gogate, William Webb, Pedro Domingos:
    Learns high-tree-width models while keeping inference efficient by restricting potential functions to a compact representation as a tree of AND/OR predicates. The learning problem is reduced to symmetric submodular function minimization, which can be done in $O(n^3)$ time with Queyranne's algorithm (see the second sketch after this list). For their experiments $O(n^3)$ was still too slow, but they give a heuristic that achieves state-of-the-art results on a practical dataset.

  • MAP estimation in Binary MRFs via Bipartite Multi-cuts. Sashank Jakkam Reddi, Sunita Sarawagi, Sundar Vishwanathan:
    Reduces MAP inference in binary Markov random fields to a multi-commodity flow problem.

  • Worst-case bounds on the quality of max-product fixed-points. Meritxell Vinyals, Jesús Cerquides, Alessandro Farinelli, Juan Antonio Rodríguez-Aguilar:
    Uses an inequality recently obtained in the multi-agent literature to bound the difference between the true max-marginals and the max-product max-marginals in terms of graph structure.
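
To make the "tractable on one side" idea from the Bouchard-Cote and Jordan paper concrete, here is a small Python sketch of my own (the toy graph and the brute-force matching count are made up for illustration, not from the paper): counting "left-legal" edge sets, where every left vertex uses exactly one incident edge, is trivial, while the perfect matchings are exactly the edge sets that are simultaneously left-legal and right-legal.

```python
from itertools import product

# Toy bipartite graph: adjacency list from left vertices to right vertices.
# This particular graph is invented for illustration.
adj = {0: [0, 1], 1: [1, 2], 2: [0, 2]}
n_right = 3

def count_left_legal(adj):
    """Edge sets where every left vertex picks exactly one incident edge.
    Tractable: just the product of left-vertex degrees."""
    count = 1
    for neighbors in adj.values():
        count *= len(neighbors)
    return count

def count_perfect_matchings(adj, n_right):
    """Left-legal AND right-legal (each right vertex used exactly once).
    Brute force over left choices -- exponential, for illustration only."""
    count = 0
    for choice in product(*adj.values()):   # one right vertex per left vertex
        if len(set(choice)) == len(adj) == n_right:
            count += 1
    return count

print(count_left_legal(adj))                   # 8 left-legal edge sets
print(count_perfect_matchings(adj, n_right))   # 2 perfect matchings
```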
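
Queyranne's algorithm itself is standard, so here is a hedged sketch of it (my own implementation, not the authors' code): it minimizes a symmetric submodular function over nonempty proper subsets by repeatedly finding and merging a pendant pair, using $O(n^3)$ evaluations of the function. The cut function of a small weighted graph serves as a made-up test case.

```python
def queyranne(elements, f):
    """Minimize a symmetric submodular set function f over nonempty proper
    subsets of `elements` via Queyranne's pendant-pair algorithm.
    Uses O(n^3) evaluations of f."""
    groups = [frozenset([e]) for e in elements]   # current merged super-nodes
    best_val, best_set = float('inf'), None
    while len(groups) > 1:
        # Build a maximum-adjacency-style ordering of the super-nodes.
        order = [0]
        while len(order) < len(groups):
            W = frozenset().union(*(groups[i] for i in order))
            rest = [i for i in range(len(groups)) if i not in order]
            # Pick the super-node "most tightly connected" to W.
            nxt = min(rest, key=lambda i: f(W | groups[i]) - f(groups[i]))
            order.append(nxt)
        t, u = order[-2], order[-1]               # pendant pair
        if f(groups[u]) < best_val:               # candidate minimizer
            best_val, best_set = f(groups[u]), groups[u]
        merged = groups[t] | groups[u]            # merge the pendant pair
        groups = [g for i, g in enumerate(groups) if i not in (t, u)] + [merged]
    return best_val, best_set

# Toy symmetric submodular function: the cut function of a small weighted graph.
edges = {(0, 1): 3.0, (1, 2): 1.0, (2, 3): 3.0, (0, 3): 1.0}
def cut(S):
    return sum(w for (a, b), w in edges.items() if (a in S) != (b in S))

print(queyranne([0, 1, 2, 3], cut))   # minimum cut {2, 3} with value 2.0
```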


Some interesting applied papers

  • Segmentation as maximum weight independent set. William Brendel, Sinisa Todorovic:
    You want to segment an image into regions, but the initial candidate regions are overlapping fragments. Real segmentations don't have overlaps, so this is turned into a maximum weight independent set (MWIS) problem to find a good solution satisfying these mutual-exclusivity constraints (see the first sketch after this list).

  • Efficient Minimization of Decomposable Submodular Functions. Peter Stobbe, Andreas Krause:
    Applies some recent techniques from non-smooth optimization to the Lovász extension of a submodular function (see the second sketch after this list).

  • Optimal Web-Scale Tiering as a Flow Problem. Gilbert Leung, Novi Quadrianto, Alexander Smola, Kostas Tsioutsiouliklis:
    Linear integer programming to optimize the cache arrangement for 84 million web pages on Yahoo servers.
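
A minimal sketch of the MWIS formulation from the Brendel and Todorovic summary, under assumptions of my own: candidate regions are pixel sets with a quality weight, overlapping regions conflict, and a brute-force search picks the best non-overlapping subset (the paper of course uses a far more scalable solver).

```python
from itertools import combinations

# Hypothetical candidate regions: (set of pixel ids, quality weight).
regions = [
    ({1, 2, 3}, 2.0),
    ({3, 4}, 1.5),
    ({4, 5, 6}, 2.5),
    ({7, 8}, 1.0),
]

def best_segmentation(regions):
    """Brute-force maximum weight independent set in the overlap-conflict graph:
    choose non-overlapping regions maximizing total weight."""
    best_weight, best_subset = 0.0, ()
    n = len(regions)
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            pixels = [regions[i][0] for i in subset]
            # Independence = no two chosen regions share a pixel.
            if all(a.isdisjoint(b) for a, b in combinations(pixels, 2)):
                weight = sum(regions[i][1] for i in subset)
                if weight > best_weight:
                    best_weight, best_subset = weight, subset
    return best_weight, best_subset

print(best_segmentation(regions))   # (5.5, (0, 2, 3)) -- regions 0, 2 and 3
```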
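
The Lovász extension mentioned in the Stobbe and Krause summary has a simple closed form; below is a minimal Python version of that formula (the example set function is made up, and the paper's contribution is the optimization machinery applied on top of this extension, not the formula itself).

```python
def lovasz_extension(f, x):
    """Lovász extension of a set function f (on subsets of range(len(x)))
    at a point x in [0,1]^n: sort coordinates in decreasing order and take
    a weighted sum of f over the resulting threshold sets."""
    order = sorted(range(len(x)), key=lambda i: -x[i])
    value, prev_set = 0.0, frozenset()
    for i in order:
        cur_set = prev_set | {i}
        value += x[i] * (f(cur_set) - f(prev_set))
        prev_set = cur_set
    return value

# On 0/1 indicator vectors the extension agrees with f itself.
f = lambda S: min(len(S), 2)                   # a toy submodular function
print(lovasz_extension(f, [1.0, 0.0, 1.0]))    # equals f({0, 2}) = 2.0
```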



At least four groups (Bach, Mosci, Jia, Micchelli) had papers on "structured sparsity". The traditional approach to sparse learning is to maximize the number of zeros in the parameter vector; the new idea is to also incorporate knowledge about the pattern of zero values.
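
As a concrete instance of encoding a preferred zero pattern, here is a hedged sketch of a group-lasso proximal step (block soft-thresholding), one common way to get structured sparsity: whole pre-specified groups of coefficients are shrunk together and can become exactly zero. The groups and threshold below are invented for illustration.

```python
import math

def group_soft_threshold(w, groups, threshold):
    """Proximal operator of the group-lasso penalty: shrink each group's
    coefficients toward zero together, so entire groups become exactly zero."""
    out = list(w)
    for group in groups:
        norm = math.sqrt(sum(w[i] ** 2 for i in group))
        scale = max(0.0, 1.0 - threshold / norm) if norm > 0 else 0.0
        for i in group:
            out[i] = w[i] * scale
    return out

# Coefficients 0-2 form one group, 3-4 another; the weak group is zeroed entirely.
w = [0.9, -1.2, 0.4, 0.05, -0.1]
print(group_soft_threshold(w, [[0, 1, 2], [3, 4]], threshold=0.3))
```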

Andy Mueller has some more highlights
