Thursday, November 03, 2005

NIPS pre-prints

A Google search reveals the following pre-prints associated with NIPS 2005:



  • Lafferty, Blei, Correlated Topic Models
  • Lafferty, Wasserman - Rodeo: Sparse Nonparametric Regression in High Dimensions
  • Tong Zhang and Rie K. Ando. Analysis of Spectral Kernel Design based Semi-supervised Learning.
  • Philipp Häfliger, et al, AER Building Blocks for Multi-Layer Multi-Chip Neuromorphic Vision Systems
  • Paninski, L. - Inferring prior probabilities from Bayes-optimal behavior
  • Ahrens, M., Huys, Q. & Paninski, L. Large-scale biophysical parameter estimation in single neurons via constrained linear regression
  • Jack M. Wang, David J. Fleet, Aaron Hertzmann. Gaussian Process Dynamical Models.
  • J.-P. Vert, R. Thurman and W. S. Noble, "Kernels for gene regulatory regions"
  • Firas Hamze and Nando de Freitas. Hot Coupling: A Particle Approach to Inference and Normalization on Pairwise Undirected Graphs
  • Jason E. Holt, On the Job Training
  • Marco Cuturi, Kenji Fukumizu - Multiresolution Kernels
  • Fan Li, Yiming Yang, Eric P. Xing. From lasso regression to feature vector machine.
  • Jian Zhang, Zoubin Ghahramani and Yiming Yang. Learning Multiple Related Tasks using Latent Independent Component Analysis.
  • James Diebel, An Application of Markov Random Fields to Range Sensing
  • A Computational Model of Eye Movements during Object Class Detection, Wei Zhang, Dimitris Samaras, Hyejin Yang, Greg Zelinsky
  • Cue Integration in Figure/Ground Labeling - Xiaofeng Ren, Charless Fowlkes
  • Nelson, JD; Cottrell, GW; Filimon, F; Sejnowski, T (2005, Dec). Optimal experimental design models of naive human information acquisition. NIPS 2005
  • Peter Gehler and Max Welling - Products of "Edge-Perts"
  • Cesa-Bianchi, Improved risk tail bounds for on-line algorithms
  • Semi-supervised Learning with Penalized Probabilistic Clustering - Zhengdong Lu, Todd Leen. NIPS 2005
  • LeCun, et al - Off-Road Obstacle Avoidance through End-to-End Learning
  • Onureena Banerjee, Alexandre d'Aspremont, Laurent El Ghaoui - Sparse Covariance Selection via Robust Maximum Likelihood Estimation
  • Glenn Fung, Romer Rosales, Balaji Krishnapuram - Learning Rankings via Convex Hull Separations
  • Maximum Margin Semi-Supervised Learning for Structured Variables - Y. Altun, D. McAllester, M. Belkin
  • A Domain Decomposition Method for Fast Manifold Learning. Zhenyue Zhang and Hongyuan Zha
  • C. Scott and R. Nowak, "Learning Minimum Volume Sets"
  • Huys, Q.J.M., Zemel, R., Natarajan, R., Dayan, P.M. - Fast Population Coding
  • "Two view learning: SVM-2K, Theory and Practice" by Jason D. R. Farquhar, David R. Hardoon, Hongying Meng, John Shawe-Taylor and Sandor Szedmak
  • N. Loeff, A. Sorokin, H. Arora and D.A. Forsyth, "Efficient Unsupervised Learning for Localization and Detection in Object Categories", NIPS 2005
  • T. Roos, P. Grünwald, P. Myllymäki and H. Tirri. Generalization to Unseen Cases
  • Mooij, J., & Kappen, H.J. Validity estimates for loopy belief propagation on binary real-world networks
  • Jorge G. Silva, Jorge S. Marques, João M. Lemos, Selecting Landmark Points for Sparse Manifold Learning
  • Fleuret, F. and Blanchard, G. - Pattern Recognition from One Example by Chopping
  • A Probabilistic Approach for Optimizing Spectral Clustering - R. Jin, C. Ding and F. Kang
  • Keiji Miura, Masato Okada, Shun-ichi Amari - Unbiased Estimator of Shape Parameter for Spiking Irregularities under Changing Environments
  • Blatt D. and Hero A. O., "From weighted classification to policy search"
  • A Connectionist Model for Constructive Modal Reasoning - A. d'Avila Garcez, Luis C. Lamb and Dov Gabbay
  • J. Ting, A. D'Souza, K. Yamamoto, T. Yoshioka, D. Hoffman, S. Kakei, L. Sergio, J. Kalaska, M. Kawato, P. Strick and S. Schaal. Predicting EMG Data from M1 Neurons with Variational Bayesian Least Squares.
  • G. L. Li, T.-Y. Leong, Learning Causal Bayesian Network with Constraints from Domain Knowledge
  • G. L. Li, T.-Y. Leong, Learning Bayesian Networks with Variable Grouping Methods.
  • L. Itti and P. Baldi. "Bayesian Surprise Attracts Human Attention"
  • Q-Clustering - M. Narasimhan, N. Jojic and J. Bilmes
  • Nando de Freitas, Yang Wang, Maryam Mahdaviani, Dustin Lang. Fast Krylov Methods for N-Body Learning.
  • Consistency of One-Class SVM and Related Algorithms, R. Vert and J.-P. Vert
  • A. J. Bell and L. C. Parra, "Maximising sensitivity in a spiking network,"
  • R. S. Zemel, Q. J. M. Huys, R. Natarajan, and P. Dayan, "Probabilistic computation in spiking populations," in Advances in NIPS, 2005
  • C. Yang, R. Duraiswami and L. Davis: "Efficient Kernel Machines Using the Improved Fast Gauss Transform", NIPS 2005
  • F. Orabona. Object-based Model of the Visual Attention for Imitation
  • Jieping Ye, Ravi Janardan, Qi Li. Two-Dimensional Linear Discriminant Analysis
  • L. Song, E. Gordon, and E. Gysels, "Phase Synchrony Rates for the Recognition of Motor Imageries in BCIs"
  • Kakade, S., M. Seeger and D. Foster: Worst-Case Bounds for Gaussian Process Models.
  • Shen, Y., A. Ng and M. Seeger: Fast Gaussian Process Regression using KD-Trees.
  • Kevin J. Lang - Fixing Two Weaknesses of the Spectral Method
  • Kari Torkkola and Eugene Tuv. Ecumenical kernels from random forests
  • C. Molter and U. Salihoglu and H. Bersini, Storing information through complex dynamics in Recurrent Neural Networks
  • Z. Nenadic, D.S. Rizzuto, R.A. Andersen, and J.W. Burdick, "Discriminant based feature selection with information theoretic objective"
  • Lackey, J. and Colagrosso, M (2005). A Kernel Method for Polychotomous Classification
  • J. A. Palmer, K. Kreutz-Delgado, D. P. Wipf, and B. D. Rao, Variational EM Algorithms for Non-Gaussian Latent Variable Models
  • Singh, S., Barto, A., and Chentanez, N. (2005). Intrinsically motivated reinforcement learning
  • Yun-Gang Zhang, Changshui Zhang, Separation of Music Signals by Harmonic Structure Modeling.
  • Rob Powers, Yoav Shoham, New Criteria and a New Algorithm for Learning in Multi-Agent Systems
  • Wood, F., Roth, S., and Black, M. J., "Modeling neural population spiking activity with Gibbs distributions"
  • Hinton, G. E. and Nair, V. Inferring motor programs from images of handwritten digits.
  • "Measuring Shared Information and Coordinated Activity in Neuronal Networks." K. Klinkner, C. Shalizi, and M. Camperi, NIPS, 2005.
  • Generalisation Error Bounds for Classifiers Trained with Interdependent Data - Usunier N., Amini M.-R., Gallinari P.
  • Brafman, R. I. and Shani, G. "Resolving perceptual aliasing with noisy sensors"
  • Griffiths, T.L., and Ghahramani, Z. (to appear) Infinite Latent Feature Models and the Indian Buffet Process
  • Zhang, J., Ghahramani, Z. and Yang, Y. (to appear) Learning Multiple Related Tasks using Latent Independent Component Analysis.
  • Murray, I., MacKay, D.J.C., Ghahramani, Z. and Skilling, J. (to appear) Nested Sampling for Potts Models.
  • Snelson, E. and Ghahramani, Z. (to appear) Sparse Parametric Gaussian Processes
  • Ghahramani, Z. and Heller, K.A. (to appear) Bayesian Sets
  • On the Convergence of Eigenspaces in Kernel Principal Component Analysis: G. Blanchard and L. Zwald
  • Poupart, P. and Boutilier, C. VDCBPI: an Approximate Scalable Algorithm for Large Scale POMDPs
  • T. Murayama and P. Davis, Rate distortion codes for sensor networks: A system-level analysis,
  • Damian Eads, Karen Glocer, Simon Perkins, and James Theiler (2005). Grammar-guided feature extraction for time series classification.
  • Doi, E., Balcan, D. C., & Lewicki, M. S. A theoretical analysis of robust coding over noisy overcomplete channels
  • Nadler, Boaz; Lafon, Stephane; Coifman, Ronald R.; Kevrekidis, Ioannis G. - Diffusion Maps, Spectral Clustering and Eigenfunctions of Fokker-Planck operators
  • "Abstractions and Hierarchies for Learning and Planning" Lihong Li
  • "Walk-sum interpretation and analysis of Gaussian belief propagation ," Jason K. Johnson, Dmitry M. Malioutov, and Alan S. Willsky
  • Baker C., Tenenbaum J., & Saxe R. - Bayesian models of perceiving intentional action
  • Tensor Subspace Analysis - Xiaofei He, Deng Cai, and Partha Niyogi
  • Laplacian Score for Feature Selection - Xiaofei He, Deng Cai, and Partha Niyogi.
  • Shunji Satoh, "Long-Range Horizontal Connections in V1 Serve to Multi-Scale Image Reconstruction"
  • "Faster Rates in Regression via Active Learning" Rebecca Willett with R. Castro and R. Nowak,
  • Hyperparameter and Kernel Learning for Graph Based Semi-Supervised Classification - Ashish Kapoor, Yuan (Alan) Qi, Hyungil Ahn and Rosalind W. Picard
  • Maxim Raginsky, Svetlana Lazebnik - Estimation of intrinsic dimensionality using high-rate vector quantization
  • Ofer Dekel, Shai Shalev-Shwartz, Yoram Singer - The Forgetron: A Kernel-Based Perceptron on a Fixed Budget
  • Patrick Flaherty, Michael Jordan, Adam Arkin - Robust Design of Biological Experiments
  • Yael Niv, Nathaniel Daw, Peter Dayan - How Fast to Work: Response Vigor, Motivation and Tonic Dopamine
  • Learning from Data of Variable Quality - Koby Crammer, Michael Kearns, and Jennifer Wortman
  • R. Kondor and T. Jebara - Gaussian and Wishart Hyperkernels
  • L. Liao, D. Fox, and H. Kautz - Location-Based Activity Recognition
  • David Silver, Richard Sutton, Martin Müller, Markus Enzenberger - Reinforcement Learning of Local Shape in the Game of Atari-Go


Comments:

  1. Is this relevant to anything?

  2. Anonymous, 2:25 PM

    This is relevant as NIPS is one of the principal machine learning conferences in the world. Thanks for compiling this list Yaroslav.

  3. Actually this list seems to be out of date already, since the complete list of accepted papers is posted here:
    http://www.nips.cc/Conferences/2005/Posters/
