Thursday, June 28, 2007

Cool Papers at ICML 07

Here are a few that caught my eye:

- Scalable Training of L1-regularized Log-linear Models -- the main idea is to run L-BFGS in an orthant where the gradient of the L1 penalty doesn't change. Each time L-BFGS tries to step out of that orthant, project its new point back onto the old orthant and figure out the new orthant to explore.
- Discriminative Learning for Differing Training and Test Distributions -- in addition to learning P(Y|X,t1), also learn P(this point is from test data|X). You can use logistic regression to model P(this point is from test data|X,t2), and then weight each training point by that value when learning P(Y|X). Alternatively, you can learn both distributions simultaneously by maximizing P(Y|X,t1,t2) on the test data, which gives even better results.
- On One Method of Non-Diagonal Regularization in Sparse Bayesian Learning -- the Relevance Vector Machine "fits" a diagonal Gaussian prior to data by maximizing P(data|prior). In the paper they get a tractable method of fitting Laplace/Gaussian priors with non-diagonal covariance matrices by first transforming the parameters to a basis which decorrelates them at the point of maximum likelihood.
- Piecewise Pseudolikelihood for Efficient Training of Conditional Random Fields -- doing pseudo-likelihood training (replacing p(y1,y2|x) with p(y1|y2,x)p(y2|y1,x)) on small pieces of the graph (piece-wise training) gives better accuracy than pseudo-likelihood training on the true graph.
- CarpeDiem: an Algorithm for the Fast Evaluation of SSL Classifiers -- a useful trick for doing Viterbi faster: don't bother computing forward values for nodes which are certain not to be included in the best path. You know a node will not be on the best path if a+b+c is smaller than some other forward value on the same level, where a is the largest forward value on the previous level, b is the largest possible transition weight, and c is the "emission" weight for that node.
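The orthant-projection step from the L1-regularization paper can be sketched in a few lines. This is just my illustration of the idea, not the paper's code, and the function name is made up:

```python
import numpy as np

def project_to_orthant(x_new, orthant_signs):
    """Keep only the coordinates of the proposed point that stay in the
    current orthant; coordinates that crossed zero are projected to 0.
    (project_to_orthant is my name for this step, not the paper's.)"""
    return np.where(np.sign(x_new) == orthant_signs, x_new, 0.0)

# A proposed L-BFGS point stepping out of the orthant (+1, -1, +1):
# the second and third coordinates changed sign, so they get zeroed,
# and the pattern of zeroed coordinates hints at the next orthant.
x_proj = project_to_orthant(np.array([0.5, 0.2, -0.1]),
                            np.array([1.0, -1.0, 1.0]))
```

The projection keeps the L1 term differentiable along the whole step, since no coordinate is allowed to flip sign inside the orthant being explored.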
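The weighting trick from the differing-distributions paper can be sketched with scikit-learn. The data and names below are my own toy setup, not the paper's experiments:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy setup (mine, not the paper's): training inputs come from N(0, 1)
# while test inputs come from N(1, 1), so the distributions differ.
rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 1.0, size=(200, 1))
X_test = rng.normal(1.0, 1.0, size=(200, 1))

# Step 1: fit P(this point is from test data | x) by logistic regression
# on the pooled inputs, labeled by which set each point came from.
X_all = np.vstack([X_train, X_test])
is_test = np.concatenate([np.zeros(200, dtype=int), np.ones(200, dtype=int)])
domain_model = LogisticRegression().fit(X_all, is_test)

# Step 2: weight each training point by P(test | x); these weights would
# then be passed as sample_weight when fitting the final P(Y|X) model.
weights = domain_model.predict_proba(X_train)[:, 1]
```

Training points that look most like test points get the largest weights, so the final P(Y|X) model concentrates on the region where it will actually be evaluated.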
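The pruning trick from CarpeDiem can be sketched as below. This follows only the one-line description above (the actual algorithm is lazier and more careful about when forward values get computed), and all names are mine:

```python
import numpy as np

def pruned_viterbi_layer(forward_prev, transition, emission):
    """Compute one layer of Viterbi forward values, skipping nodes that
    cannot be on the best path (a sketch; names and structure are mine).

    forward_prev: best-path score to each state at the previous level
    transition[i, j]: weight of the transition from state i to state j
    emission[j]: "emission" weight of state j at the current level
    """
    a = forward_prev.max()   # largest forward value on the previous level
    b = transition.max()     # largest possible transition weight
    forward = np.full(emission.shape, -np.inf)
    best_so_far = -np.inf
    # Visit states in order of decreasing emission weight, so that once
    # the bound a + b + c fails, it fails for every remaining state too.
    for j in np.argsort(-emission):
        c = emission[j]
        if a + b + c < best_so_far:
            break            # no remaining node can beat best_so_far
        forward[j] = (forward_prev + transition[:, j]).max() + c
        best_so_far = max(best_so_far, forward[j])
    return forward
```

The payoff is that a + b + c is an upper bound on a node's forward value that costs nothing to evaluate, so whole swaths of nodes on each level never have their max over predecessors computed.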