Porquerolles May 30th and 31st

Venue

Porquerolles is close to Marseille (at least compared to Lille...)

To reach Porquerolles from the Hyères train station, first take bus 67 to the Tour Fondue, or share a taxi (I do not have the fares; see http://www.taxis-hyeres.com/).

For all information about the boat crossing: http://www.porquerolles.com/accueil/comment-venir/. It usually takes 20 minutes, but note that some crossings may be cancelled when weather conditions are difficult.

The crossing timetable is available here: http://www.calameo.com/read/001296734faf460fc78aa

Bus 67 timetable (Hyères -> Tour Fondue): http://www.reseaumistral.com/horaires_ligne/index.asp?rub_code=6&thm_id=1150&gpl_id=0&lign_id=12&pa_id=1734&sens=1&date=12%2F04%2F2013

Talks

Thursday

15:00

Ludovic Sequential learning for arbitrary data under budget constraints

15:30

Thomas Graph Laplacian Combinations

16:00

David Semi-Supervised Spectral Graph Clustering

16:30

Discussions

Friday

9:30

Daniil Time-series information and unsupervised representation learning

10:00

Hachem Multiple Operator-valued Kernel Learning

10:30

Émilie A PAC-Bayesian Approach for Domain Adaptation with Specialization to Linear Classifiers

11:00

Break

11:30

Marc Quadratic Program for a priori Constrained Weighted Majority Vote - Application to Nearest Neighbor Classifiers

14:00

Mattias Randomized methods and matrices

14:30

Olivier Offline evaluation of contextual bandit algorithms using CTR data: reducing the bias

15:00

Gabriel Proper latent representations for visual control using RL. Work in progress!

15:30

Philippe Bandits and warming up recommender systems

Participants

There are 20 rooms (most of them are doubles, so we can be more than 20 if some people agree to share). We are more than 20. Who is willing to share a room?

  1. Patrick Gallinari
  2. Ludovic Denoyer
  3. Gabriel Dulac Arnold
  4. François Denis
  5. Liva Ralaivola
  6. Hachem Kadri
  7. Marc Tommasi
  8. Rémi Gilleron
  9. David Chatel
  10. Thomas Ricatte
  11. Antonino Freno
  12. Olivier Nicol
  13. Philippe Preux
  14. Daniil Ryabko
  15. Marc Sebban
  16. Amaury Habrard
  17. Émilie Morvant
  18. Jean-Philippe Peyrache
  19. Julien Audiffren
  20. Mattias Gybels

Lille June 19th and 20th

Location

Register

Program

We plan to start at 14:00 on Tuesday and end at 16:00 on Wednesday. If you arrive in the morning, please join me in my office at INRIA.

Tuesday 19th

13:00

Lunch at Pariselle (university restaurant).

14:30

Raphaël Bailly: A spectral learning framework for graphical models

15:15

Jean-Philippe Peyrache: Boosting for Domain Adaptation with Theoretical Guarantees

16:00

Aurélien Bellet: Similarity Learning for Provably Accurate Sparse Linear Classification

16:45

Coffee break

17:00-18:00

Discussions

20:00

Dinner in Lille

Wednesday 20th

9:00

Émilie Morvant: PAC-Bayes Bound and Multi-Class Classification

9:45

Michal Valko: Scaling Graph-Based Learning

10:30

Coffee Break

11:00

Gabriel Dulac-Arnold: Fast RL with MDP Factorization

11:45

Antoine Ndione: Introduction to property testing

12:30

Lunch at Pariselle (university restaurant).

14:15

Antonino Freno: Large-Scale Random Network Modeling via the Fiedler Delta Statistic

15:00-16:00

Discussions




June 30th and July 1st

Meeting in Saint Etienne.

Program

Thursday, June 30

11:30

Liva Ralaivola Playing with posteriors in online learning

12:30-14:00

Lunch

14:00-16:00

A. Bellet Learning Good Edit Similarities with Generalization Guarantees.

JP Peyrache Domain Adaptation with Good Edit Similarities

E. Morvant

16:00-16:30

Coffee break

16:30-18:00

Walk around and free discussions

18:00-19:30

Scientific committee meeting on the ANR poster (cahier)

20:00

Dinner


Friday, July 1

9:00-11:00

L. Denoyer, G. Dulac and S. Peters

11:00-11:30

Coffee break

11:30-12:15

G. Garriga: on graph Laplacians and machine learning on graphs in Lille

Scientific committee on the ANR meeting in September

12:15

Lunch

14:00-14:45

Colin de la Higuera: Finding the most probable (consensus) string: an algorithmic study (with Jose Oncina)

15:00-16:00

A. Freno: Boltzmann machines for learning to rank

16:00

Return trip


Abstracts

Finding the most probable (consensus) string: an algorithmic study

Colin de la Higuera and Jose Oncina

The problem of finding the most probable string for a distribution generated by a weighted finite automaton is related to a number of important questions: computing the distance between two distributions or finding the best translation (the most probable one) given a probabilistic finite state transducer. The problem is undecidable with general weights and is NP-hard if the automaton is probabilistic. In this paper we give a pseudo-polynomial algorithm which computes the most probable string in time polynomial in the inverse of the probability of this string itself. We also give a randomised algorithm solving the same problem and discuss the case where the distribution is generated by other types of machines.
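To make the search idea concrete, here is a minimal best-first sketch of the most probable string computation for a probabilistic finite automaton. It is not the authors' implementation, only an illustration of why the running time scales with the inverse of the probability of the answer: prefixes are expanded in decreasing order of prefix probability and pruned as soon as they can no longer beat the best complete string found so far. The dictionary encoding of the PFA (init, trans, final) is a hypothetical representation chosen for the example.

    import heapq

    def most_probable_string(init, trans, final, sigma, max_len=100):
        """Best-first search for the most probable string of a PFA.

        init[q]          : probability of starting in state q
        trans[q][a][q2]  : probability of reading symbol a and moving q -> q2
        final[q]         : probability of stopping in state q
        sigma            : alphabet (iterable of symbols)
        max_len          : safety cap on the prefix length
        """
        start = {q: p for q, p in init.items() if p > 0}
        # heap entries: (-prefix probability, prefix, distribution over current states)
        heap = [(-sum(start.values()), "", start)]
        best_str, best_prob = None, 0.0
        while heap:
            neg_p, w, dist = heapq.heappop(heap)
            if -neg_p <= best_prob:
                break  # no remaining prefix can lead to a better string
            # probability that the PFA generates exactly the string w
            p_w = sum(p * final.get(q, 0.0) for q, p in dist.items())
            if p_w > best_prob:
                best_str, best_prob = w, p_w
            if len(w) >= max_len:
                continue
            # extend the prefix by one symbol; keep candidates that may still win
            for a in sigma:
                new_dist = {}
                for q, p in dist.items():
                    for q2, pt in trans.get(q, {}).get(a, {}).items():
                        new_dist[q2] = new_dist.get(q2, 0.0) + p * pt
                new_p = sum(new_dist.values())
                if new_p > best_prob:
                    heapq.heappush(heap, (-new_p, w + a, new_dist))
        return best_str, best_prob

The max_len guard is only a safety net for degenerate automata that put almost no probability mass on stopping; on a proper PFA the pruning condition alone bounds the number of expanded prefixes by roughly the inverse of the probability of the optimal string.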

Domain Adaptation with Good Edit Similarities

Jean-Philippe Peyrache

In many real-life applications, the available source training information is either too small or not representative enough of the underlying target test problem. In the past few years, a new line of machine learning research has been developed to overcome such awkward situations, called Domain Adaptation (DA), giving rise to many adaptation algorithms and theoretical results in the form of generalization bounds. In this paper, a novel contribution is proposed in the form of a DA algorithm dealing with string-structured data, inspired by the DA support vector machine (SVM) technique introduced in [Bruzzone et al, PAMI 2010]. To ensure the convergence of SVM-based learning, the similarity functions involved in the process must be valid kernels, i.e. positive semi-definite (PSD) and symmetric. However, in the string-based context that we are considering in this paper, this condition is often not satisfied. Indeed, it has been proven that most string similarity functions based on the edit distance are not PSD. To overcome this drawback, we make use in this paper of the new theory of learning with good similarity functions introduced by Balcan et al., which (i) does not require the use of a valid kernel to learn well and (ii) allows us to induce sparser models. We take advantage of this theoretical framework to propose a new DA algorithm using good edit similarity functions. Using a suitable string representation of handwritten digits, we show that our new algorithm is very efficient at dealing with the scaling and rotation problems usually encountered in image classification.
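As a rough illustration of the learning rule this work builds on (the Balcan et al. good-similarity framework, not the DA algorithm of the paper itself): every example is mapped to its similarities with a set of landmark points, and a sparse linear classifier is learned under an L1 constraint, so the similarity, for instance an edit similarity, never needs to be PSD. The sketch below assumes a hypothetical similarity function sim and uses a standard linear-programming formulation of the hinge loss.

    import numpy as np
    from scipy.optimize import linprog

    def learn_with_similarity(sim, X, y, landmarks, c=1.0):
        """Sparse linear classifier over similarities to landmarks
        (Balcan-style learning with a possibly non-PSD similarity).

        sim(x, l)  : similarity function, e.g. an edit similarity (hypothetical)
        X, y       : training examples and labels in {-1, +1}
        landmarks  : list of landmark examples
        c          : bound on the L1 norm of the coefficients (controls sparsity)
        """
        y = np.asarray(y, dtype=float)
        n, k = len(X), len(landmarks)
        # Feature map: phi(x) = (sim(x, l_1), ..., sim(x, l_k))
        phi = np.array([[sim(x, l) for l in landmarks] for x in X])

        # Variables: alpha_plus (k), alpha_minus (k), slacks xi (n), all >= 0.
        # minimize sum(xi)
        #   s.t. y_i * phi_i . (alpha_plus - alpha_minus) >= 1 - xi_i
        #        sum(alpha_plus + alpha_minus) <= c
        obj = np.concatenate([np.zeros(2 * k), np.ones(n)])
        A_margin = np.hstack([-y[:, None] * phi, y[:, None] * phi, -np.eye(n)])
        b_margin = -np.ones(n)
        A_l1 = np.concatenate([np.ones(2 * k), np.zeros(n)])[None, :]
        b_l1 = np.array([c])
        res = linprog(obj,
                      A_ub=np.vstack([A_margin, A_l1]),
                      b_ub=np.concatenate([b_margin, b_l1]),
                      bounds=(0, None))
        alpha = res.x[:k] - res.x[k:2 * k]
        return alpha  # predict with sign(sum_j alpha[j] * sim(x, landmarks[j]))

Prediction is then sign(sum_j alpha_j * sim(x, l_j)); the L1 bound c controls how many landmarks end up with a non-zero weight, which is what yields the sparser models mentioned in the abstract.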

People

  1. Marc Sebban
  2. Aurélien Bellet
  3. Jean-Philippe Peyrache
  4. JC Janodet
  5. Amaury Habrard
  6. François Denis
  7. Rémi Eyraud
  8. Liva Ralaivola
  9. Marc Bernard
  10. Marc Tommasi
  11. Rémi Gilleron
  12. Mikaela Keller
  13. Antonino Freno
  14. Gemma Garriga
  15. Jean Baptiste Faddoul
  16. Colin de la Higuera
  17. Ludovic Denoyer
  18. Patrick Gallinari
  19. Gabriel Dulac
  20. Stéphane Peters

Reading group on graphs in Lille

Web site

September 2 and 3

Start time: 11:00 (Thursday). End time: 16:00 (Friday).

Travel

A bus leaves Lille Flandres station (Av St Venant, near the Flunch) at 9:30 and returns from Cassel on Friday at 16:00.

If you cannot take the bus, there are three options:

  • Take a taxi
  • Take a train from Cassel station (not very close to the hotel, and trains are infrequent)
  • Take a taxi to Hazebrouck station and then a train (frequent connections to Lille).

Accommodation

In Cassel (near Lille)

Participants

  1. Riad Akrour (LNE)
  2. Aurélien Bellet (LaHC)
  3. Jean Decoster (LNE)
  4. Ludovic Denoyer (LIP6)
  5. Gabriel Dulac-Arnold (LIP6)
  6. Rémi Eyraud (LIF)
  7. Jean-Baptiste Faddoul (LNE)
  8. Patrick Gallinari (LIP6)
  9. Mohammad Ghavamzadeh (LNE)
  10. Édouard Gilbert (LNE)
  11. Rémi Gilleron (LNE)
  12. Amaury Habrard (LIF)
  13. Jean-Christophe Janodet (LaHC)
  14. Aurélien Lemay (LNE)
  15. Jérémie Mary (LNE)
  16. Olivier Nicol (LNE)
  17. Stéphane Peters (LIP6)
  18. Jean-Philippe Peyrache (LaHC)
  19. Nicolas Pinchaud (LIP6)
  20. Philippe Preux (LNE)
  21. Daniil Ryabko (LNE)
  22. Christophe Salperwyck (LNE)
  23. Marc Sebban (LaHC)
  24. Marc Tommasi (LNE)
  25. Fabien Torre (LNE)

Program

Slides are mostly in the private space.

Thursday morning 11:00

  • Opening and discussions about Lampada and the project teams
  • Task 1: Large datasets
    • Philippe Preux: Supervised learning: Back to simple things.

Thursday afternoon 14:30

  • Task 2: Learning on streams
    • Daniil Ryabko on Clustering time series data.
    • Ludovic Denoyer: Regularized models for classification in streams
  • Break
    • Christophe Salperwyck: on supervised and incremental learning and anytime algorithms
  • Task 3: Similarities, rational series, densities, kernels
    • Amaury Habrard: on kernels and learning with similarity functions.
  • Break, walk and dinner
  • Task 2: Incremental methods
    • Mohammad Ghavamzadeh: introduction to reinforcement learning

Friday

  • Task 2: Incremental methods
    • Mohammad Ghavamzadeh: classification-based approach to policy iteration
    • break
    • Ludovic Denoyer: RL for feature extraction/selection/creation
  • lunch
  • Task 1: Composition of learning methods
    • Jean Baptiste Faddoul: 'Boosting Multi-Task Weak Learners with Applications to Textual and Social Data' (Slides).

June 9 - start 10:00

Be careful: new location, Jussieu, room 102 in 26-00, first tower on the left.

Welcome!

Nicolas Pinchaud has joined the group. Nicolas works on feature discovery, transfer learning, and complex actions in reinforcement learning.


January 8

Plenary Lampada meeting in Paris. Start: 10 am. End: 7 pm.

Program


January 5-8

STIC French Colloquium


December 18

First meeting on reinforcement learning, Inria Lille.