Contact

Head of lab

Prof. Dr. Jörg Lücke
Phone: +49 441 798-5486
Fax: +49 441 798-3902
Room: W30 2-201

Secretary

Imke Brumund
Phone: +49 441 798-3709
Fax: +49 441 798-3902
Room: W30 2-202

Postal Address

Prof. Dr. Jörg Lücke
Arbeitsgruppe Machine Learning
Exzellenzcluster Hearing4all und
Department für Medizinische Physik und Akustik
Fakultät für Medizin und Gesundheitswissenschaften
Carl von Ossietzky Universität Oldenburg
D-26111 Oldenburg

Office Address

Room 201 (2nd floor)
Building W30 (NeSSy)  
Küpkersweg 74
26129 Oldenburg

News

19 March 2018
Our paper "Truncated Variational Sampling for ‘Black Box’ Optimization of Generative Models" (Lücke et al.) has been accepted for LVA/ICA 2018.

5 March 2018
Our paper "Neural Simpletrons - Learning in the Limit of Few Labels with Directed Generative Networks" (Forster et al.) has been accepted by Neural Computation.

22 December 2017
Our paper "Can clustering scale sublinearly with its clusters?" (Forster & Lücke) has been accepted for AISTATS 2018.

30 June 2017
Our paper "Discrete Sparse Coding" (Exarchakis & Lücke) has been accepted by Neural Computation.

7 June 2017
Our paper "Models of acetylcholine and dopamine signals differentially improve neural representations" (Holca-Lamarre et al.) has been accepted by the journal Frontiers in Computational Neuroscience.

25 May 2017
Our paper "Binary non-negative matrix deconvolution for audio dictionary learning" (Drgas et al.) has been accepted by the journal IEEE Transactions on Audio, Speech and Language Processing.

8 March 2017
Our paper "GP-select: Accelerating EM using adaptive subspace preselection" (Shelton et al.) has been accepted for publication by Neural Computation.

4 February 2017
Our paper "Truncated Variational EM for Semi-Supervised Neural Simpletrons" (Forster & Lücke) has been accepted for IJCNN 2017.

Machine Learning

Our Research

Based on first theoretical principles, our group develops novel, efficient learning algorithms for standard and novel data models. The resulting algorithms are then applied to a range of domains, including acoustic data, visual data, medical data, and general pattern recognition tasks. Alongside this theoretical and practical algorithm development, we investigate advanced Machine Learning methods as models of neural information processing and, vice versa, use ideas and insights from the neurosciences to motivate novel research directions in Machine Learning.

We pursue and conduct projects on non-linear dictionary learning, large-scale unsupervised and semi-supervised learning, and autonomous learning.

We are part of the Cluster of Excellence Hearing4all and the Department of Medical Physics and Acoustics at the School of Medicine and Health Sciences.

For any inquiries please contact Jörg Lücke.

News

5 July 2018
Our paper "Truncated Variational Sampling for ‘Black Box’ Optimization of Generative Models" has been presented at the LVA/ICA 2018.
 

3 July 2018
Our paper "Optimal neural inference of stimulus intensities" (Monk et al.) has been published by Nature's Scientific Reports.
 

24 March 2018
Our paper "Evolutionary Expectation Maximization" (Guiraud et al.) has been accepted for GECCO 2018.
 

 

 

 

For further news see bottom of page

Selected Recent Publications

T. Monk, C. Savin and J. Lücke (2018).
Optimal neural inference of stimulus intensities.
Scientific Reports 8: 10038 (online access, bibtex)

D. Forster and J. Lücke (2018).
Can clustering scale sublinearly with its clusters? A variational EM acceleration of GMMs and k-means.
International Conference on Artificial Intelligence and Statistics (AISTATS), in press (online access)

D. Forster, A.-S. Sheikh and J. Lücke (2018).
Neural Simpletrons - Learning in the Limit of Few Labels with Directed Generative Networks.
Neural Computation, 30:2113–2174 (online access, bibtex)

R. Holca-Lamarre, J. Lücke* and K. Obermayer* (2017).
Models of Acetylcholine and Dopamine Signals Differentially Improve Neural Representations.
Frontiers in Computational Neuroscience, 11:54 (online access, bibtex)
*joint senior authorship.

J.A. Shelton, J. Gasthaus, Z. Dai, J. Lücke and A. Gretton (2017).
GP-select: Accelerating EM using adaptive subspace preselection.
Neural Computation, 29(8):2177-2202. (online access, bibtex)

G. Exarchakis and J. Lücke (2017).
Discrete Sparse Coding.
Neural Computation, 29(11):2979-3013. (online access, bibtex)

T. Monk, C. Savin and J. Lücke (2016).
Neurons Equipped with Intrinsic Plasticity Learn Stimulus Intensity Statistics.
Advances in Neural Information Processing Systems (NIPS), 29: 4278-4286. (online access, bibtex)

A.-S. Sheikh and J. Lücke (2016).
Select-and-Sample for Spike-and-Slab Sparse Coding.
Advances in Neural Information Processing Systems (NIPS), 29: 3934-3942. (online access, bibtex)

Z. Dai and J. Lücke (2014).
Autonomous Document Cleaning – A Generative Approach to Reconstruct Strongly Corrupted Scanned Texts.
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10): 1950-1962. (online access, bibtex)

A.-S. Sheikh, J. A. Shelton, J. Lücke (2014).
A Truncated EM Approach for Spike-and-Slab Sparse Coding.
Journal of Machine Learning Research, 15:2653-2687. (online access, bibtex)

M. Henniges, R. E. Turner, M. Sahani, J. Eggert, J. Lücke (2014).
Efficient Occlusive Components Analysis.
Journal of Machine Learning Research, 15:2689-2722. (online access, bibtex)

COPYRIGHT NOTICE

The papers listed above have been published after peer review in different journals or conference proceedings. These journals or proceedings remain the only definitive repository of the content. Copyright and all rights therein are usually retained by the respective publishers. These materials may not be copied or reposted without their explicit permission. Use for scholarly purposes only.
