Head of lab

Prof. Dr. Jörg Lücke

Tel.: +49 441 798-5486

Fax: +49 441 798-3902

Room: W30 2-201


Nicole Kulbach (temporary)

Tel.: +49 441 798-3326

Fax: +49 441 798-3902

Room: W30 3-316

Postal Address

Prof. Dr. Jörg Lücke
Arbeitsgruppe Machine Learning
Exzellenzcluster Hearing4all und
Department für Medizinische Physik und Akustik
Fakultät für Medizin und Gesundheitswissenschaften
Carl von Ossietzky Universität Oldenburg
D-26111 Oldenburg

Office Address

Room 201 (2nd floor)
Building W30 (NeSSy)
Küpkersweg 74
26129 Oldenburg


21 Jan 2022
Our paper "Evolutionary Variational Optimization of Generative Models" (Drefs et al.) has been accepted for publication by the Journal of Machine Learning Research.

23 Nov 2021
Our paper "A Variational EM Acceleration for Efficient Clustering at Very Large Scales" (Hirschberger et al.) has been accepted for publication by IEEE Transactions on Pattern Analysis and Machine Intelligence.

12 Oct 2021
Hamid Mousavi successfully defended his doctoral thesis "Non-Linear Latent Variable Models for Inference and Learning from Non-Gaussian Data". Congratulations!

15 Sept 2021
Our paper "Phase transition for parameter learning of hidden Markov models" (Rau et al.) has been accepted by Physical Review E.

15 July 2021
Our paper "Generalizable dimensions of human cortical auditory processing of speech in natural soundscapes: A data-driven ultra high field fMRI approach" (Boos et al.) has been published by NeuroImage.

16 June 2021
The new postdoc Dr. Dmytro Velychko has joined the lab - welcome!

16 June 2021
Our grant "On the Convergence of Variational Deep Learning to Sums of Entropies" has been positively evaluated and is funded by the German Research Foundation (DFG). The project is conducted jointly with Asja Fischer (RUB). With this grant we are very happy to be part of the DFG priority program 2298 "Theoretical Foundations for Deep Learning".

29 April 2021
Our paper "Inference and Learning in a Latent Variable Model for Beta Distributed Interval Data" (Mousavi et al.) has been published by Entropy.

25 Feb 2021
We have published a bioRxiv preprint with our first results from the BMBF special funds grant on COVID-19 research: "Visualization of SARS-CoV-2 Infection Scenes by 'Zero-Shot' Enhancements of Electron Microscopy Images" (Drefs et al.). An example image and a report on the University website are linked.

4 February 2021
We hosted a hands-on session on our PyTorch software framework for Truncated Variational Expectation Maximization with members from the Haefner Lab from the University of Rochester.

22 December 2020
Our paper "Evolutionary Variational Optimization of Generative Models" (Drefs et al.) is now available on arXiv.

27 November 2020
Our paper "Direct Evolutionary Optimization of Variational Autoencoders With Binary Latents" (Guiraud et al.) is now available on arXiv.

Machine Learning


Our Research

Based on first theoretical principles, our group develops novel, efficient learning algorithms for standard and novel data models. The resulting algorithms are applied to a range of domains including acoustic data, visual data, medical data, and general pattern recognition tasks. Alongside the theoretical and practical algorithm development, we investigate advanced Machine Learning methods as models of neural information processing and, vice versa, use ideas and insights from the neurosciences to motivate novel research directions in Machine Learning.

We pursue and conduct projects on efficient generative models (including deep generative models) for large-scale unsupervised and semi-supervised learning, autonomous learning and data enhancement.

We are part of the cluster of excellence Hearing4all and the Department of Medical Physics and Acoustics at the School of Medicine and Health Sciences.

For any inquiries please contact Jörg Lücke.

Selected Publications

(The complete list can be found here.)

Jakob Drefs, Enrico Guiraud and Jörg Lücke (2022).
Evolutionary Variational Optimization of Generative Models.
Journal of Machine Learning Research 23(21):1-51 (online access, bibtex).

F. Hirschberger*, D. Forster* and J. Lücke (2021).
A Variational EM Acceleration for Efficient Clustering at Very Large Scales.
IEEE Transactions on Pattern Analysis and Machine Intelligence, doi: 10.1109/TPAMI.2021.3133763 (online access).
*joint first authorship.

J. Lücke, D. Forster, Z. Dai (2021).
The Evidence Lower Bound of Variational Autoencoders Converges to a Sum of Three Entropies.
arXiv:2010.14860 (arXiv)

J. Lücke and D. Forster (2019).
k-means as a variational EM approximation of Gaussian mixture models.
Pattern Recognition Letters 125:349-356 (online access, bibtex, arXiv)

A.-S. Sheikh*, N. S. Harper*, J. Drefs, Y. Singer, Z. Dai, R. E. Turner and J. Lücke (2019).
STRFs in primary auditory cortex emerge from masking-based statistics of natural sounds.
PLOS Computational Biology 15(1): e1006595 (online access, bibtex)
*joint first authorship.

T. Monk, C. Savin and J. Lücke (2018).
Optimal neural inference of stimulus intensities.
Scientific Reports 8: 10038 (online access, bibtex)

D. Forster, A.-S. Sheikh and J. Lücke (2018).
Neural Simpletrons - Learning in the Limit of Few Labels with Directed Generative Networks.
Neural Computation 30:2113–2174 (online access, bibtex)

R. Holca-Lamarre, J. Lücke* and K. Obermayer* (2017).
Models of Acetylcholine and Dopamine Signals Differentially Improve Neural Representations.
Frontiers in Computational Neuroscience 11:54 (online access, bibtex)
*joint senior authorship.

J.A. Shelton, J. Gasthaus, Z. Dai, J. Lücke and A. Gretton (2017).
GP-select: Accelerating EM using adaptive subspace preselection.
Neural Computation 29(8):2177-2202 (online access, bibtex)

G. Exarchakis and J. Lücke (2017).
Discrete Sparse Coding.
Neural Computation 29(11):2979-3013. (online access, bibtex)

T. Monk, C. Savin and J. Lücke (2016).
Neurons Equipped with Intrinsic Plasticity Learn Stimulus Intensity Statistics.
Advances in Neural Information Processing Systems (NIPS) 29: 4278-4286. (online access, bibtex)

A.-S. Sheikh and J. Lücke (2016).
Select-and-Sample for Spike-and-Slab Sparse Coding.
Advances in Neural Information Processing Systems (NIPS) 29: 3934-3942. (online access, bibtex)

Z. Dai and J. Lücke (2014).
Autonomous Document Cleaning – A Generative Approach to Reconstruct Strongly Corrupted Scanned Texts.
IEEE Transactions on Pattern Analysis and Machine Intelligence 36(10): 1950-1962. (online access, bibtex)

A.-S. Sheikh, J. A. Shelton, J. Lücke (2014).
A Truncated EM Approach for Spike-and-Slab Sparse Coding.
Journal of Machine Learning Research 15:2653-2687. (online access, bibtex)


The papers listed above have been published, after peer review, in different journals or conference proceedings. These journals and proceedings remain the only definitive repository of the content. Copyright and all rights therein are usually retained by the respective publishers. These materials may not be copied or reposted without the publishers' explicit permission. Use for scholarly purposes only.

(Changed: 2022-05-18)