Dear colleagues,

I recently started a new lab in the Department of Electrical and Computer Engineering (ECE) at the University of Illinois Urbana-Champaign, working at the intersection of theoretical neuroscience, dynamical systems theory, and machine learning. I am recruiting one to three fully funded PhD students for Fall 2026, with admission through ECE.

Our research uses dynamical systems theory to understand the mechanisms and principles of learning and dynamics in neural circuits, bridging biological and artificial systems. We combine mathematical analysis with large-scale simulations to connect neuroscience, physics, and machine learning.

PhD project directions include:

1. Learning dynamics and mechanistic interpretability. Understanding how brains and neural networks learn over time: how synapses change, how activity patterns evolve, and how useful internal representations emerge, with an eye toward opening the black box of both biological and artificial systems (see references [1–2]).

2. Neural dynamics, chaos, and computation. Exploring how network structure and inputs shape complex activity patterns, including chaotic regimes, and how these patterns can be harnessed for computation in both brain-inspired and artificial networks (see references [3–4]).

3. Training algorithms for spiking networks. Developing new ways to train spiking neural networks so that they are both efficient and easy to optimize, building on ideas such as Gradient Flossing and fast event-based simulation methods such as SparseProp (see references [5–6]).

I welcome applicants from quantitative backgrounds such as physics, mathematics, computer science, or engineering who want to apply their skills to neuroscience questions. Comfort with mathematics and programming is required; you should enjoy thinking with equations and code. I value reliable, self-driven students who enjoy taking projects from idea to finished work.
The lab aims for a supportive, collaborative environment that is intellectually intense and kind. We believe great science comes from working together, and we support diversity in backgrounds and ideas while prioritizing well-being and deep curiosity about neural computation.

Positions are fully funded, with access to exceptional computing resources through the National Center for Supercomputing Applications (NCSA). The lab is part of UIUC's Neuroinformatics cluster, with ties to the Beckman Institute and the Coordinated Science Laboratory. Co-advising with experimental groups is possible.

For Fall 2026 entry:
• ECE PhD application deadline: January 15, 2026.

In your application, please clearly indicate Prof. Rainer Engelken in the "faculty of interest" field and in your Statement of Purpose. Before or after applying, you are welcome to email me with a CV, transcript, and a short note about your background and interests if you think this opening is a good fit.

Learn more: https://rainerengelken.github.io/ or https://ece.illinois.edu/about/directory/faculty/engelken

If you know strong candidates, please forward this email. Students can also reach out to me directly at engelken@illinois.edu.

Best regards,

Rainer Engelken
Assistant Professor
The Grainger College of Engineering
Department of Electrical and Computer Engineering
University of Illinois Urbana-Champaign

P.S. Students reading this directly: feel free to email me with questions. I'm especially interested in candidates with strong math/physics training looking to work on neural dynamics and learning.

References

[1] Engelken, R., Ingrosso, A., Khajeh, R., Goedeke, S., & Abbott, L. F. (2022). Input correlations impede suppression of chaos and learning in balanced firing-rate networks. PLOS Computational Biology, 18(12), e1010590.

[2] Engelken, R., & Goedeke, S. (2022). A time-resolved theory of information encoding in recurrent neural networks. Advances in Neural Information Processing Systems, 35, 35490–35503.
[3] Engelken, R., Wolf, F., & Abbott, L. F. (2023). Lyapunov spectra of chaotic recurrent neural networks. Physical Review Research, 5(4), 043044.

[4] Engelken, R., Monteforte, M., & Wolf, F. (2024). Sparse chaos in cortical circuits. arXiv preprint arXiv:2412.21188.

[5] Engelken, R. (2023). Gradient Flossing: Improving gradient descent through dynamic control of Jacobians. Advances in Neural Information Processing Systems, 36, 10412–10439.

[6] Engelken, R. (2023). SparseProp: Efficient event-based simulation and training of sparse recurrent spiking neural networks. Advances in Neural Information Processing Systems, 36, 3638–3657.