Apologies for cross-posting. Please forward to interested students.

We have an open PhD student position in my lab (http://fias.uni-frankfurt.de/de/neuro/triesch) at the Frankfurt Institute for Advanced Studies (FIAS) in Frankfurt, Germany, to explore the new area of active efficient coding (AEC), a recently formulated generalization of the efficient coding hypothesis to active perception. The basic idea of AEC is that sensory systems learn to use their motor degrees of freedom to contribute to the efficient encoding of sensory signals.

Along these lines, we have developed computational models for the self-calibration of active stereo and motion vision [1,2,3]. These models simultaneously learn a sensory representation through sparse coding and a controller for eye movements through reinforcement learning. Both learning components aim to maximize the overall coding efficiency of the system, which leads to fully self-calibrating sensory-motor loops for active stereo vision (disparity tuning and vergence control) and motion vision (motion tuning and pursuit movements). To the best of our knowledge, these models are the first to demonstrate how such self-calibration can emerge from a generic efficient coding objective. We have also validated the approach on robots such as the iCub (http://www.icub.org).
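To make the mechanism concrete, here is a minimal, self-contained sketch of such a loop in Python/NumPy. It assumes a toy one-dimensional binocular world in which vergence is a discrete pixel shift; the patch size, the k-sparse encoder, and the bandit-style vergence learner are deliberate simplifications for illustration, not the exact algorithms of [1,2,3].

import numpy as np

rng = np.random.default_rng(0)

P = 16                        # pixels per eye patch
K = 32                        # dictionary atoms over the binocular patch
SPARSITY = 4                  # active coefficients per encoding
ACTIONS = np.arange(-4, 5)    # candidate vergence angles (pixels)
DISPARITY = 2                 # true object disparity (pixels)

D = rng.standard_normal((2 * P, K))
D /= np.linalg.norm(D, axis=0)          # unit-norm dictionary atoms
Q = np.zeros(len(ACTIONS))              # estimated value of each vergence angle

def observe(vergence, scene):
    """Binocular patch; the right half is shifted by the residual disparity."""
    r = DISPARITY - vergence
    x = np.concatenate([scene[P:2 * P], scene[P + r:2 * P + r]])
    return x / (np.linalg.norm(x) + 1e-9)

def sparse_code(x):
    """Keep the k largest correlations (a crude stand-in for matching pursuit)."""
    a = D.T @ x
    a[np.argsort(np.abs(a))[:-SPARSITY]] = 0.0
    return a

for t in range(5000):
    # Epsilon-greedy choice of vergence angle.
    i = rng.integers(len(ACTIONS)) if rng.random() < 0.1 else int(np.argmax(Q))
    scene = rng.standard_normal(4 * P)          # fresh random 1-D scene
    x = observe(ACTIONS[i], scene)
    a = sparse_code(x)
    err = x - D @ a                             # reconstruction residual

    # Sensory learning: gradient step on the squared reconstruction error.
    D += 0.05 * np.outer(err, a)
    D /= np.linalg.norm(D, axis=0)

    # Motor learning: the SAME quantity, coding efficiency (negative
    # reconstruction error), serves as the reward, pulling the controller
    # toward vergence angles whose inputs the sparse code captures well.
    Q[i] += 0.05 * (-np.sum(err ** 2) - Q[i])

print("preferred vergence:", ACTIONS[int(np.argmax(Q))], "true disparity:", DISPARITY)

The essential design choice is that a single objective, coding efficiency, drives both the sensory dictionary and the motor policy; this shared objective is what makes the loop self-calibrating. See [1,2,3] for the full models and the robot experiments.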
The work will be performed in a stimulating interdisciplinary environment with ample opportunities for collaboration with neuromorphic engineers, neuroscientists, roboticists, psychologists, and clinicians across the globe. Possible research directions include better understanding how and why such self-calibration can go awry in clinical conditions such as strabismus and amblyopia, implementations on neuromorphic hardware, and many others.

We are seeking an outstanding and highly motivated PhD student for this project. Applicants should have obtained a Master's degree (or equivalent) in Computational Neuroscience or a related field. The ideal candidate will have excellent programming skills (Matlab, Python, C/C++), very good analytic skills, and a broad knowledge of computational neuroscience, machine learning, computer vision, robotics, signal processing, and statistics. Furthermore, specific expertise in visual neuroscience, information theory, sparse coding models, and reinforcement learning is desirable. Experience with programming Graphics Processing Units (GPUs) would be a plus. The position can be filled immediately.

The Frankfurt Institute for Advanced Studies is a research institution dedicated to fundamental theoretical research in various areas of science. It is embedded in Frankfurt's recently established natural science research campus. Frankfurt itself is the hub of one of the most vibrant metropolitan areas in Europe. It boasts a rich culture and arts community and repeatedly earns top rankings in worldwide surveys of quality of living.

Applications should include a statement of research interests, a CV, and contact information for at least two references. Send applications to application@fias.uni-frankfurt.de. Review of applications will begin immediately.

[1] Y. Zhao, C. A. Rothkopf, J. Triesch, B. Shi. A Unified Model of the Joint Development of Disparity Selectivity and Vergence Control. IEEE Int. Conf. on Development and Learning and Epigenetic Robotics (ICDL), 2012. (Paper of Excellence Award.) http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&arnumber=6400876

[2] L. Lonini, S. Forestier, C. Teulière, Y. Zhao, B. Shi, J. Triesch. Robust Active Binocular Vision through Intrinsically Motivated Learning. Frontiers in Neurorobotics, 2013. http://journal.frontiersin.org/article/10.3389/fnbot.2013.00020/full

[3] C. Teulière, S. Forestier, L. Lonini, C. Zhang, Y. Zhao, B. Shi, J. Triesch. Self-Calibrating Smooth Pursuit through Active Efficient Coding. Robotics and Autonomous Systems, 2014. http://www.sciencedirect.com/science/article/pii/S0921889014002486

--
Prof. Dr. Jochen Triesch
Johanna Quandt Research Professor
Frankfurt Institute for Advanced Studies
http://fias.uni-frankfurt.de/~triesch/
Tel: +49 (0)69 798-47531
Fax: +49 (0)69 798-47611