There are a few PhD positions with scholarships for developing event-driven sensors and computation for the humanoid robot iCub at the Italian Institute of Technology. The call, application form and tips on the documentation
to send are at the following link:
http://www.iit.it/en/openings/phd-calls/2030-phd-course-topic-information.html
Tutors: Chiara Bartolozzi
Department: iCub Facility (Istituto Italiano di Tecnologia)
http://www.iit.it/en/research/departments/icub-facility.html
Description: Carrying out real-world tasks robustly
and efficiently is one of the major challenges of robotics. Biology clearly outperforms artificial computing and robotic systems in terms of the appropriateness of the behavioural response, robustness to interference and noise, adaptation to ever-changing environmental
conditions, and energy efficiency. All these properties arise from the radically different style of sensing and computation used by the biological brain.
In conventional robots, sensory information is available as a sequence of static frames, and high dynamics can be sensed only by increasing the sampling
rate. Unfortunately, the available bandwidth limits the amount of information that can be transmitted, forcing a compromise between resolution and speed.
The goal of the proposed theme is the
development of bio-inspired event-driven artificial vision sensors that will be mounted and validated on the humanoid robot iCub. The aim is to convey to the robot the most informative signal it can use to robustly interact with the world.
The research will focus on the design of mixed-mode analog/digital circuits for photo-transduction and focal-plane processing. Different architectures and
circuits will be considered for the pixel design, evaluating the trade-off between complexity (and fill factor) and on-sensor pre-processing capabilities. Additionally, ad hoc asynchronous digital logic circuits and interfaces based on the “Address Event Representation”
(AER) protocol will be developed to optimally interface the sensor with the robot.
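To illustrate the principle behind Address Event Representation (a hedged sketch only: the word layout below is hypothetical, not the actual iCub or sensor format), each spike is transmitted asynchronously as the address of the pixel that fired, so one event fits in a single word:

```python
# Toy AER codec. ASSUMED layout (illustrative, not a real device format):
# bit 0 = polarity (ON/OFF), bits 1-8 = x coordinate, bits 9+ = y coordinate.
# Timestamps are typically appended by the receiving interface, not transmitted.

def encode_aer(x: int, y: int, polarity: int) -> int:
    """Pack a pixel event into a single address word."""
    return (y << 9) | (x << 1) | polarity

def decode_aer(word: int) -> tuple:
    """Unpack an address word back into (x, y, polarity)."""
    return (word >> 1) & 0xFF, word >> 9, word & 0x1
```

A stream of such words, in arrival order, is all the sensor sends; pixels that see no change send nothing, which is where the data-rate advantage of event-driven sensing comes from.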
Requirements: degree in Electronic Engineering, Physics (or equivalent) and a background
in analog and/or digital circuit design, preferably with experience in FPGA programming (VHDL, Verilog). High motivation to work on a robotic platform.
Reference:
C. Posch, D. Matolin, and R. Wohlgenannt, “A QVGA 143 dB Dynamic Range Frame-Free PWM Image Sensor With Lossless Pixel-Level Video Compression and Time-Domain CDS,” IEEE J. Solid-State Circuits, vol. 46, no. 1, pp. 259–275, Jan. 2011
Contacts:
chiara.bartolozzi@iit.it
Tutors: Chiara Bartolozzi
Department: iCub Facility (Istituto Italiano di Tecnologia)
http://www.iit.it/en/research/departments/icub-facility.html
Description: Carrying out real-world tasks robustly
and efficiently is one of the major challenges of robotics. Biology clearly outperforms artificial computing and robotic systems in terms of the appropriateness of the behavioural response, robustness to interference and noise, adaptation to ever-changing environmental
conditions, and energy efficiency. All these properties arise from the radically different style of sensing and computation used by the biological brain.
In conventional robots, sensory information is available as a sequence of static snapshots, and high dynamics can be sensed only by increasing the sampling
rate. Unfortunately, the available bandwidth limits the amount of information that can be transmitted, forcing a compromise between resolution and speed.
The goal of the proposed theme is the
integration of event-driven sensors and computing platforms on the humanoid robot iCub. The aim is to convey to the robot the most informative signal it can use to robustly interact with the world, and to provide adequate computing platforms to process it, such as
the SpiNNaker system.
The research will focus on the development of the infrastructure for optimally interfacing event-driven hardware modules, based on the Address Event Representation
protocol, with the iCub.
Requirements: degree in Electronic Engineering, Physics (or equivalent) and a background
in FPGA programming (VHDL, Verilog). High motivation to work on a robotic platform.
Reference:
C. Bartolozzi, F. Rea, C. Clercq, M. Hofstätter, D. B. Fasnacht, G. Indiveri, and G. Metta, “Embedded neuromorphic vision for humanoid robots,” in IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2011, pp. 129–135
Contacts:
chiara.bartolozzi@iit.it
Tutors: Chiara Bartolozzi, Lorenzo Natale
Department: iCub Facility (Istituto Italiano di Tecnologia)
http://www.iit.it/en/research/departments/icub-facility.html
Description: Interacting with a dynamic environment
is one of the major challenges of robotics. Biology clearly outperforms robotic systems acting in real scenarios in terms of the appropriateness of the behavioural response, robustness to interference and noise, adaptation to ever-changing environmental conditions,
and energy efficiency. All these properties arise from the radically different style of sensing and computation used by the biological brain.
In conventional robots, sensory information is available as a sequence of static snapshots, and high dynamics can be sensed only by increasing the sampling
rate. However, the available bandwidth limits the amount of information that can be transmitted, forcing a compromise between resolution and speed. Event-driven vision sensors transmit information as soon as a change occurs in their visual field, achieving very
high temporal resolution coupled with an extremely low data rate and automatic segmentation of significant events.
The proposed theme aims at exploiting the highly dynamic information from event-driven sensors for robust interaction of the humanoid robot iCub
with moving objects. The goal is to develop new techniques for recognising and predicting the trajectories of moving targets, including humans and objects, and for planning precise reaching movements, exploiting the high temporal resolution and compressive
signal encoding of the event-driven vision sensors mounted on the iCub.
The research will focus on the development of the infrastructure for handling event-driven sensory data and of event-driven vision algorithms for motion
estimation, as well as algorithms for trajectory prediction. Real-time behaviour will be achieved by using compressive event-driven computation and on-board processing on FPGA.
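The flavour of event-based motion estimation in the Benosman et al. reference cited below can be sketched as fitting a plane to the timestamps of nearby events: the spatial gradient of the timestamp surface encodes the speed of the moving edge. The toy version here (illustrative only, assuming idealised noise-free events) uses a plain least-squares fit:

```python
import numpy as np

def plane_fit_flow(xs, ys, ts):
    """Fit t = a*x + b*y + c over a local neighbourhood of events.
    (a, b) is the spatial gradient of the timestamp surface; its inverse
    magnitude is the speed of the edge that generated the events."""
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    (a, b, c), *_ = np.linalg.lstsq(A, ts, rcond=None)
    speed = 1.0 / np.hypot(a, b)  # pixels per time unit along the gradient
    return a, b, speed

# Example: a vertical edge sweeping rightwards at 0.5 px per time unit
# fires each pixel column at t = 2*x, independently of y.
ys_g, xs_g = np.meshgrid(np.arange(5.0), np.arange(10.0))
xs, ys = xs_g.ravel(), ys_g.ravel()
ts = 2.0 * xs
a, b, speed = plane_fit_flow(xs, ys, ts)
```

A real pipeline would restrict the fit to a small spatio-temporal window around each incoming event and reject outliers, but the per-event, frame-free nature of the computation is already visible in this sketch.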
Requirements: degree in Computer Science or Engineering (or
equivalent) and a background in Computer Vision and/or Machine Learning. High motivation to work on a robotic platform and good computer and FPGA programming skills.
Reference: R. Benosman, C. Clercq, X. Lagorce, S.-H. Ieng, and C. Bartolozzi, “Event-Based Visual Flow,” IEEE Transactions on Neural Networks and Learning Systems, vol. 25, no. 2, pp. 407–417, Feb. 2014, doi: 10.1109/TNNLS.2013.2273537
Contacts:
chiara.bartolozzi@iit.it ,
lorenzo.natale@iit.it
Tutors: Chiara Bartolozzi, Leonardo Badino
Department: iCub Facility (Istituto Italiano di Tecnologia)
http://www.iit.it/en/research/departments/icub-facility.html
Description: Robust speech detection in realistic
environments for human-robot interaction is still a challenging task. Vision is currently used to improve speech recognition in noisy acoustic environments; however, the temporal content of the visual information is severely limited by the temporal discretization
of frame-based acquisition. Event-driven vision sensors transmit information as soon as a change occurs in their visual field, achieving very high temporal resolution coupled with an extremely low data rate and automatic segmentation of significant
events.
The goal of the proposed theme is the
exploitation of the highly dynamic information from event-driven vision sensors for robust speech processing on the humanoid robot iCub. The aim is to extract, from the stream of visual events, features and cues related to speech-production
landmarks and to exploit them to further improve speech recognition.
The research will focus on the development of the infrastructure for handling event-driven sensory data and of novel algorithms for the extraction of visual
features from event-driven cameras.
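As a trivial example of such a feature (a hedged sketch under assumed conventions: events as (t, x, y) tuples and a hand-picked mouth region of interest), the per-window event rate inside the mouth area already gives a crude visual speech-activity cue, since lip motion produces bursts of events:

```python
def roi_event_rate(events, roi, window):
    """Count events falling inside roi = (x0, y0, x1, y1), binned into
    consecutive time windows of the given length. Returns {bin: count}.
    The (t, x, y) event format is illustrative, not a fixed standard."""
    x0, y0, x1, y1 = roi
    counts = {}
    for t, x, y in events:
        if x0 <= x < x1 and y0 <= y < y1:
            b = int(t // window)
            counts[b] = counts.get(b, 0) + 1
    return counts
```

Features actually related to speech-production landmarks would of course need richer descriptors of lip shape and motion, but this shows how naturally the event stream segments visual activity in time.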
Requirements: degree in Computer Science or Engineering (or equivalent) and background
in Auditory Processing and/or Machine Learning. High motivation to work on a robotic platform and good computer and FPGA programming skills.
Reference:
G. Potamianos, C. Neti, G. Gravier, A. Garg, and A. W. Senior, Proceedings of the IEEE, vol. 91, no. 9, pp. 1306–1326, Sep. 2003
R. Benosman, C. Clercq, X. Lagorce, S.-H. Ieng, and C. Bartolozzi, “Event-Based Visual Flow,” IEEE Transactions on Neural Networks and Learning Systems, vol. 25, no. 2, pp. 407–417, Feb. 2014, doi: 10.1109/TNNLS.2013.2273537
Contacts:
chiara.bartolozzi@iit.it ,
leonardo.badino@iit.it
Chiara Bartolozzi
Researcher
Istituto Italiano di Tecnologia
Via Morego, 30
16163 Genoa, Italy
Ph: +39 010 7178-1474