Irène Curie Fellowship
Industrial Engineering and Innovation Sciences
EAISI - Eindhoven Artificial Intelligence Systems Institute, NWO
Autonomous driving is a key application of artificial intelligence, and of machine vision in particular. Although contemporary machine vision systems now routinely outperform their biological counterparts, they are far from perfect, especially when integrated with the complex action-selection systems that drive autonomous vehicles.
Methods from explainable AI (XAI) are increasingly used to evaluate and improve the performance of AI systems. Although the technical implementation of these methods is becoming increasingly routine, it remains unclear how they can be used most effectively to ensure safe, responsible, and transparent artificial intelligence. For example, although XAI methods can precisely characterize an AI system's classification performance, it remains unclear how much error in this performance can and should be tolerated. Moreover, although other XAI methods can tell us which factors are actually considered when decisions are being made, it remains unclear which factors are permissible and which are not.
This PhD project is designed to identify, systematize, and evaluate XAI methods, and to identify best practices for machine vision in the context of autonomous driving, while taking into account human factors and societal norms. To this end, it will be necessary to consider not only mathematical and technical details, but also relevant insights from, e.g., the psychology of human decision-making, the regulation and standardization of explainability, and ethical principles of safety, transparency, privacy, and fairness.
More specifically, research tasks will include:
As this is an inherently interdisciplinary research project, the ideal candidate will combine technical expertise in machine learning and explainable AI (e.g., visualization techniques and feature-importance measures) with an ability to engage with relevant issues in the social sciences and humanities (in particular, norms of explainability and AI safety).
The candidate will be integrated in the Mobile Perception Systems (MPS) lab as well as the Philosophy & Ethics (P&E) group. They will be a member of the LTP ROBUST consortium funded by NWO and NXP Semiconductors, and of the EAISI institute at TU/e.

Job requirements
A meaningful job in a dynamic and ambitious university, in an interdisciplinary setting and within an international network. You will work on a beautiful, green campus within walking distance of the central train station. In addition, we offer you:
Eindhoven University of Technology is an internationally top-ranking university in the Netherlands that combines scientific curiosity with a hands-on attitude. Our spirit of collaboration translates into an open culture and a top-five position in collaborating with advanced industries. Fundamental knowledge enables us to design solutions for the highly complex problems of today and tomorrow. Curious to hear more about what it's like as a PhD candidate at TU/e? Please view the video.
Do you recognize yourself in this profile and would you like to know more? Please contact the hiring managers Dr. Carlos Zednik, c.a.zednik@tue.nl, and Dr. Hala Elrofai, h.b.h.elrofai@tue.nl.
Visit our website for more information about the application process or the conditions of employment. You can also contact Hanneke Huijs-Palmen, HR Advisor, j.b.huijs.palmen@tue.nl or +31 40 2472137.
Are you inspired and would like to know more about working at TU/e? Please visit our career page.
We invite you to submit a complete application by using the apply button. The application should include a:
We look forward to receiving your application and will screen it as soon as possible. The vacancy will remain open until the position is filled.