EngD in AI Assurance and Verification
Project Overview


Artificial Intelligence plays a significant role in modern society and business, and this role is only set to grow. The technology features in the electronic games we play, the cars we drive and the chatbots we engage with. The UK recently hosted leaders from around the world for a Global AI Safety Summit, highlighting the importance of understanding and controlling how AI operates and is deployed. The UK has one of the strongest AI markets globally, with continuing government support and a thriving academic and innovative technology industry. To successfully bring AI-based systems to the general public, and to ensure adoption and sufficient trust, there is a need for AI assurance.

We are seeking applications from motivated individuals with the potential to contribute to this important scientific effort. The AI Assurance EngD will provide the successful applicant with key skills to unlock this potential. This EngD studentship aims to train the first generation of technical leaders specialising in the area of AI assurance. It will be delivered through the Frazer-Nash Digital Systems Assurance group in partnership with the University of Warwick's WMG centre. The expected duration of the programme, and of the funding, is 4 years. The EngD student will be office based in Bristol (hybrid), in a team of professional engineers, scientists and other recent graduates working on diverse AI and autonomy projects of interest to industry.

WMG, University of Warwick and Frazer-Nash Consultancy

WMG is an academic department at the University of Warwick and a leading international role model for successful collaboration between academia and the public and private sectors, driving innovation in science, technology and engineering to develop the brightest ideas and talent that will shape our future. Frazer-Nash is a systems-engineering and technology solutions consultancy, supporting key UK and international organisations in sectors such as energy, defence, space, transport and healthcare (https://www.fnc.co.uk/). We are seeing high interest and rapid growth in the area of AI assurance and see this as a major growth area.

Research Objectives

The assurance of AI-based systems is a complex challenge that requires specialist skills to understand: the specific risks that AI poses, the values and principles that we should be aiming to meet, the tools and techniques needed to assess such systems, and the current standards and regulations that developers will need to meet. The research objectives are:

- Critically analyse current research in verification and assurance of AI systems, including tools and techniques for verification of complex systems and techniques for assessing AI systems
- Develop a framework for risk assessment of AI-based systems
- Develop and test methods for runtime monitoring of AI systems
- Analyse the latest AI standards and verification tools and techniques
- Develop an application mapping strategy that links verification techniques to AI technology and relevant standards

Essential and Desirable Criteria


A good BSc or higher in Computer Science, Engineering, Cyber Security, Mathematics, Statistics or a similar discipline. Experience of working in teams and meeting deadlines.

Funding and Eligibility

Funding is available to applicants with eligible Home fee status.

To apply

To apply, please complete our online enquiry form and upload your CV, transcripts and certificates of previous studies to allow us to assess your suitability for this specific EngD.

Please ensure you meet the minimum requirements before filling in the online form.

Key Information

Funding Source: Company Sponsor

Stipend: £15,285

Supporting company: Frazer-Nash Consultancy

Supervisor: Professor Carsten Maple

Available to students with Home fee status and UK-domiciled EU students

Start date: June 2024