Research Engineer, Agentic Critical-Systems Cyber-Security

The Alan Turing Institute

UK · On-site · Full-time · £46k – £51k/yr

About the role

This Research Engineer role sits within The Alan Turing Institute’s Defence and National Security programme and focuses on designing, building and evaluating LLM-driven agentic systems to strengthen the cyber resilience of critical national infrastructure, particularly legacy operational technology and industrial control systems that cannot easily be patched or upgraded. The postholder will develop autonomous hardening and monitoring capabilities, create realistic digital twin and simulation environments, run robust large-scale evaluations, and produce high-quality research software, technical outputs, and publications, while working closely with multidisciplinary experts across AI, cyber security and critical systems. It’s a highly applied research role suited to someone excited by combining machine learning, software engineering and cyber security to solve complex real-world resilience challenges with national importance.

Your Profile

We’re looking for a technically strong and collaborative research engineer with a solid grounding in AI, machine learning, cyber or information security, computer science, engineering, or a related field, ideally educated to Master’s level or with equivalent experience. The ideal candidate will bring hands-on experience building LLM-powered agentic or tool-calling systems, strong scientific software development skills in Python and related frameworks, and a track record of delivering practical research or engineering outputs with measurable impact. They’ll be comfortable working across disciplines, communicating complex ideas clearly to both technical and non-technical audiences, and operating with a high degree of autonomy while contributing positively to a team environment. Experience with network security, cyber-physical systems, virtualisation, OT/ICS environments and research publications would be especially valuable, as would a thoughtful, evidence-led approach and eligibility for Security Check (SC) clearance.

How You Will Make an Impact

  • Collaboratively design, build and maintain agentic AI systems and evaluation suites that support CNI cyber resilience and legacy software hardening against known vulnerabilities.
  • Develop virtualisation-based digital twin environments and implement techniques for assuring operational continuity in the context of CNI.
  • Develop realistic virtualised environments with legacy OT/ICS hosts, including legacy operating system instances running OT/ICS control software, with representative OT protocol traffic.
  • Contribute to high-quality collaborative research as part of the Turing’s CNI resilience mission.
  • Develop CNI-relevant scenarios with clear threat models, operational continuity criteria and defensible assumptions; document limitations and known failure modes explicitly.
  • Implement reproducible evaluation pipelines: configuration-driven runs, dataset/version management, baseline implementations and auditable reporting outputs.
  • Develop and validate metrics and scoring methods.
  • Carry out analysis that supports credible interpretation of results (failure case analysis, ablations and sensitivity checks).

Application Procedure

If you are interested in this opportunity, please click the apply button below. You will need to register on the applicant portal and complete the application form including your CV and covering letter.

Your covering letter should focus on the following:

  • Your motivation for applying for this role
  • An overview of your experience developing software in a scientific computing context
  • Publication list (if not covered in your CV)

If you have questions about the role or would like to apply using a different format, please contact us at recruitment@turing.ac.uk.

CLOSING DATE FOR APPLICATIONS: SUNDAY 12 APRIL 2026 23:59 (LONDON, UK BST)

Terms and Conditions

This post is offered on a full-time, fixed-term basis until 31 March 2027. The annual salary is £45,505 – £51,241, plus excellent benefits including flexible working and family-friendly policies; see the Institute’s employee-only benefits guide for details.

Eligibility for Security Check (SC) clearance is a requirement for this role. Eligibility criteria and further information on the process can be found on the UK Government security vetting website.

The Alan Turing Institute is based at the British Library, in the heart of London’s Knowledge Quarter. We expect staff to come to our office at least 4 days per month. Some roles may require more days in the office; the hiring manager will be able to confirm this during the interview.

Security Clearance

The successful candidate may be required to undergo a pre-screening check prior to an offer being made. This check will be carried out by HMG Defence and Security Partners. Please be advised that by submitting your application you are consenting to this check and to your personal details (full name, date of birth and home address) being passed on to our HMG Defence and Security Partners to carry it out.

Many roles in the Defence and National Security Programme require higher levels of National Security Vetting, for which applicants must typically have 5 to 10 years of continuous residency in the UK or a NATO country (depending on the vetting level required for the role) to allow for meaningful security vetting checks, amongst other factors. These roles are subject to security restrictions set by the Turing’s partners, which mean that factors such as your nationality, any nationalities you may previously have held, your foreign connections and your place of birth can affect your eligibility to perform the role.

Eligibility criteria and further information on the process can be found on the UK Government security vetting website. Applicants should check whether they are eligible to apply for SC clearance before applying to this role.

Equality, Diversity and Inclusion

We value diversity of background, experience, and perspective, and are proud to be an inclusive employer. We warmly encourage applications from all backgrounds, particularly from groups currently under-represented in our sector. If you feel passionate about this role but don’t meet every single requirement, please apply – we recognise that great candidates may bring strengths beyond the criteria listed.

We are committed to making sure our recruitment process is accessible and inclusive. This includes making reasonable adjustments for candidates who have a disability or long-term condition. Please contact us at recruitment@turing.ac.uk to advise us how we can assist you.

Please note all offers of employment are subject to obtaining and retaining the right to work in the UK and satisfactory pre-employment security screening which includes a DBS Check.

Full details on the pre-employment screening process can be requested from HR@turing.ac.uk.

Requirements

  • Educated to Master’s level, or with equivalent experience, in AI, machine learning, cyber or information security, computer science, engineering, or a related field
  • Hands-on experience building LLM-powered agentic or tool-calling systems
  • Strong scientific software development skills in Python and related frameworks
  • Track record of delivering practical research or engineering outputs with measurable impact

Benefits

Flexible working · Family-friendly policies · Employee-only benefits guide

Skills

AI · Machine learning · Cyber security · Software engineering · Python · Network security · Cyber-physical systems · Virtualisation · OT/ICS environments · Research publications
