<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>8fa7eb38-b7e</externalid>
      <Title>Guidance, Navigation and Control (GNC) Engineer - Tactical Reconnaissance &amp; Strike</Title>
      <Description><![CDATA[<p>We are seeking a Guidance, Navigation and Control (GNC) Engineer to join our Tactical Recon &amp; Strike team. As a GNC Engineer, you will be responsible for developing guidance algorithms and state estimators for group 1-3 UAV platforms. You will work closely with the flight software and computer vision team to define requirements, develop GNC software and validate functionality in simulation on aggressive timelines.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Develop guidance algorithms (terminal and midcourse) and state estimators for group 1-3 UAV platforms</li>
<li>Work closely with the flight software and computer vision team to define requirements, develop GNC software and validate functionality in simulation on aggressive timelines</li>
<li>Review test data captured from the flight control system, subsystems, and other test instrumentation to verify vehicle performance, evaluate GNC algorithm/controller behaviour, or debug incidents during testing or from fielded assets</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Bachelor&#39;s degree in Robotics, Mechanical Engineering, Electrical Engineering, Aerospace Engineering, or a related field with a focus on dynamic systems and control</li>
<li>3+ years professional experience in a GNC role, preferably focusing on missile or weapons systems</li>
<li>Experience with tactical guidance algorithms and target state estimation development</li>
<li>Experience with state estimation and filtering (Kalman filtering, sensor fusion, complementary filters, etc.)</li>
<li>Experience in modelling and simulation of linear and nonlinear dynamic systems and model linearisation</li>
<li>Experience coding in Matlab and Simulink</li>
<li>C/C++ Proficiency</li>
<li>Eligible to obtain and maintain an active U.S. Secret security clearance</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Master&#39;s Degree or PhD in Robotics, Mechanical Engineering, Electrical Engineering, Aerospace Engineering, or a related field with a focus on dynamic systems and control</li>
<li>5+ years professional experience in a GNC role, preferably focusing on missile or weapons systems</li>
<li>Experience with flight test and debugging GNC on UAV platforms</li>
<li>Experience with GNC design and analysis for fixed wing, rotary wing aircraft and/or tactical missile systems</li>
<li>Experience with seeker integration</li>
<li>Experience in one or more of the following: VIO, TERCOM, gimbal mount models, detection and tracking, LWIR/EO sensors, RADAR, target motion models</li>
<li>Experience with Simulink Embedded Coder</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$166,000-$220,000 USD</Salaryrange>
      <Skills>Matlab, Simulink, C/C++, Kalman filtering, Sensor fusion, Complementary filters, Model linearisation, Tactical guidance algorithms, Target state estimation development, State estimation and filtering, VIO, TERCOM, Gimbal mount models, Detection and tracking, LWIR/EO sensors, RADAR, Target motion models, Simulink Embedded Coder</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anduril Industries</Employername>
      <Employerlogo>https://logos.yubhub.co/andurilindustries.com.png</Employerlogo>
      <Employerdescription>Anduril Industries is a defense technology company that transforms U.S. and allied military capabilities with advanced technology.</Employerdescription>
      <Employerwebsite>https://www.andurilindustries.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/andurilindustries/jobs/5060107007</Applyto>
      <Location>Costa Mesa, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>19b66bec-a6b</externalid>
      <Title>Research Engineer / Scientist (SLAM)</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Research Engineer/Scientist to design, implement, and advance state-of-the-art simultaneous localization and mapping systems. This role is focused on modern SLAM techniques, both classical and learning-based, with an emphasis on scalable state estimation, sensor fusion, and long-term mapping in complex, dynamic environments.</p>
<p>As a Research Engineer/Scientist, you will:</p>
<ul>
<li>Design and implement modern SLAM systems for real-world environments, including visual, visual-inertial, lidar, or multi-sensor configurations.</li>
<li>Develop robust localization and mapping pipelines, including pose estimation, map management, loop closure, and global optimization.</li>
<li>Research and prototype learning-based or hybrid SLAM approaches that combine classical geometry with modern machine learning methods.</li>
<li>Build and maintain scalable state estimation frameworks, including factor graph optimization, filtering, and smoothing techniques.</li>
<li>Develop sensor fusion strategies that integrate cameras, IMUs, depth sensors, lidar, or other modalities to improve robustness and accuracy.</li>
<li>Analyze failure modes in real-world SLAM deployments (e.g., perceptual aliasing, dynamic scenes, drift) and design principled solutions.</li>
<li>Create evaluation frameworks, benchmarks, and metrics to measure SLAM accuracy, robustness, and performance across large datasets.</li>
<li>Optimize performance across the stack, including real-time constraints, memory usage, and compute efficiency, for large-scale and production systems.</li>
<li>Collaborate with reconstruction, simulation, and infrastructure teams to ensure SLAM outputs integrate cleanly with downstream world modeling and rendering pipelines.</li>
<li>Contribute to technical direction by proposing new research ideas, mentoring teammates, and helping define best practices for localization and mapping across the organization.</li>
</ul>
<p>We&#39;re looking for someone with 6+ years of experience working on SLAM, state estimation, robotics perception, or related areas. A strong foundation in probabilistic estimation, optimization, and geometric vision is required, as well as proficiency in Python and/or C++.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$250,000-$350,000 base salary (good-faith estimate for San Francisco Bay Area upon hire; actual offer based on experience, skills, and qualifications)</Salaryrange>
      <Skills>SLAM, state estimation, robotics perception, probabilistic estimation, optimization, geometric vision, Python, C++</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>World Labs</Employername>
      <Employerlogo>https://logos.yubhub.co/worldlabs.ai.png</Employerlogo>
      <Employerdescription>World Labs builds foundational world models that can perceive, generate, reason, and interact with the 3D world.</Employerdescription>
      <Employerwebsite>https://worldlabs.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/worldlabs/jobs/4135311009</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>d04b0aff-768</externalid>
      <Title>Senior Engineer, Radar Modeling &amp; Simulation</Title>
      <Description><![CDATA[<p>The Software Integration &amp; Operations group turns frontier autonomy into mission-ready aircraft. We own the commit-to-flight pipeline: deterministic aircraft and mission simulation, HIL/VIL integration, CI/CD, automated flight qualification testing, and release engineering. Our goal is simple: make AI fly safely, repeatably, and fast.</p>
<p>As a Modeling &amp; Simulation Engineer, you will be responsible for improving and extending our sensor and communications model suite so that simulation results translate seamlessly to real-world performance for operator training and internal engineering pipelines.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Develop and enhance radar sensor models for use in simulation and evaluation of aeronautical vehicles.</li>
<li>Translate theoretical models into efficient, reliable C++ implementations with a focus on numerical accuracy and performance.</li>
<li>Validate models against real-world data and authoritative references, including field test data and calibration procedures.</li>
<li>Collaborate with simulation and training application teams to ensure models integrate cleanly into operator-facing tools.</li>
<li>Design automated validation and regression testing strategies for mathematical models to ensure fidelity across releases.</li>
<li>Prototype and evaluate new modeling techniques (reduced-order models, uncertainty quantification, machine learning–based surrogates) to push the state of the art.</li>
<li>Document assumptions, equations, and validation results so that both engineers and operators can trust model outputs.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>BS or higher in Aerospace Engineering, Applied Math, Physics, or related field with 5+ years of aerospace modeling experience.</li>
<li>C++ foundation with experience implementing numerical methods.</li>
<li>Demonstrated experience with aerospace models such as radar sensors and radio communications systems.</li>
<li>Experience validating simulations against real-world or experimental data.</li>
<li>Ability to write clear documentation explaining assumptions, limitations, and expected behaviors of models.</li>
</ul>
<p><strong>Preferences</strong></p>
<ul>
<li>1+ years of experience working on pilot/operator training systems.</li>
<li>Experience with Eigen or SciPy for model prototyping and validation.</li>
<li>Familiarity with state estimation sensor models (GPS, IMU, Gyro, etc) for simulation environments.</li>
<li>Demonstrated experience with payload sensor models, including laser sensors and IR/optical cameras.</li>
<li>Knowledge of uncertainty quantification and statistical analysis methods.</li>
<li>Experience with parallelization or GPU acceleration for compute-heavy models.</li>
<li>Strong problem-solving mindset with a collaborative and detail-oriented approach.</li>
<li>Familiarity with Python for test automation and data analysis pipelines.</li>
<li>Passion for aerospace and autonomous vehicle systems.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$105,000 - $200,000 a year</Salaryrange>
      <Skills>C++, Numerical methods, Aerospace models, Radar sensors, Radio communications systems, Eigen, SciPy, State estimation sensor models, Laser sensors, IR and optical cameras, Uncertainty quantification, Statistical analysis methods, Parallelization, GPU acceleration, Python, Test automation, Data analysis pipelines</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI's X-BAT Division develops software for autonomous aircraft systems.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/5118a08f-4ae8-431f-a06f-6dba3eaff113</Applyto>
      <Location>Dallas, Texas / San Diego, California</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>98550091-2de</externalid>
      <Title>Staff Engineer, State Estimation</Title>
      <Description><![CDATA[<p>As a Staff State Estimation Engineer, you will play a critical role on the GNC team, contributing to the development, optimisation, and deployment of advanced sensor fusion and navigation algorithms for autonomous UAV operations in dynamic and contested environments.</p>
<p>Your work will support the transition of cutting-edge research into fielded capabilities, helping Shield AI deliver precision navigation solutions for mission-critical applications.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop and implement real-time state estimation algorithms including inertial navigation, sensor fusion, and alternative navigation methods for GPS-denied or degraded environments.</li>
<li>Integrate data from IMUs, GNSS receivers, visual odometry, magnetometers, barometers, and radar into robust estimation frameworks.</li>
<li>Design sensor processing pipelines focused on accuracy, robustness, and system-level fault tolerance.</li>
<li>Collaborate with autonomy, software, and hardware teams to ensure end-to-end integration of navigation and PNT systems.</li>
<li>Conduct simulation, lab testing, and field trials to evaluate algorithm performance under real-world conditions.</li>
<li>Stay current on advancements in state estimation and navigation technologies and help adapt new innovations into deployable solutions.</li>
</ul>
<p><strong>Qualifications:</strong></p>
<ul>
<li>Typically requires a minimum of 7 years of relevant experience with a bachelor’s degree; or 6 years with a master’s degree; or 4 years with a PhD; or equivalent practical experience.</li>
<li>Experience developing and deploying real-time navigation or sensor fusion algorithms using IMUs, GPS, or other sensors.</li>
<li>Strong understanding of filtering and estimation techniques (e.g., Kalman filters, EKF, UKF, particle filters).</li>
<li>Proficient in C++11 or newer in real-time environments.</li>
<li>Comfortable working in Linux, with experience using standard command-line tools and scripting.</li>
<li>Strong written and verbal communication skills with a collaborative mindset.</li>
<li>Demonstrated success working in fast-paced development cycles and delivering high-quality results.</li>
</ul>
<p><strong>Salary:</strong></p>
<p>$187,531 - $281,297 a year</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$187,531 - $281,297 a year</Salaryrange>
      <Skills>state estimation, sensor fusion, inertial navigation, Kalman filters, C++11, Linux, visual odometry, computer vision, CUDA, hardware acceleration</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015 with a mission to protect service members and civilians with intelligent systems.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/f8849287-b9ff-4c3e-a37f-be20e39c597b</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>5f911dd8-860</externalid>
      <Title>Senior Staff Engineer, Software - Perception</Title>
      <Description><![CDATA[<p>This role is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop advanced perception algorithms: design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks: integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>
<li>Develop state estimation capabilities: design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>
<li>Analyze and utilize sensor ICDs: interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance: tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration: work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings: leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams: ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing: contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel requirement: members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>
<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>
<li>Ability to obtain a SECRET clearance</li>
</ul>
<p><strong>Preferences:</strong></p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems</li>
<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks</li>
<li>Experience deploying perception software on SWaP-constrained platforms</li>
<li>Familiarity with validating perception systems during flight test events or operational environments</li>
<li>Understanding of sensing challenges in denied or degraded conditions</li>
<li>Exposure to perception applications across air, maritime, and ground platforms</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$220,800 - $331,200 a year</Salaryrange>
      <Skills>algorithm development, sensor fusion, state estimation, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, version control, debugging, test-driven development, hands-on integration with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, perception software deployment on SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/5cf8609e-ce9a-47e9-8956-00dae756e406</Applyto>
      <Location>San Diego, California / Washington, DC / Boston, MA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>3dc40911-47e</externalid>
      <Title>Engineering Lead, Autonomy Software</Title>
      <Description><![CDATA[<p>Lead the development of cutting-edge autonomy software that enables unmanned systems to operate intelligently in complex, real-world environments.</p>
<p>In this role, you will guide multidisciplinary teams to design, build, and deploy high-performance autonomy solutions, from algorithm development to system integration and field testing. Working at the intersection of robotics, aerospace, and software engineering, you’ll drive mission-critical capabilities from concept to flight, delivering resilient, scalable systems that perform in dynamic and contested conditions.</p>
<p>Responsibilities:</p>
<ul>
<li><p>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution. Balance hands-on technical oversight with performance optimization, innovation, and clear stakeholder communication.</p>
</li>
<li><p>Design tactical autonomy algorithms to enable unmanned aircraft to perform complex missions across air, land, and sea domains with minimal human supervision.</p>
</li>
<li><p>Develop high-performance software modules that incorporate planning, decision-making, and behavior execution strategies for dynamic and adversarial environments.</p>
</li>
<li><p>Implement and test behavior architectures that enable multi-agent coordination, target engagement, reconnaissance, and survivability in contested scenarios.</p>
</li>
<li><p>Collaborate with cross-functional teams including perception, planning, simulation, hardware, and flight test to ensure seamless integration of autonomy solutions on real-world platforms.</p>
</li>
<li><p>Deploy autonomy capabilities to real platforms and participate in field tests and flight demos, validating performance in operationally relevant conditions.</p>
</li>
</ul>
<p>Required qualifications:</p>
<ul>
<li><p>A tertiary level qualification in Computer Science, Mechatronics, Software Engineering, Robotics or a related field</p>
</li>
<li><p>Significant professional experience in robotics, autonomy, perception or aerospace systems</p>
</li>
<li><p>Strong experience in modern C++</p>
</li>
<li><p>Experience leading teams to deliver engineering projects</p>
</li>
<li><p>Significant experience in building and delivering reliable software systems, ideally in fast-paced environments</p>
</li>
</ul>
<p>Preferred qualifications:</p>
<ul>
<li><p>Prior experience with uncrewed systems, especially in the air domain</p>
</li>
<li><p>Defence industry experience</p>
</li>
<li><p>Significant experience in one or more of the following domains:</p>
<ul>
<li><p>State Estimation</p>
</li>
<li><p>Real-Time Systems</p>
</li>
<li><p>Guidance, Navigation and Control</p>
</li>
<li><p>Path Planning</p>
</li>
</ul>
</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>C++, Robotics, Autonomy, Perception, Aerospace Systems, Software Engineering, Team Leadership, Project Management, Uncrewed Systems, Defence Industry, State Estimation, Real-Time Systems, Guidance, Navigation and Control, Path Planning</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015. It develops intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/28ebb068-b00d-4a34-a4d7-a471c84e09ff</Applyto>
      <Location>Melbourne</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>006babdb-38a</externalid>
      <Title>Principal Engineer, State Estimation</Title>
      <Description><![CDATA[<p>We are looking for an experienced state estimation engineer to design, develop, and support the deployment of safety-critical navigation solutions for aerospace platforms. This role requires deep expertise in state estimation theory, extensive practical experience implementing ownship navigation systems in the aerospace domain, and strong software development skills for production-quality navigation code.</p>
<p><strong>Navigation System Architecture &amp; Design:</strong></p>
<p>Establish navigation performance requirements and error budgets for safety-critical applications. Support decomposition of navigation requirements into allocations for sensors, estimation algorithms, and software components. Design detailed software architecture for state estimation implementations, including module interfaces and data flow.</p>
<p><strong>Algorithm Development &amp; Implementation:</strong></p>
<p>Design and implement Extended Kalman Filter algorithms for navigation applications, with broad understanding of state estimation theory and alternative filtering approaches. Implement tightly-coupled and loosely-coupled GNSS/INS integration algorithms. Integrate diverse sensing modalities (vision, RF, celestial, etc.) into a multi-sensor fusion framework for GPS-degraded environments. Develop fault detection, isolation, and recovery (FDIR) strategies for navigation systems. Implement integrity monitoring and protection level calculations for safety-critical operations.</p>
<p><strong>Certification &amp; Verification:</strong></p>
<p>Develop verification and validation test plans for navigation algorithms. Conduct performance analysis including Monte Carlo simulation, covariance analysis, and flight test data evaluation. Document navigation system design, requirements allocation, and compliance evidence. Support safety assessment activities including failure modes and effects analysis.</p>
<p><strong>Technical Leadership:</strong></p>
<p>Provide technical guidance on navigation architecture and state estimation approaches. Support trade studies evaluating navigation sensor suites and fusion strategies. Mentor junior engineers on state estimation theory and implementation.</p>
<p><strong>Required qualifications:</strong></p>
<ul>
<li>MS or PhD in Computer Science, Software Engineering, Electrical Engineering, Aerospace Engineering, Mechanical Engineering, Applied Mathematics, or related field</li>
<li>15+ years of experience developing state estimation algorithms for aerospace navigation applications</li>
<li>Demonstrated experience implementing GNSS/INS integration solutions</li>
<li>Experience with safety-critical system development and certification processes</li>
<li>Deep understanding of state estimation theory</li>
<li>Experience implementing multi-sensor fusion algorithms in production systems</li>
<li>Strong background in inertial navigation, GNSS positioning, and sensor error modeling</li>
<li>Strong programming skills in C/C++ and Python/MATLAB for algorithm development and analysis</li>
<li>Understanding of integrity monitoring, protection levels, and safety assessment methods</li>
<li>Experience with requirements management and verification/validation processes for certifiable systems</li>
<li>Understanding of GPS/GNSS signal structure, error sources, and performance characteristics</li>
<li>Knowledge of IMU error models, calibration, and Allan variance analysis</li>
<li>Familiarity with alternative navigation sensors (camera, RF ranging, celestial, etc.)</li>
<li>Understanding of navigation performance metrics (accuracy, integrity, continuity, availability)</li>
</ul>
<p><strong>Preferred qualifications:</strong></p>
<ul>
<li>PhD in relevant field with focus on state estimation or navigation</li>
<li>Direct involvement in certified navigation system development from requirements through flight test</li>
<li>Experience with specialized navigation approaches (vision-aided navigation, terrain-referenced navigation, celestial navigation, etc.)</li>
<li>Publications or patents in navigation or state estimation</li>
<li>Experience with GPS/GNSS jamming and spoofing mitigation techniques</li>
<li>Clearance eligible or active security clearance</li>
</ul>
<p>The salary for this position is $270,000 - $400,000 a year.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$270,000 - $400,000 a year</Salaryrange>
      <Skills>state estimation theory, GNSS/INS integration, safety-critical system development, multi-sensor fusion algorithms, inertial navigation, GPS/GNSS signal structure, IMU error models, alternative navigation sensors, navigation performance metrics, PhD in relevant field, certified navigation system development, specialized navigation approaches, publications or patents, GPS/GNSS jamming and spoofing mitigation techniques</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015 with a mission to protect service members and civilians with intelligent systems.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/4ccdcce2-ce09-4180-aba4-01f3e405e0e5</Applyto>
      <Location>San Diego, California</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>bed4759c-578</externalid>
      <Title>Staff Engineer, Software - Perception</Title>
      <Description><![CDATA[<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop advanced perception algorithms: Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks: Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>
<li>Develop state estimation capabilities: Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>
<li>Analyze and utilize sensor ICDs: Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance: Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration: Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings: Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams: Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing: Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel requirement: Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p><strong>Required Qualifications:</strong></p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>
<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>
<li>Ability to obtain a SECRET clearance</li>
</ul>
<p><strong>Preferred Qualifications:</strong></p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems</li>
<li>Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks</li>
<li>Experience deploying perception software on SWaP-constrained platforms</li>
<li>Familiarity with validating perception systems during flight test events or operational environments</li>
<li>Understanding of sensing challenges in denied or degraded conditions</li>
<li>Exposure to perception applications across air, maritime, and ground platforms</li>
</ul>
<p>$182,720 - $274,080 a year</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$182,720 - $274,080 a year</Salaryrange>
      <Skills>real-time object detection, sensor fusion, state estimation algorithms, EO/IR cameras, radars, IMUs, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, Interface Control Documents, hardware integration specs, version control, debugging, test-driven development, hands-on integration or algorithm development with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, vision-based object detection or classification tasks, SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/8739c509-b6ea-4640-bcc1-c8b5b1de31b2</Applyto>
      <Location>San Diego, California / Washington, DC / Boston, MA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>8c44b3ba-384</externalid>
      <Title>Senior Engineer, Aeronautical Modeling &amp; Simulation</Title>
      <Description><![CDATA[<p>The Software Integration &amp; Operations group turns frontier autonomy into mission-ready aircraft. We own the commit-to-flight pipeline,deterministic aircraft and mission simulation, HIL/VIL integration, CI/CD, automated flight qualification testing, and release engineering. Our goal is simple: make AI fly,safely, repeatably, and fast.</p>
<p>As a Modeling &amp; Simulation Engineer, you will be responsible for improving and adding to our world and aeronautical models so that our operator training and internal engineering pipelines have a seamless translation from sim to real results.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Develop and enhance aeronautical physics models (aerodynamics, propulsion, environmental, structural, etc.) for use in simulation and evaluation.</li>
<li>Translate theoretical models into efficient, reliable C++ implementations with a focus on numerical accuracy and performance.</li>
<li>Validate models against real-world data and authoritative references, including field test data and calibration procedures.</li>
<li>Collaborate with simulation and training application teams to ensure models integrate cleanly into operator-facing tools.</li>
<li>Design automated validation and regression testing strategies for mathematical models to ensure fidelity across releases.</li>
<li>Prototype and evaluate new modeling techniques (reduced-order models, uncertainty quantification, machine learning–based surrogates) to push the state of the art.</li>
<li>Document assumptions, equations, and validation results so that both engineers and operators can trust model outputs.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$105,000 - $155,000 a year</Salaryrange>
      <Skills>C++, Aerodynamics, Atmosphere/environment, Vehicle dynamics, Eigen, SciPy, State estimation sensor models, Multi-body dynamics, Flight mechanics, Uncertainty quantification, Statistical analysis methods, Python</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>X-BAT Division – X-BAT Engineering - Software</Employername>
      <Employerlogo>https://logos.yubhub.co/bit.ly.png</Employerlogo>
      <Employerdescription>X-BAT Engineering develops software for mission-ready aircraft.</Employerdescription>
      <Employerwebsite>http://bit.ly/shieldai_lever_homepage</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/899822bb-7c18-45c5-ac34-7b84cc689f9e</Applyto>
      <Location>Dallas</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>690339e7-e86</externalid>
      <Title>Senior Software Engineer, Autonomy - Calibration, Mapping &amp; Localization</Title>
      <Description><![CDATA[<p>About Cyngn</p>
<p>Based in Mountain View, CA, Cyngn is a publicly-traded autonomous technology company. We deploy self-driving industrial vehicles like forklifts and tuggers to factories, warehouses, and other facilities throughout North America.</p>
<p>To build this emergent technology, we are looking for innovative, motivated, and experienced leaders to join us and move this field forward. If you like to build, tinker, and create with a team of trusted and passionate colleagues, then Cyngn is the place for you.</p>
<p>Key reasons to join Cyngn:</p>
<p>We are small and big. With under 100 employees, Cyngn operates with the energy of a startup. On the other hand, we’re publicly traded. This means our employees not only work in close-knit teams with mentorship from company leaders; they also get access to the liquidity of our publicly-traded equity.</p>
<p>We build today and deploy tomorrow. Our autonomous vehicles aren’t just test concepts; they’re deployed to real clients right now. That means your work will have a tangible, visible impact.</p>
<p>We aren’t robots. We just develop them. We’re a welcoming, diverse team of sharp thinkers and kind humans. Collaboration and trust drive our creative environment. At Cyngn, everyone’s perspective matters, and that’s what powers our innovation.</p>
<p>About this role:</p>
<p>As a Staff/Senior Software Engineer on our Calibration, Localization, &amp; Mapping (CLAM) team, you will be responsible for delivering mission-critical improvements and new features to our calibration, localization, and mapping subsystems. You will work on a small, highly focused team developing production-quality software that enables efficient and accurate creation of HD maps at Cyngn deployment sites and robust localization for Cyngn’s autonomous vehicle fleets.</p>
<p>Responsibilities</p>
<ul>
<li><p>Design, implement, tune, and test mapping, localization, and sensor calibration algorithms for our autonomous vehicle platforms using C++ and Python.</p>
</li>
<li><p>Develop tooling and metrics for performance validation and continuous testing frameworks.</p>
</li>
<li><p>Balance project tasks, code reviews, and research to meet product-driven milestones in a fast-paced startup environment.</p>
</li>
</ul>
<p>Qualifications</p>
<ul>
<li><p>MS/PhD with a focus in robotics or a similar technical field of study</p>
</li>
<li><p>Solid foundation in probability theory, linear algebra, 3D geometry, and spatial coordinate transformations.</p>
</li>
<li><p>In-depth understanding of matrix factorization algorithms and Lie algebra/groups.</p>
</li>
<li><p>Solid theoretical knowledge of state-of-the-art techniques in 3D Lidar-based mapping and localization for autonomous vehicles (LOAM series, GICP, FastLIO, bundle-adjustment)</p>
</li>
<li><p>Familiarity with state estimation frameworks such as EKFs as well as modern nonlinear optimization libraries (GTSAM, G2O, Ceres-Solver, GNC-Solver, etc.)</p>
</li>
<li><p>6+ years of industry experience as an autonomous vehicle or robotics software engineering professional including hands-on implementation and tuning on production hardware.</p>
</li>
<li><p>6+ years industry experience writing C++ software in a production environment - architecture design, unit testing, code review, algorithm performance trade-offs, etc.</p>
</li>
<li><p>Proficiency in Python.</p>
</li>
<li><p>Excellent written &amp; verbal communication skills.</p>
</li>
</ul>
<p>Bonus Qualifications</p>
<ul>
<li><p>Proven record of top-tier publications or patents.</p>
</li>
<li><p>Experience with GPU programming, CUDA.</p>
</li>
<li><p>Experience in implementing automated map change detection and updating techniques.</p>
</li>
<li><p>Experience implementing modern multi-sensor calibration and sensor mis-alignment detection algorithms.</p>
</li>
<li><p>Experience with camera-based SLAM and 3D multi-view geometry.</p>
</li>
<li><p>Experience working with ROS2 to design, build, and operate robotic systems.</p>
</li>
<li><p>Exposure to modern software development version control and project management tools - Git, Jira, etc.</p>
</li>
</ul>
<p>Benefits &amp; Perks</p>
<ul>
<li><p>Health benefits (Medical, Dental, Vision, HSA and FSA (Health &amp; Dependent Daycare), Employee Assistance Program, 1:1 Health Concierge)</p>
</li>
<li><p>Life, Short-term and long-term disability insurance (Cyngn funds 100% of premiums)</p>
</li>
<li><p>Company 401(k)</p>
</li>
<li><p>Commuter Benefits</p>
</li>
<li><p>Flexible vacation policy</p>
</li>
<li><p>Sabbatical leave opportunity after 5 years with the company</p>
</li>
<li><p>Paid Parental Leave</p>
</li>
<li><p>Daily lunches for in-office employees and fully-stocked kitchen with snacks and beverages</p>
</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$180,000 - $198,000 per year</Salaryrange>
      <Skills>C++, Python, Probability theory, Linear algebra, 3D geometry, Spatial coordinate transformations, Matrix factorization algorithms, Lie algebra/groups, State estimation frameworks, Nonlinear optimization libraries, GPU programming, CUDA, Automated map change detection and updating techniques, Modern multi-sensor calibration and sensor mis-alignment detection algorithms, Camera-based SLAM and 3D multi-view geometry, ROS2, Git, Jira</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Cyngn</Employername>
      <Employerlogo>https://logos.yubhub.co/cyngn.com.png</Employerlogo>
      <Employerdescription>Cyngn is a publicly-traded autonomous technology company that deploys self-driving industrial vehicles to factories, warehouses, and other facilities throughout North America.</Employerdescription>
      <Employerwebsite>https://www.cyngn.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/cyngn/716dbe41-cac5-4d23-9ec3-cc05b32322b4</Applyto>
      <Location>Mountain View</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
  </jobs>
</source>