{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/kalman-filter"},"x-facet":{"type":"skill","slug":"kalman-filter","display":"Kalman Filter","count":11},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8fa7eb38-b7e"},"title":"Guidance, Navigation and Control (GNC) Engineer -  Tactical Reconnaissance & Strike","description":"<p>We are seeking a Guidance, Navigation and Control (GNC) Engineer to join our Tactical Recon &amp; Strike team. As a GNC Engineer, you will be responsible for developing guidance algorithms and state estimators for group 1-3 UAV platforms. You will work closely with the flight software and computer vision team to define requirements, develop GNC software and validate functionality in simulation on aggressive timelines.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Develop guidance algorithms (terminal and midcourse) and state estimators for group 1-3 UAV platforms</li>\n<li>Work closely with the flight software and computer vision team to define requirements, develop GNC software and validate functionality in simulation on aggressive timelines</li>\n<li>Review test data captured from the flight control system, subsystems, and other test instrumentation to verify vehicle performance, evaluate GNC algorithm/controller behaviour, or debug incidents during testing or from fielded assets</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s degree in Robotics, Mechanical Engineering, Electrical Engineering, Aerospace Engineering, or a related field with a focus on dynamic systems and control</li>\n<li>3+ years professional experience in a GNC role, preferably focusing on missile or weapons systems</li>\n<li>Experience with tactical guidance algorithms and target state estimation development</li>\n<li>Experience with state estimation and filtering</li>\n<li>Kalman filtering, sensor fusion, complementary filters, etc.</li>\n<li>Experience in modelling and simulation of linear and nonlinear dynamic systems and model linearisation</li>\n<li>Experience coding in Matlab and Simulink</li>\n<li>C/C++ Proficiency</li>\n<li>Eligible to obtain and maintain an active U.S. 
Secret security clearance</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Master&#39;s Degree or PhD in Robotics, Mechanical Engineering, Electrical Engineering, Aerospace Engineering, or a related field with a focus on dynamic systems and control</li>\n<li>5+ years professional experience in a GNC role, preferably focusing on missile or weapons systems</li>\n<li>Experience with flight test and debugging GNC on UAV platforms</li>\n<li>Experience with GNC design and analysis for fixed wing, rotary wing aircraft and/or tactical missile systems</li>\n<li>Experience with seeker integration</li>\n<li>Experience in one or more of the following: VIO, TERCOM, gimbal mount models, detection and tracking, LWIR/EO sensors, RADAR, target motion models</li>\n<li>Experience with Simulink Embedded Coder</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_8fa7eb38-b7e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.andurilindustries.com/","logo":"https://logos.yubhub.co/andurilindustries.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5060107007","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$166,000-$220,000 USD","x-skills-required":["Matlab","Simulink","C/C++","Kalman filtering","Sensor fusion","Complementary filters","Model linearisation","Tactical guidance algorithms","Target state estimation development","State estimation and filtering"],"x-skills-preferred":["VIO","TERCOM","Gimbal mount models","Detection and tracking","LWIR/EO sensors","RADAR","Target motion models","Simulink Embedded Coder"],"datePosted":"2026-04-18T15:43:51.785Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Matlab, Simulink, C/C++, Kalman filtering, Sensor fusion, Complementary filters, Model linearisation, Tactical guidance algorithms, Target state estimation development, State estimation and filtering, VIO, TERCOM, Gimbal mount models, Detection and tracking, LWIR/EO sensors, RADAR, Target motion models, Simulink Embedded Coder","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":166000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_baf3d89c-84b"},"title":"Senior Manager, Perception","description":"<p>As a member of the HMS Perception team, you will conduct software development at the intersection of classical state estimation techniques, sensor fusion, artificial intelligence, machine learning, and machine perception. You will develop cutting-edge technology onto real hardware that provides robust and accurate estimates of vehicle pose and surroundings for real missions.</p>\n<p>Shield AI is pushing the envelope by applying advanced AI solutions to real hardware systems. 
An ideal candidate should aspire to be a part of this industry-changing team developing and deploying advanced technology that can truly make an impact.</p>\n<p>We are seeking a skilled and motivated leader with 10+ years of experience to manage a technical team supporting the development, integration, and testing of perception algorithms for advanced aerospace, defense, and robotics systems. In this role, you will contribute to implementing and integrating innovative perception solutions while collaborating with a multidisciplinary team of engineers to meet challenging operational requirements.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution. Balance hands-on technical oversight with performance optimization, innovation, and clear stakeholder communication.</li>\n</ul>\n<ul>\n<li>Write production quality software in C++</li>\n</ul>\n<ul>\n<li>Produce an Assured Position, Navigation, and Timing (A-PNT) system to enable reliable autonomy in GNSS-degraded or denied environments</li>\n</ul>\n<ul>\n<li>Extend and specialize Shield AI’s state-of-the-art state estimation framework for new sensors, platforms, and missions</li>\n</ul>\n<ul>\n<li>Write test code to validate your software with simulated and real-world data</li>\n</ul>\n<ul>\n<li>Collaborate with hardware and test teams to validate algorithms/code on aerial platforms</li>\n</ul>\n<ul>\n<li>Write analyzers to ingest data and produce statistics to validate code quality</li>\n</ul>\n<ul>\n<li>Enhance sensor models within a high-fidelity simulation environment</li>\n</ul>\n<ul>\n<li>Work in a fast-paced, collaborative, continuous development environment, enhancing analysis and benchmarking capabilities</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_baf3d89c-84b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/501e3703-1a63-4773-b961-6029e5fb71d6","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$229,233 - $343,849 a year","x-skills-required":["C++","Sensor fusion","Artificial intelligence","Machine learning","Machine perception","Kalman Filter","Factor Graphs","Computer Vision","OpenCV","Unix environments"],"x-skills-preferred":["Robotics technologies","Unmanned system technologies","High-fidelity simulation","Sensor modeling"],"datePosted":"2026-04-17T13:05:41.027Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Sensor fusion, Artificial intelligence, Machine learning, Machine perception, Kalman Filter, Factor Graphs, Computer Vision, OpenCV, Unix environments, Robotics technologies, Unmanned system technologies, High-fidelity simulation, Sensor 
modeling","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":229233,"maxValue":343849,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_98550091-2de"},"title":"Staff Engineer, State Estimation","description":"<p>As a Staff State Estimation Engineer, you will play a critical role on the GNC team, contributing to the development, optimisation, and deployment of advanced sensor fusion and navigation algorithms for autonomous UAV operations in dynamic and contested environments.</p>\n<p>Your work will support the transition of cutting-edge research into fielded capabilities, helping Shield AI deliver precision navigation solutions for mission-critical applications.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop and implement real-time state estimation algorithms including inertial navigation, sensor fusion, and alternative navigation methods for GPS-denied or degraded environments.</li>\n<li>Integrate data from IMUs, GNSS receivers, visual odometry, magnetometers, barometers, and radar into robust estimation frameworks.</li>\n<li>Design sensor processing pipelines focused on accuracy, robustness, and system-level fault tolerance.</li>\n<li>Collaborate with autonomy, software, and hardware teams to ensure end-to-end integration of navigation and PNT systems.</li>\n<li>Conduct simulation, lab testing, and field trials to evaluate algorithm performance under real-world conditions.</li>\n<li>Stay current on advancements in state estimation and navigation technologies and help adapt new innovations into deployable solutions.</li>\n</ul>\n<p><strong>Qualifications:</strong></p>\n<ul>\n<li>Typically requires a minimum of 7 years of relevant experience with a bachelor’s degree; or 6 years with a master’s degree; or 4 years with a PhD; or equivalent practical experience.</li>\n<li>Experience developing and deploying real-time navigation or sensor fusion algorithms using IMUs, GPS, or other sensors.</li>\n<li>Strong understanding of filtering and estimation techniques (e.g., Kalman filters, EKF, UKF, particle filters).</li>\n<li>Proficient in C++11 or newer in real-time environments.</li>\n<li>Comfortable working in Linux, with experience using standard command-line tools and scripting.</li>\n<li>Strong written and verbal communication skills with a collaborative mindset.</li>\n<li>Demonstrated success working in fast-paced development cycles and delivering high-quality results.</li>\n</ul>\n<p><strong>Salary:</strong></p>\n<p>$187,531 - $281,297 a year</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_98550091-2de","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/f8849287-b9ff-4c3e-a37f-be20e39c597b","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$187,531 - $281,297 a year","x-skills-required":["state estimation","sensor fusion","inertial navigation","Kalman filters","C++11","Linux"],"x-skills-preferred":["visual odometry","computer vision","CUDA","hardware 
acceleration"],"datePosted":"2026-04-17T13:04:28.903Z","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"state estimation, sensor fusion, inertial navigation, Kalman filters, C++11, Linux, visual odometry, computer vision, CUDA, hardware acceleration","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":187531,"maxValue":281297,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3e1eaa7a-031"},"title":"Engineer, State Estimation","description":"<p>As a State Estimation Engineer, you will play a critical role on the GNC team, contributing to the development, optimisation, and deployment of advanced sensor fusion and navigation algorithms for autonomous UAV operations in dynamic and contested environments.</p>\n<p>Your work will support the transition of cutting-edge research into fielded capabilities, helping Shield AI deliver precision navigation solutions for mission-critical applications.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop and implement real-time state estimation algorithms including inertial navigation, sensor fusion, and alternative navigation methods for GPS-denied or degraded environments.</li>\n<li>Integrate data from IMUs, GNSS receivers, visual odometry, magnetometers, barometers, and radar into robust estimation frameworks.</li>\n<li>Design sensor processing pipelines focused on accuracy, robustness, and system-level fault tolerance.</li>\n<li>Collaborate with autonomy, software, and hardware teams to ensure end-to-end integration of navigation and PNT systems.</li>\n<li>Conduct simulation, lab testing, and field trials to evaluate algorithm performance under real-world conditions.</li>\n<li>Stay current on advancements in state estimation and navigation technologies and help adapt new innovations into deployable solutions.</li>\n</ul>\n<p><strong>Qualifications:</strong></p>\n<ul>\n<li>Typically requires a minimum of 3 years of relevant experience with a bachelor’s degree; or 2 years with a master’s degree; or 1 year with a PhD; or equivalent practical experience.</li>\n<li>Familiarity with algorithms.</li>\n<li>Proficient in C++11 or newer in real-time environments.</li>\n<li>Comfortable working in Linux, with experience using standard command-line tools and scripting.</li>\n<li>Strong written and verbal communication skills with a collaborative mindset.</li>\n<li>Demonstrated success working in fast-paced development cycles and delivering high-quality results.</li>\n</ul>\n<p><strong>Preferred Qualifications:</strong></p>\n<ul>\n<li>Experience developing and deploying real-time navigation or sensor fusion algorithms using IMUs, GPS, or other sensors.</li>\n<li>Strong understanding of filtering and estimation techniques (e.g., Kalman filters, EKF, UKF, particle filters).</li>\n<li>Experience implementing inertial navigation algorithms in degraded or GPS-denied conditions.</li>\n<li>Exposure to visual odometry or computer vision-based navigation approaches.</li>\n<li>Experience optimising code for performance on compute-constrained platforms.</li>\n<li>Familiarity with CUDA or hardware acceleration techniques (e.g., FPGAs).</li>\n<li>Experience 
transitioning navigation solutions from research into production environments.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3e1eaa7a-031","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/133ad6aa-d624-4fad-b1cc-1f8f42d0401f","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$120,000 - $250,000 a year","x-skills-required":["C++11","Linux","real-time environments","algorithmic thinking","strong written and verbal communication skills"],"x-skills-preferred":["Kalman filters","EKF","UKF","particle filters","visual odometry","computer vision-based navigation approaches"],"datePosted":"2026-04-17T13:04:23.277Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dallas/San Diego/Boston/DC/San Fran"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++11, Linux, real-time environments, algorithmic thinking, strong written and verbal communication skills, Kalman filters, EKF, UKF, particle filters, visual odometry, computer vision-based navigation approaches","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120000,"maxValue":250000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1044b51e-cc6"},"title":"Senior Manager, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. 
Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>\n<li>Develop advanced perception algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks by integrating data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques.</li>\n<li>Develop state estimation capabilities by designing and refining algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs.</li>\n<li>Analyze and utilize sensor ICDs to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance by tuning and evaluating perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration by working closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings by leveraging synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams to ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing by contributing novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience.</li>\n<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience.</li>\n<li>7+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D.</li>\n<li>2+ years of people leadership experience.</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>\n<li>Ability to obtain a SECRET clearance.</li>\n</ul>\n<p><strong>Preferences:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing 
systems.</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks.</li>\n<li>Experience deploying perception software on SWaP-constrained platforms.</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments.</li>\n<li>Understanding of sensing challenges in denied or degraded conditions.</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_1044b51e-cc6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/cebc0dd3-ffbf-4013-a2ad-ae32732cabd3","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$229,233 - $343,849 a year","x-skills-required":["BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree","10+ years of related experience","7+ years of experience in Unmanned Systems programs in the DoD or applied R&D","2+ years of people leadership experience","Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models","Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches","Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications","Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs","Proficiency with version control, debugging, and test-driven development in cross-functional teams","Ability to obtain a SECRET clearance"],"x-skills-preferred":["Hands-on integration or algorithm development with airborne sensing systems","Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks","Experience deploying perception software on SWaP-constrained platforms","Familiarity with validating perception systems during flight test events or operational environments","Understanding of sensing challenges in denied or degraded conditions","Exposure to perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:04:16.670Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC / San Diego, California / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, 10+ years of related experience, 7+ years of experience in Unmanned Systems programs in the DoD or applied R&D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications, Ability to interpret and work with Interface Control Documents (ICDs) and 
hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":229233,"maxValue":343849,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3f0b0cce-7be"},"title":"Manager, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.\nThe role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>We are seeking a skilled and motivated manager to lead technical teams and support direct projects integrating perception solutions for defense platforms.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land.\nOur Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Multidisciplinary Team Leadership – Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>\n<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>\n<li>5+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D</li>\n<li>2+ years of people leadership experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>\n<li>Ability to obtain a SECRET clearance.</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems.</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks.</li>\n<li>Experience deploying perception software on SWaP-constrained platforms.</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments.</li>\n<li>Understanding of sensing challenges in denied or degraded conditions.</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3f0b0cce-7be","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/1120529c-2f7d-4b27-a29b-50976c49c433","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$220,441 - $330,661 a year","x-skills-required":["BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience","Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s 
degree; or 4 years with a PhD; or equivalent work experience","5+ years of experience in Unmanned Systems programs in the DoD or applied R&D","2+ years of people leadership experience","Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models."],"x-skills-preferred":["Hands-on integration or algorithm development with airborne sensing systems","Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks","Experience deploying perception software on SWaP-constrained platforms","Familiarity with validating perception systems during flight test events or operational environments","Understanding of sensing challenges in denied or degraded conditions"],"datePosted":"2026-04-17T13:04:04.648Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience, 5+ years of experience in Unmanned Systems programs in the DoD or applied R&D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models., Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":220441,"maxValue":330661,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7da005da-ff5"},"title":"Senior Engineer, State Estimation","description":"<p>As a Senior Engineer, State Estimation, you will work on the GNC team to develop and optimise algorithms that process and fuse data from various sensors to provide accurate, reliable state estimates, enabling the X-BAT to operate autonomously in complex and contested environments.</p>\n<p>Your key responsibilities will include:</p>\n<p>Developing and implementing advanced sensor algorithms for processing data from IMUs, radar, cameras, GPS, and other sensors.\nEnhancing state estimation algorithms by integrating multi-sensor data for improved accuracy and robustness.\nDesigning and implementing real-time sensor data processing pipelines.\nCollaborating with cross-functional teams, including software engineers, autonomy researchers, and hardware engineers, to ensure seamless integration of state estimation algorithms.\nConducting experiments and field tests to validate the performance of state estimation algorithms in real-world scenarios.\nStaying updated with the latest advancements in sensor technologies and state estimation, applying them to our systems.</p>\n<p 
style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7da005da-ff5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/0c6acdd5-a39b-4ad3-84fa-b1a1f83409d3","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$160,000 - $240,000 a year","x-skills-required":["C++ 11 or newer","Linux","command line tools","Kalman filters","particle filters"],"x-skills-preferred":["inertial navigation algorithms","computer vision techniques","optimising algorithms for compute-constrained systems","CUDA or other hardware acceleration technologies"],"datePosted":"2026-04-17T13:03:53.983Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Dallas, Texas / Boston, MA / San Diego, California / Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++ 11 or newer, Linux, command line tools, Kalman filters, particle filters, inertial navigation algorithms, computer vision techniques, optimising algorithms for compute-constrained systems, CUDA or other hardware acceleration technologies","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":160000,"maxValue":240000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_841c78ea-841"},"title":"Senior Engineer, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.\nThe role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.\nImplement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.\nDevelop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.\nAnalyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.\nOptimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.\nSupport autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.\nValidate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.\nCollaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.\nDrive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_841c78ea-841","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/d6f1d906-5c1e-4640-87f3-3e31e1b45fa6","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$160,000 - $240,000 a year","x-skills-required":["BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience","Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years and a Master’s degree; or 2 years with a PhD; or equivalent work experience","Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models","Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches","Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications","Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs","Proficiency with version control, debugging, and test-driven development in cross-functional teams","Ability to obtain a SECRET clearance"],"x-skills-preferred":["Hands-on integration or algorithm development with airborne sensing systems","Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks","Experience deploying perception software on SWaP-constrained platforms","Familiarity with validating perception systems during flight test events or operational environments","Understanding of sensing challenges in denied or degraded conditions","Exposure to perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:03:46.950Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years and a Master’s degree; or 2 years with a PhD; or equivalent work experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time 
applications, Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":160000,"maxValue":240000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5f911dd8-860"},"title":"Senior Staff Engineer, Software - Perception","description":"<p>This role is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. 
Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>\n<li>Ability to obtain a SECRET clearance</li>\n</ul>\n<p><strong>Preferences:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or 
classification tasks</li>\n<li>Experience deploying perception software on SWaP-constrained platforms</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments</li>\n<li>Understanding of sensing challenges in denied or degraded conditions</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5f911dd8-860","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/5cf8609e-ce9a-47e9-8956-00dae756e406","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$220,800 - $331,200 a year","x-skills-required":["algorithm development","sensor fusion","state estimation","Kalman Filters","multi-target tracking","deep learning-based detection models","probabilistic or rule-based approaches","SLAM","visual-inertial odometry","sensor-fused localization","version control","debugging","test-driven development"],"x-skills-preferred":["hands-on integration with airborne sensing systems","ML frameworks such as PyTorch or Tensorflow","perception software deployment on SWaP-constrained platforms","validating perception systems during flight test events or operational environments","sensing challenges in denied or degraded conditions","perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:03:35.432Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"algorithm development, sensor fusion, state estimation, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, version control, debugging, test-driven development, hands-on integration with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, perception software deployment on SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":220800,"maxValue":331200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_bed4759c-578"},"title":"Staff Engineer, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. 
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Required Qualifications:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>\n<li>Proficiency with version control, 
debugging, and test-driven development in cross-functional teams</li>\n<li>Ability to obtain a SECRET clearance</li>\n</ul>\n<p><strong>Preferred Qualifications:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks</li>\n<li>Experience deploying perception software on SWaP-constrained platforms</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments</li>\n<li>Understanding of sensing challenges in denied or degraded conditions</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms</li>\n</ul>\n<p>$182,720 - $274,080 a year</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_bed4759c-578","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/8739c509-b6ea-4640-bcc1-c8b5b1de31b2","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$182,720 - $274,080 a year","x-skills-required":["real-time object detection","sensor fusion","state estimation algorithms","EO/IR cameras","radars","IMUs","Kalman Filters","multi-target tracking","deep learning-based detection models","probabilistic or rule-based approaches","SLAM","visual-inertial odometry","sensor-fused localization","Interface Control Documents","hardware integration specs","version control","debugging","test-driven development"],"x-skills-preferred":["hands-on integration or algorithm development with airborne sensing systems","ML frameworks such as PyTorch or Tensorflow","vision-based object detection or classification tasks","SWaP-constrained platforms","validating perception systems during flight test events or operational environments","sensing challenges in denied or degraded conditions","perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:02:45.901Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"real-time object detection, sensor fusion, state estimation algorithms, EO/IR cameras, radars, IMUs, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, Interface Control Documents, hardware integration specs, version control, debugging, test-driven development, hands-on integration or algorithm development with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, vision-based object detection or classification tasks, SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground 
platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":182720,"maxValue":274080,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5b9f33df-224"},"title":"Engineer, State Estimation","description":"<p>As a State Estimation Engineer, you will play a critical role on the GNC team, contributing to the development, optimization, and deployment of advanced sensor fusion and navigation algorithms for autonomous UAV operations in dynamic and contested environments. You will help design real-time sensor processing pipelines, integrate multi-sensor data for robust state estimation, and collaborate closely with autonomy researchers, software engineers, and hardware teams to ensure high system performance and reliability.</p>\n<p>Your work will support the transition of cutting-edge research into fielded capabilities, helping Shield AI deliver precision navigation solutions for mission-critical applications.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop and implement real-time state estimation algorithms including inertial navigation, sensor fusion, and alternative navigation methods for GPS-denied or degraded environments.</li>\n<li>Integrate data from IMUs, GNSS receivers, visual odometry, magnetometers, barometers, and radar into robust estimation frameworks.</li>\n<li>Design sensor processing pipelines focused on accuracy, robustness, and system-level fault tolerance.</li>\n<li>Collaborate with autonomy, software, and hardware teams to ensure end-to-end integration of navigation and PNT systems.</li>\n<li>Conduct simulation, lab testing, and field trials to evaluate algorithm performance under real-world conditions.</li>\n<li>Stay current on advancements in state estimation and navigation technologies and help adapt new innovations into deployable solutions.</li>\n</ul>\n<p><strong>Qualifications:</strong></p>\n<ul>\n<li>Typically requires a minimum of 3 years of relevant experience with a bachelor’s degree; or 2 years with a master’s degree; or 1 years with a PhD; or equivalent practical experience.</li>\n<li>Familiarity with algorithms.</li>\n<li>Proficient in C++11 or newer in real-time environments.</li>\n<li>Comfortable working in Linux, with experience using standard command-line tools and scripting.</li>\n<li>Strong written and verbal communication skills with a collaborative mindset.</li>\n<li>Demonstrated success working in fast-paced development cycles and delivering high-quality results.</li>\n</ul>\n<p><strong>Preferred Qualifications:</strong></p>\n<ul>\n<li>Experience developing and deploying real-time navigation or sensor fusion algorithms using IMUs, GPS, or other sensors.</li>\n<li>Strong understanding of filtering and estimation techniques (e.g., Kalman filters, EKF, UKF, particle filters).</li>\n<li>Experience implementing inertial navigation algorithms in degraded or GPS-denied conditions.</li>\n<li>Exposure to visual odometry or computer vision-based navigation approaches.</li>\n<li>Experience optimizing code for performance on compute-constrained platforms.</li>\n<li>Familiarity with CUDA or hardware acceleration techniques (e.g., FPGAs).</li>\n<li>Experience transitioning navigation solutions from research into production environments.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5b9f33df-224","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/133e1006-4bcd-4a31-afaf-c85ad113b749","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$120,000 - $250,000 a year","x-skills-required":["C++11","Linux","standard command-line tools","scripting","algorithms","real-time environments"],"x-skills-preferred":["Kalman filters","EKF","UKF","particle filters","visual odometry","computer vision-based navigation","CUDA","hardware acceleration techniques"],"datePosted":"2026-04-17T13:02:29.847Z","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++11, Linux, standard command-line tools, scripting, algorithms, real-time environments, Kalman filters, EKF, UKF, particle filters, visual odometry, computer vision-based navigation, CUDA, hardware acceleration techniques","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":120000,"maxValue":250000,"unitText":"YEAR"}}}]}