<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>1044b51e-cc6</externalid>
      <Title>Senior Manager, Software - Perception</Title>
      <Description><![CDATA[<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>
<li>Develop advanced perception algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks by integrating data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques.</li>
<li>Develop state estimation capabilities by designing and refining algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs.</li>
<li>Analyze and utilize sensor ICDs to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance by tuning and evaluating perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration by working closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings by leveraging synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams to ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing by contributing novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience.</li>
<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience.</li>
<li>7+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D.</li>
<li>2+ years of people leadership experience.</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>
<li>Ability to obtain a SECRET clearance.</li>
</ul>
<p><strong>Preferences:</strong></p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems.</li>
<li>Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks.</li>
<li>Experience deploying perception software on SWaP-constrained platforms.</li>
<li>Familiarity with validating perception systems during flight test events or operational environments.</li>
<li>Understanding of sensing challenges in denied or degraded conditions.</li>
<li>Exposure to perception applications across air, maritime, and ground platforms.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$229,233 - $343,849 a year</Salaryrange>
      <Skills>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, 10+ years of related experience, 7+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications, Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>229233</Compensationmin>
      <Compensationmax>343849</Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/cebc0dd3-ffbf-4013-a2ad-ae32732cabd3</Applyto>
      <Location>Washington, DC / San Diego, California / Boston, MA</Location>
      <Country>United States</Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>3f0b0cce-7be</externalid>
      <Title>Manager, Software - Perception</Title>
      <Description><![CDATA[<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>We are seeking a skilled and motivated manager to lead technical teams and directly support projects that integrate perception solutions into defense platforms.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land.
Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Multidisciplinary Team Leadership – Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>
<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>
<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>
<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p><strong>Required Qualifications:</strong></p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>
<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>
<li>5+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D</li>
<li>2+ years of people leadership experience</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>
<li>Ability to obtain a SECRET clearance.</li>
</ul>
<p><strong>Preferred Qualifications:</strong></p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems.</li>
<li>Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks.</li>
<li>Experience deploying perception software on SWaP-constrained platforms.</li>
<li>Familiarity with validating perception systems during flight test events or operational environments.</li>
<li>Understanding of sensing challenges in denied or degraded conditions.</li>
<li>Exposure to perception applications across air, maritime, and ground platforms.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$220,441 - $330,661 a year</Salaryrange>
      <Skills>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience, 5+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models., Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>220441</Compensationmin>
      <Compensationmax>330661</Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/1120529c-2f7d-4b27-a29b-50976c49c433</Applyto>
      <Location>Washington, DC</Location>
      <Country>United States</Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>841c78ea-841</externalid>
      <Title>Senior Engineer, Software - Perception</Title>
      <Description><![CDATA[<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>
<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>
<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$160,000 - $240,000 a year</Salaryrange>
      <Skills>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years and a Master’s degree; or 2 years with a PhD; or equivalent work experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications, Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>160000</Compensationmin>
      <Compensationmax>240000</Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/d6f1d906-5c1e-4640-87f3-3e31e1b45fa6</Applyto>
      <Location>San Diego, California / Washington, DC / Boston, MA</Location>
      <Country>United States</Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>5f911dd8-860</externalid>
      <Title>Senior Staff Engineer, Software - Perception</Title>
      <Description><![CDATA[<p>This role is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>
<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>
<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>
<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>
<li>Ability to obtain a SECRET clearance</li>
</ul>
<p><strong>Preferences:</strong></p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems</li>
<li>Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks</li>
<li>Experience deploying perception software on SWaP-constrained platforms</li>
<li>Familiarity with validating perception systems during flight test events or operational environments</li>
<li>Understanding of sensing challenges in denied or degraded conditions</li>
<li>Exposure to perception applications across air, maritime, and ground platforms</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$220,800 - $331,200 a year</Salaryrange>
      <Skills>algorithm development, sensor fusion, state estimation, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, version control, debugging, test-driven development, hands-on integration with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, perception software deployment on SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>220800</Compensationmin>
      <Compensationmax>331200</Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/5cf8609e-ce9a-47e9-8956-00dae756e406</Applyto>
      <Location>San Diego, California / Washington, DC / Boston, MA</Location>
      <Country>United States</Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>bed4759c-578</externalid>
      <Title>Staff Engineer, Software - Perception</Title>
      <Description><![CDATA[<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>
<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>
<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p><strong>Required Qualifications:</strong></p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>
<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>
<li>Ability to obtain a SECRET clearance</li>
</ul>
<p><strong>Preferred Qualifications:</strong></p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems</li>
<li>Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks</li>
<li>Experience deploying perception software on SWaP-constrained platforms</li>
<li>Familiarity with validating perception systems during flight test events or operational environments</li>
<li>Understanding of sensing challenges in denied or degraded conditions</li>
<li>Exposure to perception applications across air, maritime, and ground platforms</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$182,720 - $274,080 a year</Salaryrange>
      <Skills>real-time object detection, sensor fusion, state estimation, EO/IR cameras, radar, IMUs, Kalman filters, multi-target tracking, deep learning detection models, probabilistic and rule-based fusion, SLAM, visual-inertial odometry, sensor-fused localization, Interface Control Documents (ICDs), hardware integration, version control, debugging, test-driven development, airborne sensing systems, PyTorch, TensorFlow, vision-based object detection and classification, SWaP-constrained platforms, flight test validation, denied/degraded-environment sensing, air/maritime/ground perception</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>182720</Compensationmin>
      <Compensationmax>274080</Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/8739c509-b6ea-4640-bcc1-c8b5b1de31b2</Applyto>
      <Location>San Diego, California / Washington, DC / Boston, MA</Location>
      <Country>United States</Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>6d4857ff-459</externalid>
      <Title>Senior Machine Learning Engineer - Generative AI Team</Title>
      <Description><![CDATA[<p>This role exists to join a cross-functional team responsible for taking ML initiatives from early research and ideation through full production integration into live game features.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Research and evaluate machine learning models across areas such as automatic speech recognition, natural language processing, and procedural content generation.</li>
<li>Adapt and optimize models to meet gameplay requirements and platform constraints and integrate them into core game systems.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>Strong programming fundamentals with proficiency in at least one language (C++, Python, Java, or C# preferred).</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$138,700 - $199,900 CAD</Salaryrange>
      <Skills>machine learning, programming fundamentals, C++, Python, Java, C#, PyTorch, TensorFlow, real-time ML model optimization, memory- and compute-constrained environments</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency>CAD</Compensationcurrency>
      <Compensationmin>138700</Compensationmin>
      <Compensationmax>199900</Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Senior-Machine-Learning-Engineer-Generative-AI-Team/212681</Applyto>
      <Location>Vancouver</Location>
      <Country>Canada</Country>
      <Postedate>2026-02-20</Postedate>
    </job>
    <job>
      <externalid>5399cdab-244</externalid>
      <Title>Senior Software Engineer - On Device Machine Learning</Title>
      <Description><![CDATA[<p>We are looking for a Senior Software Engineer with expertise in software optimisation for gaming consoles and CPU/GPU architectures to join our Machine Learning team. You&#39;ll report to a Leader of Engine Development and collaborate with both game and central technology engineers and researchers to bring ML models into the hands of our players by deploying them directly into EA&#39;s games.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Design, build, and maintain robust end-to-end solutions for running machine learning models efficiently on a variety of devices.</li>
<li>Partner with ML experts across EA to help adopt and scale new models and architectures optimised for on-device performance.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>7+ years of hands-on software engineering experience with C++, including expertise in multithreading and low-level/near-hardware optimisations.</li>
<li>Good knowledge of GPU programming.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>C++, GPU programming, PyTorch, TensorFlow, ONNX format</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter. A team where everyone makes play happen.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Senior-Software-Engineer-On-Device-Machine-Learning/212350</Applyto>
      <Location>Guildford, Surrey, United Kingdom</Location>
      <Country>United Kingdom</Country>
      <Postedate>2026-02-05</Postedate>
    </job>
    <job>
      <externalid>c98c4304-e19</externalid>
      <Title>Principal Research Engineer</Title>
      <Description><![CDATA[<p>The Epic Games Research &amp; Development team is searching for an experienced hands-on Research Programmer. The ideal candidate will bring a passion for digital humans and experience with machine learning and animation tech.</p>
<p><strong>What you&#39;ll do</strong></p>
<p>The Research Programmer will be responsible for researching and prototyping machine learning and generative approaches for game content creation, working with development programmers and product programmers to deploy and productize novel technology results in Epic products, and engaging with the scientific community to contribute domain knowledge and advice in support of strategically relevant R&amp;D efforts.</p>
<p><strong>What you need</strong></p>
<ul>
<li>PhD degree in Computer Science, Machine Learning, Mathematics, Programming, or a related discipline</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$270,572 - $396,838 USD (California Base Pay Range)</Salaryrange>
      <Skills>PhD in Computer Science, Machine Learning, Mathematics, or a related discipline, applied research in shipping titles, analytical and reasoning skills, Python, C++, PyTorch, TensorFlow, rapid prototyping in Python, real-time graphics, computer vision, machine learning, animation, large language models, speech recognition, speech synthesis, linguistics, performance capture, 3D reconstruction</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Epic Games</Employername>
      <Employerlogo>https://logos.yubhub.co/epicgames.com.png</Employerlogo>
      <Employerdescription>Epic Games is a leading game development company that creates award-winning games and engine technology. The company is known for its collaborative and welcoming environment, and it prides itself on creating a space for talented individuals to innovate and push the boundaries of game development.</Employerdescription>
      <Employerwebsite>https://www.epicgames.com</Employerwebsite>
      <Compensationcurrency>USD</Compensationcurrency>
      <Compensationmin>270572</Compensationmin>
      <Compensationmax>396838</Compensationmax>
      <Applyto>https://www.epicgames.com/en-US/careers/jobs/5500537004</Applyto>
      <Location>Multiple Locations</Location>
      <Country></Country>
      <Postedate>2026-01-08</Postedate>
    </job>
  </jobs>
</source>