<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>eccf1031-6f3</externalid>
      <Title>Senior Computer Vision Engineer, Space</Title>
      <Description><![CDATA[<p>We are seeking a Senior Computer Vision Engineer to join our rapidly growing team in Washington DC. The ideal candidate will have a strong background in computer vision and machine learning, with experience in developing and implementing computer vision algorithms for various spacecraft efforts in all orbital regimes.</p>
<p>The Senior Computer Vision Engineer will be responsible for proposing and prototyping innovative solutions to solve real-world problems, developing and maintaining core libraries and runtime applications, integrating classical and geometric methods in computer vision with ML methods, and working with space vehicle CV software and hardware subsystems for various spacecraft efforts in all orbital regimes.</p>
<p>The successful candidate will have a Master&#39;s or Ph.D. in Machine Learning, Robotics, or Computer Science, with a strong background in computer vision and machine learning. They will also have experience in one or more of the following: object detection, object tracking, instance segmentation, semantic segmentation, semantic change detection, natural feature tracking (NFT), visual odometry, SLAM, multi-view geometry, structure from motion, 3D geometry, discriminative correlation filters, stereo, neural 3D reconstruction, multi-band sensor processing, RGB-D and LIDAR sensor fusion.</p>
<p>The Senior Computer Vision Engineer will work closely with related teams, including Sensors, GNC, Avionics, Systems, Flight Software, Mission Operations, and Ground Software, to develop and implement computer vision algorithms for various spacecraft efforts in all orbital regimes.</p>
<p>The ideal candidate will have excellent communication and organizational skills, including documentation and training material, and will be able to work effectively in a fast-paced environment with tight deadlines.</p>
<p>The salary range for this role is $191,000-$253,000 USD, and highly competitive equity grants are included in the majority of full-time offers.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$191,000-$253,000 USD</Salaryrange>
      <Skills>Machine Learning, Robotics, Computer Science, Computer Vision, Object Detection, Object Tracking, Instance Segmentation, Semantic Segmentation, Semantic Change Detection, Natural Feature Tracking (NFT), Visual Odometry, SLAM, Multi-view Geometry, Structure from Motion, 3D Geometry, Discriminative Correlation Filters, Stereo, Neural 3D Reconstruction, Multi-band Sensor Processing, RGB-D and LIDAR Sensor Fusion, Matlab, Simulink, Python, Go, C++, Linux systems, OpenCV</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anduril Industries</Employername>
      <Employerlogo>https://logos.yubhub.co/andurilindustries.com.png</Employerlogo>
      <Employerdescription>Anduril Industries is a defense technology company that develops advanced technology for the U.S. and allied military.</Employerdescription>
      <Employerwebsite>https://www.andurilindustries.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/andurilindustries/jobs/5016343007</Applyto>
      <Location>Washington, District of Columbia, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>cb57c7a1-7e6</externalid>
      <Title>Senior Computer Vision Engineer, Space</Title>
      <Description><![CDATA[<p>We are looking for a Senior Computer Vision Engineer to join our rapidly growing team in Costa Mesa, CA. In this role, you will be responsible for working on and understanding the design of all perception subsystems, including but not limited to hardware sensors and advanced processing platforms, navigation algorithms, flight software implementation, subsystem integration &amp; test (I&amp;T), and vehicle I&amp;T.</p>
<p>The computer vision engineering team will work closely with related teams, including Sensors, GNC, Avionics, Systems, Flight Software, Mission Operations, and Ground Software. The Computer Vision Engineer will work on algorithm design, truth and physics modeling, scene rendering, and simulation and analysis for a wide variety of spacecraft and space missions, including but not limited to LEO, MEO, GEO, reentry, and RPOD (Rendezvous Proximity Operations and Docking).</p>
<p>The CV Engineer will help lead successful implementation, validation, and CV operations of Anduril’s fleet of space vehicles. This role is directly tied to ongoing, funded programs within Anduril’s Space Business Line. These programs require building and fielding resilient, software-defined spacecraft systems across numerous mission threads. We work with mission partners and customers to deploy reliable and robust capabilities on operationally relevant fielding timelines to meet complex challenges across the DOD and IC.</p>
<p>The position requires a strong background in computer vision, machine learning, and software development, with experience in developing and implementing computer vision algorithms for space-based applications. The ideal candidate will have a deep understanding of computer vision principles, including object detection, tracking, and recognition, as well as experience with software development in languages such as C++ and Python.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Propose and prototype innovative solutions to solve real-world problems, leveraging the latest state-of-the-art techniques in the field</li>
<li>Develop and maintain core libraries and runtime applications</li>
<li>Integrate classical and geometric methods in computer vision with ML methods</li>
<li>Work on space vehicle CV software and hardware subsystems for various spacecraft efforts in all orbital regimes, and work closely with partners for successful implementation</li>
<li>Develop modern, software-defined approaches to autonomous spacecraft operations with maneuvering capabilities to successfully accomplish mission objectives</li>
<li>Develop appropriate test plans and procedures to validate the CV system during ground checkout, on-orbit commissioning and operations</li>
<li>Collaborate across multiple teams to plan, build, and test complex functionality</li>
<li>Coordinate with end-users, other operators and customers to turn needs into features while balancing user experience with engineering constraints</li>
<li>Support challenging schedules during ground testing, launch windows and on-orbit operations of the spacecraft systems</li>
<li>Design of flight software and firmware, algorithms, and simulation products</li>
<li>Development of space vehicle autonomy tools for dynamic space operations</li>
<li>Test process development and execution</li>
<li>Define automated fault detection and responses</li>
<li>Provide hardware-in-the-loop and Monte Carlo simulation capabilities</li>
</ul>
<p>Required Qualifications:</p>
<ul>
<li>MS or PhD in Machine Learning, Robotics, Computer Science, or Image Science, with an emphasis on Computer Vision</li>
<li>BS in Computer Science, Machine Learning, Electrical Engineering, or related field</li>
<li>Advanced professional experience developing and benchmarking ML algorithms on large-scale datasets</li>
<li>High proficiency in C++ development in a Linux environment</li>
<li>Experience in one or more of the following: Object Detection, Object Tracking, Instance Segmentation, Semantic Segmentation, Semantic Change Detection, Natural Feature Tracking (NFT)</li>
<li>Experience in one or more of the following: Visual Odometry, SLAM, Multi-view Geometry, Structure from Motion, 3D Geometry, Discriminative Correlation Filters, Stereo, Neural 3D Reconstruction, Multi-band sensor processing, RGB-D and LIDAR Sensor Fusion</li>
<li>Ability to quickly understand and navigate complex systems and detailed requirements</li>
<li>Familiarity with terminal guidance, rendezvous proximity operations and docking, orbital mechanics with propulsive spacecraft, and/or spacecraft/missile GNC</li>
<li>Clear communication and organizational skills including documentation and training material</li>
<li>Currently possesses and is able to maintain an active U.S. Top Secret security clearance</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Experience with Matlab, Simulink, Python, Go, C++ and/or Linux systems</li>
<li>A desire to work on critical software and hardware designs in the space domain</li>
<li>Strong preference for candidates with experience in computer vision, vision navigation, image processing, feature tracking, SLAM, OpenCV, and NFT</li>
<li>Experience testing CV subsystems in laboratory environments that mimic the space environmental constraints</li>
<li>Experience with orbital mechanics and resident space object tracking capabilities</li>
<li>Experience conducting spacecraft operations and satellite command and control with an emphasis on system reliability and uptime</li>
<li>Experience with testing/validation leveraging FlatSats, Hardware-in-the-Loop testbeds and digital spacecraft simulators through nominal and fault scenarios</li>
<li>Experience with computer vision and perception algorithms to support GNC operations</li>
<li>Experience developing 3-DOF simplified and 6-DOF high-fidelity dynamics simulation models used for GNC systems analysis and validation</li>
<li>Exposure to US satellite operations policy and constraints for relevant mission threads in all orbits</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$191,000-$253,000 USD</Salaryrange>
      <Skills>Machine Learning, Robotics, Computer Science, Image Science, C++, Python, Object Detection, Object Tracking, Instance Segmentation, Semantic Segmentation, Semantic Change Detection, Natural Feature Tracking (NFT), Visual Odometry, SLAM, Multi-view Geometry, Structure from Motion, 3D Geometry, Discriminative Correlation Filters, Stereo, Neural 3D Reconstruction, Multi-band sensor processing, RGB-D and LIDAR Sensor Fusion, Matlab, Simulink, Go, Linux systems, Computer vision, Vision navigation, Image processing, Feature tracking, OpenCV</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anduril Industries</Employername>
      <Employerlogo>https://logos.yubhub.co/anduril.com.png</Employerlogo>
      <Employerdescription>Anduril Industries is a defense technology company that develops advanced technology for the U.S. and allied military.</Employerdescription>
      <Employerwebsite>https://www.anduril.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/andurilindustries/jobs/5016340007</Applyto>
      <Location>Costa Mesa, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>f82f685c-734</externalid>
      <Title>Guidance Navigation and Control (GNC) Engineer, Air Dominance &amp; Strike - Advanced Effects</Title>
      <Description><![CDATA[<p>Join our Flight Engineering team advancing the Air Dominance and Strike division&#39;s portfolio, where we push the boundaries of aerospace technology. As a critical member of our team, the role of a Guidance, Navigation &amp; Control (GNC) Engineer is pivotal in driving the design, development, testing, and implementation of cutting-edge flight system technologies for our aerospace systems.</p>
<p>We are seeking exceptional GNC Engineers to shape the next generation of advanced tactical missile systems, strategic weapons, and hypersonic strike and defense systems. As a key contributor, you will develop, test, and integrate cutting-edge GNC architectures for subsonic, supersonic, and hypersonic autonomous systems, driving innovation in mission-critical capabilities.</p>
<p>Responsibilities:</p>
<ul>
<li>Develop, implement, and test algorithms for guidance, navigation, and control systems in line with project requirements.</li>
<li>Design, implement, and assess the GNC architectures considering all elements, including sensors, actuators, and algorithms.</li>
<li>Participate in all phases of system development, from concept to design, prototyping, testing, and operation.</li>
<li>Collaborate closely with a multidisciplinary team including systems engineers, flight software engineers, and test engineers to ensure system-wide integration and performance.</li>
<li>Contribute to design reviews, safety assessments, and system documentation.</li>
<li>Use simulation and analysis tools to verify GNC models and system performance under different operating conditions.</li>
<li>Provide system engineering expertise in the development of GNC systems, ensuring a cohesive and efficient approach to the design and implementation process.</li>
<li>Troubleshoot and resolve issues related to GNC systems during design, testing, and operational phases.</li>
<li>Keep updated with the latest technologies and trends in GNC engineering and system architecture design.</li>
</ul>
<p>Required Qualifications:</p>
<ul>
<li>BS in Robotics, Computer Science, Electrical, Mechanical, Mechatronics, Aerospace Engineering or related field with focus on dynamics and control</li>
<li>A strong theoretical knowledge of flight mechanics, classical and modern control theory, estimation, and filtering techniques.</li>
<li>Experience with developing GNC solutions for flight systems</li>
<li>Proficiency with GNC simulation tools like MATLAB, Simulink and related toolboxes</li>
<li>Familiarity with 6-DOF simulations</li>
<li>Must be eligible to obtain and maintain a U.S. Top Secret clearance</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>A MS or Ph.D. in Aerospace Engineering, Mechanical Engineering, Electrical Engineering, or a related field is a plus</li>
<li>Strong understanding of flight mechanics and related physics.</li>
<li>Practical experience developing GNC designs for high speed flight systems and re-entry vehicles.</li>
<li>Practical experience with Explicit Model Following, Dynamic Inversion, and Adaptive Control techniques.</li>
<li>Extensive experience in GNC system and architecture development and testing, preferably in the aerospace industry.</li>
<li>Experience generating C or C++ code from Simulink models.</li>
<li>Experience writing code in C/C++</li>
<li>Strong analytical, problem-solving, and decision-making skills.</li>
<li>Excellent verbal and written communication skills.</li>
<li>Ability to work independently and in a team-oriented, collaborative environment.</li>
<li>Familiarity with agile and Lean product development methodologies such as Scrum.</li>
<li>Experience using Jira, Git, GitHub, or GitLab</li>
<li>Understanding of and compliance with industry, safety, and quality standards appropriate for the application</li>
<li>Passion for defending the United States and its allies</li>
<li>Current Active U.S. Top Secret clearance preferred</li>
</ul>
<p>Salary Range:</p>
<p>The salary range for this role is an estimate based on a wide range of compensation factors, inclusive of base salary only. Actual salary offer may vary based on (but not limited to) work experience, education and/or training, critical skills, and/or business considerations. Highly competitive equity grants are included in the majority of full time offers; and are considered part of Anduril&#39;s total compensation package.</p>
<p>Benefits:</p>
<p>Anduril offers top-tier benefits for full-time employees, including:</p>
<ul>
<li>Healthcare Benefits - US Roles: Comprehensive medical, dental, and vision plans at little to no cost to you.</li>
<li>UK &amp; AUS Roles: We cover full cost of medical insurance premiums for you and your dependents.</li>
<li>IE Roles: We offer an annual contribution toward your private health insurance for you and your dependents.</li>
<li>Income Protection: Anduril covers life and disability insurance for all employees.</li>
<li>Generous time off: Highly competitive PTO plans with a holiday hiatus in December.</li>
<li>Caregiver &amp; Wellness Leave is available to care for family members, bond with a new baby, or address your own medical needs.</li>
<li>Family Planning &amp; Parenting Support: Coverage for fertility treatments (e.g., IVF, preservation), adoption, and gestational carriers, along with resources to support you and your partner from planning to parenting.</li>
<li>Mental Health Resources: Access free mental health resources 24/7, including therapy and life coaching.</li>
<li>Additional work-life services, such as legal and financial support, are also available.</li>
<li>Professional Development: Annual reimbursement for professional development.</li>
<li>Commuter Benefits: Company-funded commuter benefits based on your region.</li>
<li>Relocation Assistance: Available depending on role eligibility.</li>
<li>Retirement Savings Plan - US Roles: Traditional 401(k), Roth, and after-tax (mega backdoor Roth) options.</li>
<li>UK &amp; IE Roles: Pension plan with employer match.</li>
<li>AUS Roles: Superannuation plan.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$166,000-$220,000 USD</Salaryrange>
      <Skills>BS in Robotics, Computer Science, Electrical, Mechanical, Mechatronics, Aerospace Engineering or related field with focus on dynamics and control, A strong theoretical knowledge of flight mechanics, classical and modern control theory, estimation, and filtering techniques., Experience with developing GNC solutions for flight systems, Proficiency with GNC simulation tools like MATLAB, Simulink and related toolboxes, Familiarization with 6-DOF simulations, A MS or Ph.D. in Aerospace Engineering, Mechanical Engineering, Electrical Engineering, or a related field is a plus, Strong understanding of flight mechanics and related physics., Practical experience developing GNC designs for high speed flight systems and re-entry vehicles., Practical experience with Explicit Model Following, Dynamic Inversion, and Adaptive Control techniques., Extensive experience in GNC system and architecture development and testing, preferably in the aerospace industry.</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anduril Industries</Employername>
      <Employerlogo>https://logos.yubhub.co/andurilindustries.com.png</Employerlogo>
      <Employerdescription>Anduril Industries is a defense technology company that develops advanced technology for the U.S. and allied military.</Employerdescription>
      <Employerwebsite>https://www.andurilindustries.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/andurilindustries/jobs/5024955007</Applyto>
      <Location>Costa Mesa, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>78d317b2-c2a</externalid>
      <Title>Guidance Navigation and Control (GNC) Engineer, Air Dominance &amp; Strike</Title>
      <Description><![CDATA[<p>We are seeking a Guidance, Navigation and Control (GNC) Engineer to join our team in Costa Mesa, California. As a GNC Engineer, you will be responsible for developing, implementing, and testing algorithms for guidance, navigation, and control systems in line with project requirements. You will design, implement, and assess the GNC architectures considering all elements, including spacecraft, sensors, actuators, and algorithms. You will use MDAO techniques for the design and optimization of GNC systems, balancing performance, cost, and risk factors.</p>
<p>You will participate in all phases of system development, from concept to design, prototyping, testing, and operation. You will collaborate closely with a multidisciplinary team including systems engineers, flight software engineers, and test engineers to ensure system-wide integration and performance. You will contribute to design reviews, safety assessments, and system documentation. You will use simulation and analysis tools to verify GNC models and system performance under different operating conditions.</p>
<p>You will provide system engineering expertise in the development of GNC systems, ensuring a cohesive and efficient approach to the design and implementation process. You will troubleshoot and resolve issues related to GNC systems during design, testing, and operational phases. You will keep updated with the latest technologies and trends in GNC engineering and system architecture design.</p>
<p>Required qualifications include a BS in Robotics, Computer Science, Electrical, Mechanical, Mechatronics, Aerospace Engineering or related field with focus on dynamics and control, 3 or more years of experience in Guidance, Navigation &amp; Control engineering or related technical experience, and a strong theoretical knowledge of classical and modern control theory, estimation, and filtering techniques.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$129,000-$220,000 USD</Salaryrange>
      <Skills>BS in Robotics, Computer Science, Electrical, Mechanical, Mechatronics, Aerospace Engineering or related field with focus on dynamics and control, 3 or more years of experience in Guidance, Navigation &amp; Control engineering or related technical experience, Strong theoretical knowledge of classical and modern control theory, estimation, and filtering techniques, Proficiency with GNC simulation tools like MATLAB, Simulink and related toolboxes, Familiarity with 6-DOF simulations, A Ph.D. in Aerospace Engineering, Mechanical Engineering, Electrical Engineering, or a related field, Strong understanding of flight mechanics and related physics, Practical experience developing GNC designs for flight systems, Extensive experience in GNC system and architecture development and testing, preferably in the aerospace industry, Experience generating C or C++ code from Simulink models</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anduril Industries</Employername>
      <Employerlogo>https://logos.yubhub.co/andurilindustries.com.png</Employerlogo>
      <Employerdescription>Anduril Industries is a defense technology company that develops advanced technology for the U.S. and allied military.</Employerdescription>
      <Employerwebsite>https://www.andurilindustries.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/andurilindustries/jobs/4904013007</Applyto>
      <Location>Costa Mesa, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8fa7eb38-b7e</externalid>
      <Title>Guidance, Navigation and Control (GNC) Engineer -  Tactical Reconnaissance &amp; Strike</Title>
      <Description><![CDATA[<p>We are seeking a Guidance, Navigation and Control (GNC) Engineer to join our Tactical Recon &amp; Strike team. As a GNC Engineer, you will be responsible for developing guidance algorithms and state estimators for group 1-3 UAV platforms. You will work closely with the flight software and computer vision team to define requirements, develop GNC software and validate functionality in simulation on aggressive timelines.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Develop guidance algorithms (terminal and midcourse) and state estimators for group 1-3 UAV platforms</li>
<li>Work closely with the flight software and computer vision team to define requirements, develop GNC software and validate functionality in simulation on aggressive timelines</li>
<li>Review test data captured from the flight control system, subsystems, and other test instrumentation to verify vehicle performance, evaluate GNC algorithm/controller behavior, or debug incidents during testing or from fielded assets</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Bachelor&#39;s degree in Robotics, Mechanical Engineering, Electrical Engineering, Aerospace Engineering, or a related field with a focus on dynamic systems and control</li>
<li>3+ years professional experience in a GNC role, preferably focusing on missile or weapons systems</li>
<li>Experience with tactical guidance algorithms and target state estimation development</li>
<li>Experience with state estimation and filtering (Kalman filtering, sensor fusion, complementary filters, etc.)</li>
<li>Experience in modeling and simulation of linear and nonlinear dynamic systems and model linearization</li>
<li>Experience coding in Matlab and Simulink</li>
<li>C/C++ Proficiency</li>
<li>Eligible to obtain and maintain an active U.S. Secret security clearance</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Master&#39;s Degree or PhD in Robotics, Mechanical Engineering, Electrical Engineering, Aerospace Engineering, or a related field with a focus on dynamic systems and control</li>
<li>5+ years professional experience in a GNC role, preferably focusing on missile or weapons systems</li>
<li>Experience with flight test and debugging GNC on UAV platforms</li>
<li>Experience with GNC design and analysis for fixed wing, rotary wing aircraft and/or tactical missile systems</li>
<li>Experience with seeker integration</li>
<li>Experience in one or more of the following: VIO, TERCOM, gimbal mount models, detection and tracking, LWIR/EO sensors, RADAR, target motion models</li>
<li>Experience with Simulink Embedded Coder</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$166,000-$220,000 USD</Salaryrange>
      <Skills>Matlab, Simulink, C/C++, Kalman filtering, Sensor fusion, Complementary filters, Model linearization, Tactical guidance algorithms, Target state estimation development, State estimation and filtering, VIO, TERCOM, Gimbal mount models, Detection and tracking, LWIR/EO sensors, RADAR, Target motion models, Simulink Embedded Coder</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anduril Industries</Employername>
      <Employerlogo>https://logos.yubhub.co/andurilindustries.com.png</Employerlogo>
      <Employerdescription>Anduril Industries is a defense technology company that transforms U.S. and allied military capabilities with advanced technology.</Employerdescription>
      <Employerwebsite>https://www.andurilindustries.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/andurilindustries/jobs/5060107007</Applyto>
      <Location>Costa Mesa, California, United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>baf3d89c-84b</externalid>
      <Title>Senior Manager, Perception</Title>
      <Description><![CDATA[<p>As a member of the HMS Perception team, you will conduct software development at the intersection of classical state estimation techniques, sensor fusion, artificial intelligence, machine learning, and machine perception. You will develop cutting-edge technology onto real hardware that provides robust and accurate estimates of vehicle pose and surroundings for real missions.</p>
<p>Shield AI is pushing the envelope by applying advanced AI solutions to real hardware systems. An ideal candidate should aspire to be a part of this industry-changing team developing and deploying advanced technology that can truly make an impact.</p>
<p>We are seeking a skilled and motivated leader with 10+ years of experience to manage a technical team supporting the development, integration, and testing of perception algorithms for advanced aerospace, defense, and robotics systems. In this role, you will contribute to implementing and integrating innovative perception solutions while collaborating with a multidisciplinary team of engineers to meet challenging operational requirements.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution. Balance hands-on technical oversight with performance optimization, innovation, and clear stakeholder communication.</li>
<li>Write production-quality software in C++</li>
<li>Produce an Assured Position, Navigation, and Timing (A-PNT) system to enable reliable autonomy in GNSS-degraded or denied environments</li>
<li>Extend and specialize Shield AI’s state-of-the-art state estimation framework for new sensors, platforms, and missions</li>
<li>Write test code to validate your software with simulated and real-world data</li>
<li>Collaborate with hardware and test teams to validate algorithms/code on aerial platforms</li>
<li>Write analyzers to ingest data and produce statistics to validate code quality</li>
<li>Enhance sensor models within a high-fidelity simulation environment</li>
<li>Work in a fast-paced, collaborative, continuous development environment, enhancing analysis and benchmarking capabilities</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$229,233-$343,849 USD</Salaryrange>
      <Skills>C++, Sensor fusion, Artificial intelligence, Machine learning, Machine perception, Kalman Filter, Factor Graphs, Computer Vision, OpenCV, Unix environments, Robotics technologies, Unmanned system technologies, High-fidelity simulation, Sensor modeling</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015 with a mission to protect service members and civilians with intelligent systems.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/501e3703-1a63-4773-b961-6029e5fb71d6</Applyto>
      <Location>San Diego, California / Washington, DC</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>98550091-2de</externalid>
      <Title>Staff Engineer, State Estimation</Title>
      <Description><![CDATA[<p>As a Staff State Estimation Engineer, you will play a critical role on the GNC team, contributing to the development, optimisation, and deployment of advanced sensor fusion and navigation algorithms for autonomous UAV operations in dynamic and contested environments.</p>
<p>Your work will support the transition of cutting-edge research into fielded capabilities, helping Shield AI deliver precision navigation solutions for mission-critical applications.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop and implement real-time state estimation algorithms including inertial navigation, sensor fusion, and alternative navigation methods for GPS-denied or degraded environments.</li>
<li>Integrate data from IMUs, GNSS receivers, visual odometry, magnetometers, barometers, and radar into robust estimation frameworks.</li>
<li>Design sensor processing pipelines focused on accuracy, robustness, and system-level fault tolerance.</li>
<li>Collaborate with autonomy, software, and hardware teams to ensure end-to-end integration of navigation and PNT systems.</li>
<li>Conduct simulation, lab testing, and field trials to evaluate algorithm performance under real-world conditions.</li>
<li>Stay current on advancements in state estimation and navigation technologies and help adapt new innovations into deployable solutions.</li>
</ul>
<p><strong>Qualifications:</strong></p>
<ul>
<li>Typically requires a minimum of 7 years of relevant experience with a bachelor’s degree; or 6 years with a master’s degree; or 4 years with a PhD; or equivalent practical experience.</li>
<li>Experience developing and deploying real-time navigation or sensor fusion algorithms using IMUs, GPS, or other sensors.</li>
<li>Strong understanding of filtering and estimation techniques (e.g., Kalman filters, EKF, UKF, particle filters).</li>
<li>Proficient in C++11 or newer in real-time environments.</li>
<li>Comfortable working in Linux, with experience using standard command-line tools and scripting.</li>
<li>Strong written and verbal communication skills with a collaborative mindset.</li>
<li>Demonstrated success working in fast-paced development cycles and delivering high-quality results.</li>
</ul>
<p><strong>Salary:</strong></p>
<p>$187,531 - $281,297 a year</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$187,531 - $281,297 a year</Salaryrange>
      <Skills>state estimation, sensor fusion, inertial navigation, Kalman filters, C++11, Linux, visual odometry, computer vision, CUDA, hardware acceleration</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015 with a mission to protect service members and civilians with intelligent systems.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/f8849287-b9ff-4c3e-a37f-be20e39c597b</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>3e1eaa7a-031</externalid>
      <Title>Engineer, State Estimation</Title>
      <Description><![CDATA[<p>As a State Estimation Engineer, you will play a critical role on the GNC team, contributing to the development, optimisation, and deployment of advanced sensor fusion and navigation algorithms for autonomous UAV operations in dynamic and contested environments.</p>
<p>Your work will support the transition of cutting-edge research into fielded capabilities, helping Shield AI deliver precision navigation solutions for mission-critical applications.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop and implement real-time state estimation algorithms including inertial navigation, sensor fusion, and alternative navigation methods for GPS-denied or degraded environments.</li>
</ul>
<ul>
<li>Integrate data from IMUs, GNSS receivers, visual odometry, magnetometers, barometers, and radar into robust estimation frameworks.</li>
</ul>
<ul>
<li>Design sensor processing pipelines focused on accuracy, robustness, and system-level fault tolerance.</li>
</ul>
<ul>
<li>Collaborate with autonomy, software, and hardware teams to ensure end-to-end integration of navigation and PNT systems.</li>
</ul>
<ul>
<li>Conduct simulation, lab testing, and field trials to evaluate algorithm performance under real-world conditions.</li>
</ul>
<ul>
<li>Stay current on advancements in state estimation and navigation technologies and help adapt new innovations into deployable solutions.</li>
</ul>
<p><strong>Qualifications:</strong></p>
<ul>
<li>Typically requires a minimum of 3 years of relevant experience with a bachelor’s degree; or 2 years with a master’s degree; or 1 years with a PhD; or equivalent practical experience.</li>
</ul>
<ul>
<li>Familiarity with algorithms.</li>
</ul>
<ul>
<li>Proficient in C++11 or newer in real-time environments.</li>
</ul>
<ul>
<li>Comfortable working in Linux, with experience using standard command-line tools and scripting.</li>
</ul>
<ul>
<li>Strong written and verbal communication skills with a collaborative mindset.</li>
</ul>
<ul>
<li>Demonstrated success working in fast-paced development cycles and delivering high-quality results.</li>
</ul>
<p><strong>Preferred Qualifications:</strong></p>
<ul>
<li>Experience developing and deploying real-time navigation or sensor fusion algorithms using IMUs, GPS, or other sensors.</li>
</ul>
<ul>
<li>Strong understanding of filtering and estimation techniques (e.g., Kalman filters, EKF, UKF, particle filters).</li>
</ul>
<ul>
<li>Experience implementing inertial navigation algorithms in degraded or GPS-denied conditions.</li>
</ul>
<ul>
<li>Exposure to visual odometry or computer vision-based navigation approaches.</li>
</ul>
<ul>
<li>Experience optimising code for performance on compute-constrained platforms.</li>
</ul>
<ul>
<li>Familiarity with CUDA or hardware acceleration techniques (e.g., FPGAs).</li>
</ul>
<ul>
<li>Experience transitioning navigation solutions from research into production environments.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$120,000 - $250,000 a year</Salaryrange>
      <Skills>C++11, Linux, real-time environments, algorithmic thinking, strong written and verbal communication skills, Kalman filters, EKF, UKF, particle filters, visual odometry, computer vision-based navigation approaches</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/133ad6aa-d624-4fad-b1cc-1f8f42d0401f</Applyto>
      <Location>Dallas/San Diego/Boston/DC/San Fran</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>1044b51e-cc6</externalid>
      <Title>Senior Manager, Software - Perception</Title>
      <Description><![CDATA[<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>
<li>Develop advanced perception algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks by integrating data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques.</li>
<li>Develop state estimation capabilities by designing and refining algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs.</li>
<li>Analyze and utilize sensor ICDs to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance by tuning and evaluating perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration by working closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings by leveraging synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams to ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing by contributing novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience.</li>
<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience.</li>
<li>7+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D.</li>
<li>2+ years of people leadership experience.</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>
<li>Ability to obtain a SECRET clearance.</li>
</ul>
<p><strong>Preferences:</strong></p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems.</li>
<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks.</li>
<li>Experience deploying perception software on SWaP-constrained platforms.</li>
<li>Familiarity with validating perception systems during flight test events or operational environments.</li>
<li>Understanding of sensing challenges in denied or degraded conditions.</li>
<li>Exposure to perception applications across air, maritime, and ground platforms.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$229,233 - $343,849 a year</Salaryrange>
      <Skills>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, 10+ years of related experience, 7+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications, Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/cebc0dd3-ffbf-4013-a2ad-ae32732cabd3</Applyto>
      <Location>Washington, DC / San Diego, California / Boston, MA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>3f0b0cce-7be</externalid>
      <Title>Manager, Software - Perception</Title>
      <Description><![CDATA[<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>We are seeking a skilled and motivated manager to lead technical teams and support direct projects integrating perception solutions for defense platforms.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land.
Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p>Responsibilities:</p>
<ul>
<li>Multidisciplinary Team Leadership – Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>
<li>Develop advanced perception algorithms , Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks , Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>
<li>Develop state estimation capabilities , Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>
<li>Analyze and utilize sensor ICDs , Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance , Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration , Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings , Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams , Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing , Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel Requirement , Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p>Required Qualifications:</p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>
<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>
<li>5+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D</li>
<li>2+ years of people leadership experience</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>
<li>Ability to obtain a SECRET clearance.</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems.</li>
<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks.</li>
<li>Experience deploying perception software on SWaP-constrained platforms.</li>
<li>Familiarity with validating perception systems during flight test events or operational environments.</li>
<li>Understanding of sensing challenges in denied or degraded conditions.</li>
<li>Exposure to perception applications across air, maritime, and ground platforms.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$220,441 - $330,661 a year</Salaryrange>
      <Skills>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience, 5+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models., Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/1120529c-2f7d-4b27-a29b-50976c49c433</Applyto>
      <Location>Washington, DC</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>7da005da-ff5</externalid>
      <Title>Senior Engineer, State Estimation</Title>
      <Description><![CDATA[<p>As a Senior Engineer, State Estimation, you will work on the GNC team to develop and optimise algorithms that process and fuse data from various sensors to provide accurate, reliable state estimates, enabling the X-BAT to operate autonomously in complex and contested environments.</p>
<p>Your key responsibilities will include:</p>
<p>Developing and implementing advanced sensor algorithms for processing data from IMUs, radar, cameras, GPS, and other sensors.
Enhancing state estimation algorithms by integrating multi-sensor data for improved accuracy and robustness.
Designing and implementing real-time sensor data processing pipelines.
Collaborating with cross-functional teams, including software engineers, autonomy researchers, and hardware engineers, to ensure seamless integration of state estimation algorithms.
Conducting experiments and field tests to validate the performance of state estimation algorithms in real-world scenarios.
Staying updated with the latest advancements in sensor technologies and state estimation, applying them to our systems.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$160,000 - $240,000 a year</Salaryrange>
      <Skills>C++ 11 or newer, Linux, command line tools, Kalman filters, particle filters, inertial navigation algorithms, computer vision techniques, optimising algorithms for compute-constrained systems, CUDA or other hardware acceleration technologies</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed defence technology company founded in 2015, with offices across the U.S., Europe, the Middle East, and the Asia-Pacific.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/0c6acdd5-a39b-4ad3-84fa-b1a1f83409d3</Applyto>
      <Location>Dallas, Texas / Boston, MA / San Diego, California / Washington, DC</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>841c78ea-841</externalid>
      <Title>Senior Engineer, Software - Perception</Title>
      <Description><![CDATA[<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>Develop advanced perception algorithms , Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.
Implement sensor fusion frameworks , Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.
Develop state estimation capabilities , Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.
Analyze and utilize sensor ICDs , Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.
Optimize perception performance , Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.
Support autonomy integration , Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.
Validate in simulated and operational settings , Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.
Collaborate with hardware and sensor teams , Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.
Drive innovation in airborne sensing , Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$160,000 - $240,000 a year</Salaryrange>
      <Skills>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years and a Master’s degree; or 2 years with a PhD; or equivalent work experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications, Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company that develops intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/d6f1d906-5c1e-4640-87f3-3e31e1b45fa6</Applyto>
      <Location>San Diego, California / Washington, DC / Boston, MA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>5f911dd8-860</externalid>
      <Title>Senior Staff Engineer, Software - Perception</Title>
      <Description><![CDATA[<p>This role is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop advanced perception algorithms: Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks: Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>
<li>Develop state estimation capabilities: Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>
<li>Analyze and utilize sensor ICDs: Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance: Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration: Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings: Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams: Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing: Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel Requirement: Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>
<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>
<li>Ability to obtain a SECRET clearance</li>
</ul>
<p><strong>Preferences:</strong></p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems</li>
<li>Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks</li>
<li>Experience deploying perception software on SWaP-constrained platforms</li>
<li>Familiarity with validating perception systems during flight test events or operational environments</li>
<li>Understanding of sensing challenges in denied or degraded conditions</li>
<li>Exposure to perception applications across air, maritime, and ground platforms</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$220,800 - $331,200 a year</Salaryrange>
      <Skills>algorithm development, sensor fusion, state estimation, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, version control, debugging, test-driven development, hands-on integration with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, perception software deployment on SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/5cf8609e-ce9a-47e9-8956-00dae756e406</Applyto>
      <Location>San Diego, California / Washington, DC / Boston, MA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>bed4759c-578</externalid>
      <Title>Staff Engineer, Software - Perception</Title>
      <Description><![CDATA[<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>
<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>
<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop advanced perception algorithms: Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>
<li>Implement sensor fusion frameworks: Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>
<li>Develop state estimation capabilities: Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>
<li>Analyze and utilize sensor ICDs: Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>
<li>Optimize perception performance: Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>
<li>Support autonomy integration: Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>
<li>Validate in simulated and operational settings: Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>
<li>Collaborate with hardware and sensor teams: Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>
<li>Drive innovation in airborne sensing: Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>
<li>Travel Requirement: Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>
</ul>
<p><strong>Required Qualifications:</strong></p>
<ul>
<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>
<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>
<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>
<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>
<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>
<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>
<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>
<li>Ability to obtain a SECRET clearance</li>
</ul>
<p><strong>Preferred Qualifications:</strong></p>
<ul>
<li>Hands-on integration or algorithm development with airborne sensing systems</li>
<li>Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks</li>
<li>Experience deploying perception software on SWaP-constrained platforms</li>
<li>Familiarity with validating perception systems during flight test events or operational environments</li>
<li>Understanding of sensing challenges in denied or degraded conditions</li>
<li>Exposure to perception applications across air, maritime, and ground platforms</li>
</ul>
<p>$182,720 - $274,080 a year</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$182,720 - $274,080 a year</Salaryrange>
      <Skills>real-time object detection, sensor fusion, state estimation algorithms, EO/IR cameras, radars, IMUs, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, Interface Control Documents, hardware integration specs, version control, debugging, test-driven development, hands-on integration or algorithm development with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, vision-based object detection or classification tasks, SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015, developing intelligent systems to protect service members and civilians.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/8739c509-b6ea-4640-bcc1-c8b5b1de31b2</Applyto>
      <Location>San Diego, California / Washington, DC / Boston, MA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>5b9f33df-224</externalid>
      <Title>Engineer, State Estimation</Title>
      <Description><![CDATA[<p>As a State Estimation Engineer, you will play a critical role on the GNC team, contributing to the development, optimization, and deployment of advanced sensor fusion and navigation algorithms for autonomous UAV operations in dynamic and contested environments. You will help design real-time sensor processing pipelines, integrate multi-sensor data for robust state estimation, and collaborate closely with autonomy researchers, software engineers, and hardware teams to ensure high system performance and reliability.</p>
<p>Your work will support the transition of cutting-edge research into fielded capabilities, helping Shield AI deliver precision navigation solutions for mission-critical applications.</p>
<p><strong>Responsibilities:</strong></p>
<ul>
<li>Develop and implement real-time state estimation algorithms including inertial navigation, sensor fusion, and alternative navigation methods for GPS-denied or degraded environments.</li>
<li>Integrate data from IMUs, GNSS receivers, visual odometry, magnetometers, barometers, and radar into robust estimation frameworks.</li>
<li>Design sensor processing pipelines focused on accuracy, robustness, and system-level fault tolerance.</li>
<li>Collaborate with autonomy, software, and hardware teams to ensure end-to-end integration of navigation and PNT systems.</li>
<li>Conduct simulation, lab testing, and field trials to evaluate algorithm performance under real-world conditions.</li>
<li>Stay current on advancements in state estimation and navigation technologies and help adapt new innovations into deployable solutions.</li>
</ul>
<p><strong>Qualifications:</strong></p>
<ul>
<li>Typically requires a minimum of 3 years of relevant experience with a bachelor’s degree; or 2 years with a master’s degree; or 1 year with a PhD; or equivalent practical experience.</li>
<li>Familiarity with algorithms.</li>
<li>Proficient in C++11 or newer in real-time environments.</li>
<li>Comfortable working in Linux, with experience using standard command-line tools and scripting.</li>
<li>Strong written and verbal communication skills with a collaborative mindset.</li>
<li>Demonstrated success working in fast-paced development cycles and delivering high-quality results.</li>
</ul>
<p><strong>Preferred Qualifications:</strong></p>
<ul>
<li>Experience developing and deploying real-time navigation or sensor fusion algorithms using IMUs, GPS, or other sensors.</li>
<li>Strong understanding of filtering and estimation techniques (e.g., Kalman filters, EKF, UKF, particle filters).</li>
<li>Experience implementing inertial navigation algorithms in degraded or GPS-denied conditions.</li>
<li>Exposure to visual odometry or computer vision-based navigation approaches.</li>
<li>Experience optimizing code for performance on compute-constrained platforms.</li>
<li>Familiarity with CUDA or hardware acceleration techniques (e.g., FPGAs).</li>
<li>Experience transitioning navigation solutions from research into production environments.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$120,000 - $250,000 a year</Salaryrange>
      <Skills>C++11, Linux, standard command-line tools, scripting, algorithms, real-time environments, Kalman filters, EKF, UKF, particle filters, visual odometry, computer vision-based navigation, CUDA, hardware acceleration techniques</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Shield AI</Employername>
      <Employerlogo>https://logos.yubhub.co/shield.ai.png</Employerlogo>
      <Employerdescription>Shield AI is a venture-backed deep-tech company founded in 2015 with a mission to protect service members and civilians with intelligent systems.</Employerdescription>
      <Employerwebsite>https://www.shield.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/shieldai/133e1006-4bcd-4a31-afaf-c85ad113b749</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>62efca6f-b6f</externalid>
      <Title>Senior AI Engineer</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Senior AI Engineer who is obsessed with building AI systems that actually work in production: reliable, observable, cost-efficient, and genuinely useful. This is not a research role. You will ship AI-powered features that process real financial data for real businesses.</p>
<p><strong>LLM &amp; AI Pipeline Engineering</strong></p>
<ul>
<li>Design, build, and maintain production-grade LLM integration pipelines, including retrieval-augmented generation (RAG), prompt engineering, output parsing, and chain orchestration.</li>
<li>Develop and operate AI features within Jeeves&#39;s core financial products: spend categorization, document extraction, anomaly detection, financial Q&amp;A, and automated reconciliation.</li>
<li>Implement structured output validation, fallback handling, and confidence scoring to ensure AI decisions meet reliability standards for financial use cases.</li>
<li>Evaluate and integrate AI frameworks and tools (LangChain, LlamaIndex, OpenAI API, Anthropic API, HuggingFace, vector databases) and advocate for the right tool for the job.</li>
<li>Establish prompt versioning and evaluation practices to ensure AI outputs remain accurate and consistent as models and data evolve.</li>
</ul>
<p><strong>Retrieval &amp; Vector Search</strong></p>
<ul>
<li>Design and maintain vector search pipelines using databases such as Pinecone, Weaviate, or pgvector to power semantic search and RAG-based features.</li>
<li>Build document ingestion and chunking pipelines for Jeeves&#39;s financial data, processing invoices, receipts, policy documents, and transaction records.</li>
<li>Optimize retrieval quality through embedding model selection, chunk strategy, metadata filtering, and re-ranking techniques.</li>
</ul>
<p><strong>ML Model Serving &amp; Operations</strong></p>
<ul>
<li>Collaborate with data scientists to take trained ML models from experimental notebooks to production serving infrastructure.</li>
<li>Build and maintain model serving endpoints with appropriate latency SLOs, input validation, and output monitoring.</li>
<li>Implement model performance monitoring and data drift detection to ensure production models remain accurate over time.</li>
<li>Support model retraining workflows by designing clean data pipelines and feature engineering that can be continuously updated.</li>
</ul>
<p><strong>Backend Integration &amp; Reliability</strong></p>
<ul>
<li>Integrate AI services cleanly with Jeeves&#39;s backend microservices, designing clear API contracts, circuit breakers, and graceful degradation patterns.</li>
<li>Write high-quality, testable backend code in Python or Go/Node.js to power AI-integrated features.</li>
<li>Instrument AI components with structured logging, distributed tracing, latency dashboards, and alerting to ensure operational visibility.</li>
</ul>
<p><strong>Collaboration &amp; Growth</strong></p>
<ul>
<li>Partner with Product, Backend Engineering, and Data Science to define the AI roadmap and translate requirements into reliable systems.</li>
<li>Contribute to a culture of quality by writing design docs, reviewing peers&#39; AI system designs, and sharing learnings openly.</li>
<li>Help grow the AI engineering practice at Jeeves by establishing patterns, tooling, and best practices that the broader team can build on.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>LLM, AI, Python, LangChain, LlamaIndex, OpenAI API, Anthropic API, HuggingFace, vector databases, Pinecone, Weaviate, pgvector, semantic search, RAG-based features, document ingestion, chunking pipelines, embedding model selection, chunk strategy, metadata filtering, re-ranking techniques, model serving infrastructure, latency SLOs, input validation, output monitoring, model performance monitoring, data drift detection, clean data pipelines, feature engineering, API contracts, circuit breakers, graceful degradation patterns, structured logging, distributed tracing, latency dashboards, alerting</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Jeeves</Employername>
      <Employerlogo>https://logos.yubhub.co/jeeves.com.png</Employerlogo>
      <Employerdescription>Jeeves is a financial operating system built for global businesses that provides corporate cards, cross-border payments, and spend management software within one unified platform. It operates across 20+ countries and serves over 5,000 clients.</Employerdescription>
      <Employerwebsite>https://www.jeeves.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/tryjeeves/ded9e04e-f18e-4d4c-ae43-4b7882c6200b</Applyto>
      <Location>India</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>76ec9c27-a1c</externalid>
      <Title>Signal Processing Engineer</Title>
      <Description><![CDATA[<p>We&#39;re seeking a highly skilled Signal Processing Engineer to join our growing team. As a Signal Processing Engineer at CX2, you will design, implement, and test signal processing techniques using MATLAB, Python, and other existing frameworks. You will work on digital signal processing, write and contribute to existing Python repositories using CUDA and PyTorch, own requirements, ICDs, and verification from concept through delivery, and stay current with advances in signal processing techniques and associated technologies.</p>
<p>Responsibilities:</p>
<ul>
<li>Design, characterize, and deliver algorithms such as channelizers, frequency agile detection, adaptive filters, MIMO, wideband detectors, and other algorithms related to signal sorting</li>
<li>Write and contribute to existing Python repositories using CUDA and PyTorch</li>
<li>Own requirements, ICDs, and verification from concept through delivery</li>
<li>Stay current: Track and incorporate advances in signal processing techniques and associated technologies, including adaptive beamforming, RF machine learning, and resilient PNT for GPS-denied operations.</li>
</ul>
<p>Required Qualifications:</p>
<ul>
<li>Master’s Degree in Electrical, Computer, or Systems Engineering or a related field with a graduate study emphasis in Signal Processing; OR a Bachelor’s Degree in an Engineering discipline with 3-5 years of relevant signal processing experience</li>
<li>Intermediate to advanced proficiency in Python</li>
<li>Willingness to support critical test events that occasionally require extended hours/weekends.</li>
<li>Ability to obtain and maintain a security clearance.</li>
<li>Must be a U.S. Person (see ITAR Regulations below) due to required access to U.S. export-controlled information or facilities</li>
</ul>
<p>Bonus Points:</p>
<ul>
<li>PhD in Electrical Engineering, Computer Engineering, or related field</li>
<li>5+ years’ experience with EW subsystems and payloads.</li>
<li>EA/ECM technique design (deception, Digital RF Memory, coherent/non-coherent techniques).</li>
<li>Comms system design (LPI/LPD, Waveform-of-Interest exploitation)</li>
<li>RF machine learning for emitter ID, modulation classification, anomaly detection, and PDW creation</li>
<li>Tools Experience: ADS/AWR/SystemVue, MATLAB/Simulink, Python (NumPy/SciPy), GNU Radio/SDR (USRP/RFSoC), VITA-49; HDL/firmware experience also helpful (Vivado/Quartus/Libero).</li>
<li>Clearance: Active Secret or ability to obtain and maintain; TS/SCI eligibility preferred. ITAR/EAR-controlled work.</li>
<li>Field work: supporting periodic travel for flight tests and customer demonstrations/support</li>
<li>Mindset: Builder-tester who loves first-principles RF, rapid lab iteration, and getting hardware flying fast.</li>
</ul>
<p>What We Offer:</p>
<ul>
<li>Competitive salary, stock options and benefits, including health, vision and dental.</li>
<li>401K enrollment at 90 days.</li>
<li>Generous PTO + most Federal Holidays observed.</li>
<li>Collaborative and inclusive work environment.</li>
<li>Access to the latest tools and technologies.</li>
<li>High levels of responsibility and autonomy.</li>
<li>Professional growth and development opportunities.</li>
<li>Access to the hardest problems in electronic warfare.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>MATLAB, Python, CUDA, PyTorch, Digital Signal Processing, Channelizers, Frequency Agile Detection, Adaptive Filters, MIMO, Wideband Detectors, ADS/AWR/SystemVue, MATLAB/Simulink, Python (NumPy/SciPy), GNU Radio/SDR (USRP/RFSoC), VITA-49, HDL/Firmware (Vivado/Quartus/Libero)</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>CX2</Employername>
      <Employerlogo>https://logos.yubhub.co/cx2.com.png</Employerlogo>
      <Employerdescription>CX2 is a next-generation defense technology company that builds AI-enabled hardware and software platforms to detect, disrupt, and defend the electromagnetic spectrum across land, air, sea, and space.</Employerdescription>
      <Employerwebsite>https://cx2.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/cx2/c03eadf7-133f-4785-b7f9-37e5c3d52db9</Applyto>
      <Location>El Segundo</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>511ebb39-00c</externalid>
      <Title>Assistant Procurement Manager - Indirect Services &amp; Materials - 12 month FTC</Title>
      <Description><![CDATA[<p>About the role</p>
<p>As an Indirect Assistant Procurement Manager, you will play a vital role in managing the procurement of goods and services not for resale, ensuring the efficient and cost-effective buying of items across key categories including, but not limited to, Marketing, IT, Professional Services, Store Design, and Visual Merchandising.</p>
<p>You will work closely with the Procurement Manager and Head of Department across various projects and liaise with different departments to support the overall procurement strategy and contribute to the company&#39;s success.</p>
<p>We are looking for a dynamic individual who has appropriate indirect procurement experience, to meet the needs of the key responsibilities, working within a fast-paced environment, ideally from within a digitally mature industry or heavy presence online retailer.</p>
<p>The individual will have a global support remit. Project managing a number of ad-hoc support projects will be a key activity within the role. Previous experience in marketing procurement is an advantage.</p>
<p>Responsibilities</p>
<ul>
<li>Conduct spend-mapping analysis of all indirect spend, providing the Procurement Manager (PM) with analysis and cost-saving opportunities.</li>
<li>Prepare and maintain procurement reports, including supplier performance, cost analysis, and contract status.</li>
<li>Ensure all procurement activities comply with company policies, industry regulations, and sustainability standards.</li>
<li>Managing ESG initiatives with suppliers, including Ecovadis audit and improvement management actions.</li>
<li>Negotiation of contracts and commercials. Support negotiations with suppliers to secure favourable terms, pricing, and service agreements.</li>
<li>Assist in identifying, evaluating, and managing relationships with suppliers of indirect goods and services to ensure quality, cost-effectiveness, and reliability.</li>
<li>Oversee the administration of contracts for indirect procurement, ensuring compliance with terms and conditions.</li>
<li>Work on sourcing activities as required by category leads.</li>
<li>Provide full tender support to the procurement manager, including RFI/RFP as directed, including collation of responses and score carding with appropriate analysis provided.</li>
<li>Organising quarterly business review (QBR) meetings with key suppliers, including annual meeting plans, and chasing through improvement actions and KPI management.</li>
<li>Rebate management tracking and regular liaison with the finance team to ensure accurate accruals and ready availability of audit information.</li>
<li>Take ownership of cost initiative projects as directed by the Procurement Manager as part of their personal development plan, and in accordance with the Procurement Director’s expected annual savings targets.</li>
<li>Work with PMs to ensure the Procurement/Legal action log is up to date.</li>
<li>Set up and deliver ongoing indirect material/services supply base credit monitoring, proactively feeding back any critical profile changes to the Procurement team using the D&amp;B tracker system.</li>
</ul>
<p>About you</p>
<ul>
<li>Learning Curve - Able to quickly absorb new data and information and communicate it to the team effectively.</li>
<li>Attention to Detail – Able to work at a fast pace across a wide range of simultaneous projects without letting unchecked errors through.</li>
<li>Conflicting Priorities - Ability to commit to and deliver against agreed deadlines, where there will be multiple priorities to consider and manage.</li>
<li>Project Management skills – Must be conversant with both sourcing tender and RFP processes.</li>
<li>Commercial acumen - Role holder must be keen to develop their negotiation skills, and to develop strong relationships with their suppliers and stakeholders alike.</li>
<li>Autonomy – Must be able to take responsibility for their own time, prioritising work according to the demands of the PM and taking ownership of delivery, with minimal direct support.</li>
<li>Should be proficient with Microsoft applications, especially Excel and PowerPoint. Experience using the Monday project management system is a plus.</li>
<li>Previous procurement/purchasing experience within digitally mature industry or Cosmetic/Personal care online retailer.</li>
<li>Proven project management experience track record a must have.</li>
<li>Strong operational stakeholder relationship development and management.</li>
<li>Must have a digital/SaaS buying support background.</li>
<li>Performance Management service sourcing background a distinct advantage - must be strong in data analysis, with Excel analytical expertise including VLOOKUPs, nested IF statements, pivot tables, and filters.</li>
<li>Must be comfortable working with ambiguous and/or incomplete information.</li>
<li>Use of ‘source to contract’ e-sourcing tools a must. Implementation support a distinct advantage.</li>
<li>Some experience in formal Contract development would be a distinct advantage.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Procurement, Indirect procurement, Spend mapping, Contract negotiation, Supplier management, Project management, Microsoft Office, Excel, PowerPoint, Monday project management system, Digital/SaaS buying support, Performance Management service sourcing, Data analysis, Excel analytical expertise, Vlookups, Nested ‘if’ statements, Pivot tables filters</Skills>
      <Category>Operations</Category>
      <Industry>Beauty</Industry>
      <Employername>Charlotte Tilbury Beauty</Employername>
      <Employerlogo>https://logos.yubhub.co/charlottetilbury.com.png</Employerlogo>
      <Employerdescription>A global beauty company founded in 2013, with a presence in 50 markets and over 2,300 employees.</Employerdescription>
      <Employerwebsite>https://www.charlottetilbury.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/B8EFF04925</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-20</Postedate>
    </job>
    <job>
      <externalid>ea3a63bb-362</externalid>
      <Title>2nd Shift Manufacturing Technician</Title>
      <Description><![CDATA[<p><strong>Job Summary</strong></p>
<p>As a 2nd Shift Manufacturing Technician at FUCHS Lubricants Co., you will be part of a driven community focused on providing customers with world-class solutions that push them forward. You will be responsible for the care and maintenance of our clients&#39; industrial fluids, including coolants, wash fluids, and other liquids.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Operate and perform routine preventive maintenance on lubrication and fluid decantation systems, including filter changes and system cleanings</li>
<li>Input data daily and generate performance reports</li>
<li>Handle materials, operate lift trucks, and move drums and totes</li>
<li>Make fluid corrections as needed based on testing results in the absence of the site leader</li>
<li>Follow procedures and adhere to all customer and safety requirements</li>
<li>Participate in daily communication meetings with site supervisor and communicate effectively</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>High school diploma or equivalent required</li>
<li>Standing and/or walking for approximately 4-8 hours daily</li>
<li>Previous experience in a manufacturing environment related to lubricants and/or laboratory testing preferred</li>
<li>Basic computer knowledge and ability to navigate the internet and email</li>
<li>Ability to lift up to 50 lbs</li>
<li>Previous forklift experience is desired</li>
</ul>
<p><strong>Benefits</strong></p>
<p>FUCHS offers a challenging and rewarding working environment where employees are encouraged to develop and grow as professionals. In this role, you will have the opportunity to work on projects that will expand your experience and challenge your abilities in the global marketplace. The position also offers an excellent compensation package and a comprehensive suite of benefits.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$15-17 per hour</Salaryrange>
      <Skills>manufacturing, lubricants, fluid decantation, filter changes, system cleanings, data entry, performance reports, material handling, lift truck operation, drum and tote movement, fluid corrections, customer and safety requirements, communication, previous work experience in manufacturing environment related to lubricants and/or laboratory testing, basic computer knowledge, ability to lift up to 50lbs</Skills>
      <Category>Manufacturing</Category>
      <Industry>Manufacturing</Industry>
      <Employername>FUCHS Lubricants Co.</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.fuchs.com.png</Employerlogo>
      <Employerdescription>FUCHS Lubricants Co. is the United States operating unit of FUCHS S.E., the world&apos;s largest independent manufacturer of specialty lubricants with global sales of over $3.5 Billion.</Employerdescription>
      <Employerwebsite>https://jobs.fuchs.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.fuchs.com/job/Greeneville-2nd-Shift-Manufacturing-Technician-TN-37745/1239413801/</Applyto>
      <Location>Greeneville</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>d729b446-2b6</externalid>
      <Title>Team Lead - Finance (Accounts Payable)</Title>
      <Description><![CDATA[<p>We are hiring a Team Lead - Finance (Accounts Payable) to join our team in Gurgaon. As a Team Lead, you will be responsible for managing the accounts payable function, ensuring accurate and timely processing of vendor invoices, debit notes, and credit notes. You will also be responsible for vendor reconciliation, GST inputs, and tracking payment cycles.</p>
<p><strong>Key Responsibilities and Activities</strong></p>
<ul>
<li><strong>Accounts Payable:</strong>
<ul>
<li>Process vendor invoices, debit notes, and credit notes</li>
<li>Match POs and invoices</li>
<li>Perform vendor reconciliation and ledger management</li>
<li>Track payment cycles and prepare payment batches</li>
<li>Handle vendor queries and resolve discrepancies</li>
</ul>
</li>
<li><strong>GST:</strong>
<ul>
<li>Understand GST input tax credit (ITC) rules</li>
<li>Verify vendor GST invoices</li>
<li>Reconcile with GSTR-2A/2B</li>
<li>Be aware of RCM (reverse charge) entries</li>
<li>Have advanced knowledge of GST returns documentation</li>
</ul>
</li>
<li><strong>TDS:</strong>
<ul>
<li>Know the applicable TDS sections for contractors, professionals, rent, etc.</li>
<li>Ensure correct deduction and accounting of TDS</li>
<li>Validate vendor PANs and apply the correct rate</li>
<li>Prepare TDS workings for monthly payments</li>
</ul>
</li>
<li><strong>Excel Skills:</strong>
<ul>
<li>Be comfortable with VLOOKUP/XLOOKUP, SUMIF(S), FILTER, pivot tables, etc.</li>
<li>Prepare reconciliations and reports</li>
<li>Clean and format data</li>
</ul>
</li>
<li><strong>Software &amp; Tools:</strong>
<ul>
<li>Have hands-on experience with Tally / SAP / ERP systems</li>
<li>Be comfortable using the GST portal and TRACES</li>
<li>Have strong working knowledge of MS Excel</li>
</ul>
</li>
<li><strong>Soft Skills:</strong>
<ul>
<li>Be accurate and attentive to detail</li>
<li>Communicate effectively with other departments</li>
<li>Show maturity while handling tasks</li>
<li>Meet deadlines</li>
<li>Maintain strong documentation and record-keeping</li>
</ul>
</li>
<li><strong>Additional Advantages:</strong>
<ul>
<li>Understand audit requirements</li>
<li>Assist in monthly closing activities</li>
<li>Know basic tax compliances</li>
<li>Reconcile bank statements</li>
</ul>
</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Cab facility within hiring zones</li>
<li>Medical insurance, term insurance, and accidental insurance</li>
<li>Lunch/dinner provided at subsidized rates</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Excel, Tally, SAP, ERP, GST, TDS, Vendor Reconciliation, Ledger Management, Payment Cycles, Vendor Queries, Discrepancies, GST Input Tax Credit, GSTR-2A/2B, RCM Entries, GST Returns Documentation, TDS Sections, Vendor PAN Validation, Rate Application, TDS Working, MS Excel, VLOOKUP/XLOOKUP, SUMIF(S), FILTER, PIVOT TABLES, GST Portal, TRACES</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>Keywords Studios</Employername>
      <Employerlogo>https://logos.yubhub.co/j.com.png</Employerlogo>
      <Employerdescription>Keywords Studios is a fast-growing plc listed on the London Stock Exchange&apos;s AIM market, providing localisation services to the Video games and Software Localisation markets worldwide.</Employerdescription>
      <Employerwebsite>https://apply.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/5649ED4042</Applyto>
      <Location>Gurgaon</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>0b1006f8-b4f</externalid>
      <Title>Principal SerDes Systems Engineer</Title>
      <Description><![CDATA[<p>We are seeking a Principal SerDes Systems Engineer to join our team. As a Principal SerDes Systems Engineer, you will be responsible for developing and maintaining SerDes system models for NRZ and PAM4 transceivers targeting PCIe (128Gbps+) and Ethernet (200Gbps+) standards. You will also run comprehensive system simulations to verify and sign-off design performance across multiple protocols and channels.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Developing and maintaining SerDes system models for NRZ and PAM4 transceivers targeting PCIe (128Gbps+) and Ethernet (200Gbps+) standards.</li>
<li>Running comprehensive system simulations to verify and sign-off design performance across multiple protocols and channels.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>M.Sc. or Ph.D. in Electrical or Computer Engineering.</li>
<li>Strong experience modeling circuits and systems in MATLAB/Simulink.</li>
<li>Expertise in designing high-speed analog CMOS circuits.</li>
<li>Solid understanding of DSP and communications theory, including equalization, coding, and noise/crosstalk filtering.</li>
<li>Proficiency in analyzing link budgets for NRZ and/or PAM4 high-speed serial links.</li>
<li>Familiarity with wireline protocols (PCIe, Ethernet, JESD204C, CPRI) and optical protocols (LINEAR, RTLR).</li>
<li>Experience with circuit topologies used in high-speed SerDes Tx/Rx and Tx/Rx equalization techniques.</li>
<li>Hands-on lab testing experience for high-speed serial links and proficiency in C/Verilog-A/SystemVerilog.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>M.Sc. or Ph.D. in Electrical or Computer Engineering, Strong experience modeling circuits and systems in MATLAB/Simulink, Expertise in designing high-speed analog CMOS circuits, Solid understanding of DSP and communications theory, including equalization, coding, and noise/crosstalk filtering, Proficiency in analyzing link budgets for NRZ and/or PAM4 high-speed serial links, Familiarity with wireline protocols (PCIe, Ethernet, JESD204C, CPRI) and optical protocols (LINEAR, RTLR), Experience with circuit topologies used in high-speed SerDes Tx/Rx and Tx/Rx equalization techniques, Hands-on lab testing for high-speed serial links and proficiency in C/Verilog-A/systemVerilog, Innovative thinker with a passion for cutting-edge technology, Collaborative team player who thrives in a multidisciplinary environment, Analytical problem-solver with meticulous attention to detail, Effective communicator, able to translate complex concepts for diverse audiences, Adaptable and eager to learn, keeping pace with evolving industry trends, Customer-focused, dedicated to delivering exceptional support and results</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Synopsys</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.synopsys.com.png</Employerlogo>
      <Employerdescription>Synopsys is a leading provider of electronic design automation (EDA) software and intellectual property (IP) solutions. The company&apos;s technology is used to design and develop complex electronic systems, including semiconductors, software, and systems-on-chip (SoCs).</Employerdescription>
      <Employerwebsite>https://careers.synopsys.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.synopsys.com/job/eindhoven/principal-serdes-systems-engineer/44408/92341044576</Applyto>
      <Location>Eindhoven, North Brabant, Netherlands</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>600601e3-040</externalid>
      <Title>Principal SerDes Systems Engineer</Title>
      <Description><![CDATA[<p>We are seeking a Principal SerDes Systems Engineer to join our team. As a Principal SerDes Systems Engineer, you will be responsible for developing and maintaining SerDes system models for NRZ and PAM4 transceivers targeting PCIe (128Gbps+) and Ethernet (200Gbps+) standards.</p>
<p><strong>What you&#39;ll do</strong></p>
<ul>
<li>Developing and maintaining SerDes system models for NRZ and PAM4 transceivers targeting PCIe (128Gbps+) and Ethernet (200Gbps+) standards.</li>
<li>Running comprehensive system simulations to verify and sign-off design performance across multiple protocols and channels.</li>
<li>Designing and proposing advanced algorithms to calibrate and adapt transceivers for optimal performance.</li>
<li>Correlating simulated performance with silicon measurements to ensure accuracy and reliability.</li>
<li>Providing expert assistance to customers for system-level performance issues and troubleshooting.</li>
<li>Collaborating with cross-functional teams of analog, digital, and hardware engineers throughout all stages of development.</li>
<li>Contributing to lab testing and analysis for high-speed serial links, ensuring robust design validation.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>M.Sc. or Ph.D. in Electrical or Computer Engineering.</li>
<li>Strong experience modeling circuits and systems in MATLAB/Simulink.</li>
<li>Expertise in designing high-speed analog CMOS circuits.</li>
<li>Solid understanding of DSP and communications theory, including equalization, coding, and noise/crosstalk filtering.</li>
<li>Proficiency in analyzing link budgets for NRZ and/or PAM4 high-speed serial links.</li>
<li>Familiarity with wireline protocols (PCIe, Ethernet, JESD204C, CPRI) and optical protocols (LINEAR, RTLR).</li>
<li>Experience with circuit topologies used in high-speed SerDes Tx/Rx and Tx/Rx equalization techniques.</li>
<li>Hands-on lab testing experience for high-speed serial links and proficiency in C/Verilog-A/SystemVerilog.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>M.Sc. or Ph.D. in Electrical or Computer Engineering, Strong experience modeling circuits and systems in MATLAB/Simulink, Expertise in designing high-speed analog CMOS circuits, Solid understanding of DSP and communications theory, including equalization, coding, and noise/crosstalk filtering, Proficiency in analyzing link budgets for NRZ and/or PAM4 high-speed serial links, Familiarity with wireline protocols (PCIe, Ethernet, JESD204C, CPRI) and optical protocols (LINEAR, RTLR)</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Synopsys</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.synopsys.com.png</Employerlogo>
      <Employerdescription>Synopsys is a leading provider of electronic design automation (EDA) software and intellectual property (IP) solutions. The company&apos;s technology is used to design and develop complex semiconductor products, such as microprocessors, memory chips, and graphics processing units.</Employerdescription>
      <Employerwebsite>https://careers.synopsys.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.synopsys.com/job/mississauga/principal-serdes-systems-engineer/44408/92341044560</Applyto>
      <Location>Mississauga, Ontario, Canada</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
  </jobs>
</source>