{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/sensor-fusion"},"x-facet":{"type":"skill","slug":"sensor-fusion","display":"Sensor Fusion","count":33},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_56e29c57-cd1"},"title":"Robotics Technician","description":"<p>We&#39;re seeking a Robotics Technician to join our team in Mexico City. As a key contributor, you will partner with cross-functional stakeholders to bring up new robots and productionize the maintenance of robots and collection hardware. You will play a critical role in supporting the day-to-day operations of the factory by bringing up and maintaining robots and collection hardware. You will also provide technical support for data collection operations, manage physical inventory, maintain equipment, and coordinate logistics.</p>\n<p>You will become a subject matter expert on all capabilities of the robotics platforms deployed in the factory. You will develop technical domain expertise in areas of 2D and 3D imaging and annotation, multi-sensor fusion and calibration, GPS/INS navigation systems, computer vision, and other autonomy-adjacent concepts.</p>\n<p>You have a Bachelor&#39;s degree or industry experience, an engineering background, preferably in Computer Science, Mathematics, or other Engineering fields. You have 2+ years of experience developing with Python, C++, Java, and/or other scripting languages. 
You have 1-3 years of experience in hardware labs or a manufacturing environment. You have experience managing risk and operating robots safely. You have strong project management and interpersonal skills, high attention to detail, and a strong sense of ownership. You have a high level of comfort communicating effectively across internal and external organizations.</p>\n<p>Nice to have: hands-on experience in Robotics, AI, and/or Computer Vision, intellectual curiosity, empathy, and ability to operate with a high degree of autonomy, experience building and/or maintaining lab networks and data pipelines, experience running large-scale data collection and controlled experiments, experience building out facilities, and experience in logistics.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_56e29c57-cd1","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4635128005","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","C++","Java","Robotics","AI","Computer Vision","Multi-sensor fusion and calibration","GPS/INS navigation systems"],"x-skills-preferred":["hands-on experience in Robotics, AI, and/or Computer Vision","intellectual curiosity","empathy","ability to operate with a high degree of autonomy","experience building and/or maintaining lab networks and data pipelines","experience running large-scale data collection and controlled experiments","experience building out facilities","experience in logistics"],"datePosted":"2026-04-18T16:00:01.904Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Mexico City, 
MX"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, C++, Java, Robotics, AI, Computer Vision, Multi-sensor fusion and calibration, GPS/INS navigation systems, hands-on experience in Robotics, AI, and/or Computer Vision, intellectual curiosity, empathy, ability to operate with a high degree of autonomy, experience building and/or maintaining lab networks and data pipelines, experience running large-scale data collection and controlled experiments, experience building out facilities, experience in logistics"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8697757e-7b4"},"title":"Lead Robotics Software Engineer","description":"<p>As a Lead Robotics Software Engineer on our Tactical Recon &amp; Strike team, you&#39;ll be at the forefront of cutting-edge autonomous systems development. You&#39;ll tackle diverse challenges in autonomy, systems integration, robotics, and networking, making critical engineering decisions that directly impact mission success.</p>\n<p>Your role will be pivotal in ensuring Anduril&#39;s products work seamlessly together to achieve a variety of crucial outcomes. You&#39;ll develop innovative solutions for complex robotics problems, balance pragmatic engineering trade-offs with mission-critical requirements, and collaborate across teams to integrate software with hardware systems.</p>\n<p>Contributing to the entire product lifecycle, from concept to deployment, you&#39;ll rapidly prototype and iterate on software solutions. We&#39;re looking for someone who thrives in a fast-paced environment and isn&#39;t afraid to tackle ambiguous problems. 
Your &#39;Whatever It Takes&#39; mindset will be key in executing tasks efficiently, scalably, and pragmatically, always keeping the mission at the forefront of your work.</p>\n<p>This role offers the opportunity to make a significant impact on next-generation defense technology, working with state-of-the-art robotics and autonomous systems. You&#39;ll be part of a team that values innovation, quick iteration, and delivering high-quality solutions that meet real-world needs.</p>\n<p>Must be eligible to obtain and maintain an active U.S. Secret security clearance. This position will be located at our office in Atlanta, GA (relocation benefits provided).</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Develop and maintain core robotics libraries, including frame transformations, targeting, and guidance systems, that will be utilized across all Anduril robotics platforms</li>\n</ul>\n<ul>\n<li>Lead the development and implementation of major features for our products, such as designing and building Software-in-the-Loop simulators for advanced systems like Altius</li>\n</ul>\n<ul>\n<li>Lead and mentor a group of software engineers to help drive team success and to successfully hit tight project deadlines</li>\n</ul>\n<ul>\n<li>Optimize performance of existing products, primarily focused on our Altius Drone product line</li>\n</ul>\n<ul>\n<li>Collaborate closely with hardware and manufacturing teams throughout the product development lifecycle, providing timely feedback to influence and enhance final hardware designs</li>\n</ul>\n<ul>\n<li>Troubleshoot and resolve complex issues in deployed systems, ensuring optimal performance in the field</li>\n</ul>\n<ul>\n<li>Contribute to the design and implementation of multi-agent coordination systems for UAVs</li>\n</ul>\n<ul>\n<li>Participate in the full software development lifecycle, from concept and design through testing and deployment</li>\n</ul>\n<ul>\n<li>Stay current with emerging technologies and industry trends,
recommending and implementing innovations to improve our products and processes</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li>Bachelor&#39;s degree in Robotics, Computer Science, or related field</li>\n</ul>\n<ul>\n<li>7+ years of professional software development experience</li>\n</ul>\n<ul>\n<li>Experience as a lead of a small software engineering team</li>\n</ul>\n<ul>\n<li>Strong proficiency in C++ or Rust, with experience in Linux development environments</li>\n</ul>\n<ul>\n<li>Demonstrated expertise in data structures, algorithms, concurrency, and code optimization</li>\n</ul>\n<ul>\n<li>Proven experience troubleshooting and analyzing remotely deployed software systems</li>\n</ul>\n<ul>\n<li>Hands-on experience working with and testing electrical and mechanical systems</li>\n</ul>\n<ul>\n<li>Ability to collaborate effectively with cross-functional teams, including hardware and manufacturing</li>\n</ul>\n<ul>\n<li>Strong problem-solving skills and a &#39;Whatever It Takes&#39; mindset</li>\n</ul>\n<ul>\n<li>Excellent communication skills, both written and verbal</li>\n</ul>\n<ul>\n<li>Eligible to obtain and maintain an active U.S. Secret security clearance</li>\n</ul>\n<ul>\n<li>Willingness to relocate to Atlanta, GA</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Master&#39;s or Ph.D. 
in a relevant field (e.g., Robotics, Computer Science, Electrical Engineering)</li>\n</ul>\n<ul>\n<li>Expertise in one or more advanced robotics areas: motion planning, perception, localization, mapping, or controls</li>\n</ul>\n<ul>\n<li>Experience with performance optimization and metrics for complex robotic systems</li>\n</ul>\n<ul>\n<li>Proficiency in Python, Rust, and/or Go, in addition to C++</li>\n</ul>\n<ul>\n<li>Hands-on experience programming for embedded systems and physical devices</li>\n</ul>\n<ul>\n<li>Background in multi-agent coordination, particularly with UAVs</li>\n</ul>\n<ul>\n<li>Demonstrated ability to solve complex frame transformation problems (e.g., target localization, multi-degree-of-freedom robotic arms)</li>\n</ul>\n<ul>\n<li>Experience with real-time operating systems and distributed computing</li>\n</ul>\n<ul>\n<li>Familiarity with machine learning and AI applications in robotics</li>\n</ul>\n<ul>\n<li>Knowledge of sensor fusion techniques and implementation</li>\n</ul>\n<ul>\n<li>Understanding of aerodynamics and flight dynamics as applied to UAV systems</li>\n</ul>\n<ul>\n<li>Experience with simulation environments for robotics testing and development</li>\n</ul>\n<ul>\n<li>Track record of contributions to open-source robotics projects or relevant publications</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_8697757e-7b4","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5033836007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$190,000-$252,000 USD","x-skills-required":["C++","Rust","Linux development environments","Data 
structures","Algorithms","Concurrency","Code optimization","Troubleshooting","Analysis","Electrical and mechanical systems"],"x-skills-preferred":["Python","Go","Embedded systems","Physical devices","Multi-agent coordination","UAVs","Frame transformation","Real-time operating systems","Distributed computing","Machine learning","AI","Sensor fusion","Aerodynamics","Flight dynamics"],"datePosted":"2026-04-18T15:58:40.280Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Atlanta, Georgia, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Rust, Linux development environments, Data structures, Algorithms, Concurrency, Code optimization, Troubleshooting, Analysis, Electrical and mechanical systems, Python, Go, Embedded systems, Physical devices, Multi-agent coordination, UAVs, Frame transformation, Real-time operating systems, Distributed computing, Machine learning, AI, Sensor fusion, Aerodynamics, Flight dynamics","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":190000,"maxValue":252000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_eccf1031-6f3"},"title":"Senior Computer Vision Engineer, Space","description":"<p>We are seeking a Senior Computer Vision Engineer to join our rapidly growing team in Washington DC. 
The ideal candidate will have a strong background in computer vision and machine learning, with experience in developing and implementing computer vision algorithms for various spacecraft efforts in all orbital regimes.</p>\n<p>The Senior Computer Vision Engineer will be responsible for proposing and prototyping innovative solutions to solve real-world problems, developing and maintaining core libraries and runtime applications, integrating classical and geometric methods in computer vision with ML methods, and working with space vehicle CV software and hardware subsystems for various spacecraft efforts in all orbital regimes.</p>\n<p>The successful candidate will have a Master&#39;s or Ph.D. in Machine Learning, Robotics, or Computer Science, with a strong background in computer vision and machine learning. They will also have experience in one or more of the following: object detection, object tracking, instance segmentation, semantic segmentation, semantic change detection, natural feature tracking (NFT), visual odometry, SLAM, multi-view geometry, structure from motion, 3D geometry, discriminative correlation filters, stereo, neural 3D reconstruction, multi-band sensor processing, RGB-D and LIDAR sensor fusion.</p>\n<p>The Senior Computer Vision Engineer will work closely with related teams, including Sensors, GNC, Avionics, Systems, Flight Software, Mission Operations, and Ground Software, to develop and implement computer vision algorithms for various spacecraft efforts in all orbital regimes.</p>\n<p>The ideal candidate will have excellent communication and organizational skills, including documentation and training material, and will be able to work effectively in a fast-paced environment with tight deadlines.</p>\n<p>The salary range for this role is $191,000-$253,000 USD, and highly competitive equity grants are included in the majority of full-time offers.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_eccf1031-6f3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.andurilindustries.com/","logo":"https://logos.yubhub.co/andurilindustries.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5016343007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$191,000-$253,000 USD","x-skills-required":["Machine Learning","Robotics","Computer Science","Computer Vision","Object Detection","Object Tracking","Instance Segmentation","Semantic Segmentation","Semantic Change Detection","Natural Feature Tracking (NFT)","Visual Odometry","SLAM","Multi-view Geometry","Structure from Motion","3D Geometry","Discriminative Correlation Filters","Stereo","Neural 3D Reconstruction","Multi-band Sensor Processing","RGB-D and LIDAR Sensor Fusion"],"x-skills-preferred":["Matlab","Simulink","Python","Go","C++","Linux systems","OpenCV","NFT"],"datePosted":"2026-04-18T15:55:25.839Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, District of Columbia, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Machine Learning, Robotics, Computer Science, Computer Vision, Object Detection, Object Tracking, Instance Segmentation, Semantic Segmentation, Semantic Change Detection, Natural Feature Tracking (NFT), Visual Odometry, SLAM, Multi-view Geometry, Structure from Motion, 3D Geometry, Discriminative Correlation Filters, Stereo, Neural 3D Reconstruction, Multi-band Sensor Processing, RGB-D and LIDAR Sensor Fusion, Matlab, Simulink, Python, Go, C++, Linux systems, OpenCV, 
NFT","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_cb57c7a1-7e6"},"title":"Senior Computer Vision Engineer, Space","description":"<p>We are looking for a Senior Computer Vision Engineer to join our rapidly growing team in Costa Mesa, CA. In this role, you will be responsible for working on and understanding the design of all perception subsystems to include but not limited to hardware sensors and advanced processing platforms, navigation algorithms, flight software implementation, subsystem integration &amp; test (I&amp;T), and vehicle I&amp;T.</p>\n<p>The computer vision engineering team will work closely with related teams, including Sensors, GNC, Avionics, Systems, Flight Software, Mission Operations, and Ground Software. The Computer Vision Engineer will work on algorithm design, truth and physics modeling, scene rendering, simulation and analysis for a wide variety of spacecraft and space missions to include but not limited to LEO, MEO, GEO, Reentry, and RPOD (Rendezvous Proximity Operations and Docking).</p>\n<p>The CV Engineer will help lead successful implementation, validation and CV operations of Anduril’s fleet of space vehicles. This role is directly tied to ongoing, funded programs within Anduril’s Space Business Line. The programs require building and fielding resilient, software-defined spacecraft systems across numerous mission threads. We work with mission partners and customers to deploy reliable and robust capabilities on operationally-relevant fielding timelines to meet complex challenges across the DOD and IC.</p>\n<p>The position requires a strong background in computer vision, machine learning, and software development, with experience in developing and implementing computer vision algorithms for space-based applications.
The ideal candidate will have a deep understanding of computer vision principles, including object detection, tracking, and recognition, as well as experience with software development in languages such as C++ and Python.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Propose and prototype innovative solutions to solve real-world problems, leveraging the latest state-of-the-art techniques in the field</li>\n<li>Develop and maintain core libraries and runtime applications</li>\n<li>Integrate classical and geometric methods in computer vision with ML methods</li>\n<li>Work on space vehicle CV software and hardware subsystems for various spacecraft efforts in all orbital regimes and work closely with partners for successful implementation</li>\n<li>Develop modern, software-defined approaches to autonomous spacecraft operations with maneuvering capabilities to successfully accomplish mission objectives</li>\n<li>Develop appropriate test plans and procedures to validate the CV system during ground checkout, on-orbit commissioning and operations</li>\n<li>Collaborate across multiple teams to plan, build, and test complex functionality</li>\n<li>Coordinate with end-users, other operators and customers to turn needs into features while balancing user experience with engineering constraints</li>\n<li>Support challenging schedules during ground testing, launch windows and on-orbit operations of the spacecraft systems</li>\n<li>Design of flight software and firmware, algorithms, and simulation products</li>\n<li>Development of space vehicle autonomy tools for dynamic space operations</li>\n<li>Test process development and execution</li>\n<li>Define automated fault detection and responses</li>\n<li>Provide hardware-in-the-loop and Monte Carlo simulation capabilities</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li>MS or PhD in Machine Learning, Robotics, Computer Science, or Image Science with emphasis on Computer Vision</li>\n<li>BS in Computer Science, Machine Learning,
Electrical Engineering, or related field</li>\n<li>Advanced professional experience developing and benchmarking ML algorithms on large-scale datasets</li>\n<li>High proficiency in C++ development in a Linux environment</li>\n<li>Experience in one or more of the following: Object Detection, Object Tracking, Instance Segmentation, Semantic Segmentation, Semantic Change Detection, Natural Feature Tracking (NFT)</li>\n<li>Experience in one or more of the following: Visual Odometry, SLAM, Multi-view Geometry, Structure from Motion, 3D Geometry, Discriminative Correlation Filters, Stereo, Neural 3D Reconstruction, Multi-band sensor processing, RGB-D and LIDAR Sensor Fusion</li>\n<li>Ability to quickly understand and navigate complex systems and detailed requirements</li>\n<li>Familiarity with terminal guidance, rendezvous proximity operations and docking, orbital mechanics with propulsive spacecraft, and/or spacecraft/missile GNC</li>\n<li>Clear communication and organizational skills including documentation and training material</li>\n<li>Currently possesses and is able to maintain an active U.S.
Top Secret security clearance</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Experience with Matlab, Simulink, Python, Go, C++ and/or Linux systems</li>\n<li>A desire to work on critical software and hardware designs in the space domain</li>\n<li>Strong preference for candidates with experience in computer vision, vision navigation, image processing, feature tracking, SLAM, OpenCV, and NFT</li>\n<li>Experience testing CV subsystems in laboratory environments that mimic the space environmental constraints</li>\n<li>Experience with orbital mechanics and resident space object tracking capabilities</li>\n<li>Experience conducting spacecraft operations and satellite command and control with an emphasis on system reliability and uptime</li>\n<li>Experience with testing/validation leveraging FlatSats, Hardware-in-the-Loop testbeds and digital spacecraft simulators through nominal and fault scenarios</li>\n<li>Experience with computer vision and perception algorithms to support GNC operations</li>\n<li>Experience developing 3-DOF simplified and 6-DOF high-fidelity dynamics simulation models used for GNC systems analysis and validation</li>\n<li>Exposure to US satellite operations policy and constraints for relevant mission threads in all orbits</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_cb57c7a1-7e6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5016340007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$191,000-$253,000 USD","x-skills-required":["Machine Learning","Robotics","Computer Science","Image Science","C++","Python","Object Detection","Object Tracking","Instance
Segmentation","Semantic Segmentation","Semantic Change Detection","Natural Feature Tracking (NFT)","Visual Odometry","SLAM","Multi-view Geometry","Structure from Motion","3D Geometry","Discriminative Correlation Filters","Stereo","Neural 3D Reconstruction","Multi-band sensor processing","RGB-D and LIDAR Sensor Fusion"],"x-skills-preferred":["Matlab","Simulink","Go","Linux systems","Computer vision","Vision navigation","Image processing","Feature tracking","OpenCV","NFT"],"datePosted":"2026-04-18T15:52:59.001Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Machine Learning, Robotics, Computer Science, Image Science, C++, Python, Object Detection, Object Tracking, Instance Segmentation, Semantic Segmentation, Semantic Change Detection, Natural Feature Tracking (NFT), Visual Odometry, SLAM, Multi-view Geometry, Structure from Motion, 3D Geometry, Discriminative Correlation Filters, Stereo, Neural 3D Reconstruction, Multi-band sensor processing, RGB-D and LIDAR Sensor Fusion, Matlab, Simulink, Go, Linux systems, Computer vision, Vision navigation, Image processing, Feature tracking, OpenCV, NFT","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_73f04783-b19"},"title":"Robotics Engineer, Maritime","description":"<p>As a Robotics Engineer in Anduril&#39;s Maritime Division, you will contribute to the delivery of vehicle perception and planning capability integrated into our products.
This includes systems analysis, sensor selection and integration, perception architecture and implementation, motion planning, health management, behavior analysis, simulation and test infrastructure, and interfaces with lower- and higher-level systems.</p>\n<p>We expect Robotics Engineers to demonstrate end-to-end ownership of their projects, contributing as a member of a team to the rapid architecting, design, delivery, support, and evolution of next-generation autonomous platforms through their entire product life-cycle.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Implementing trusted safe navigation, collision avoidance, and situational awareness systems that balance constraints, restrictions, and requirements in a multi-stakeholder environment</li>\n<li>Implementing scalable sub-systems including sensor processing, perception, tracking, motion planning, health management, anomaly detection, simulation, testing fixtures, and vehicle interfaces</li>\n<li>Contributing to the development of existing software components across Anduril, with the aim of developing components that are re-usable across multiple Anduril product lines</li>\n<li>Utilizing advanced techniques in computer vision, sensor fusion, and machine learning to enhance perception and planning capabilities of Anduril vehicles</li>\n<li>Conducting thorough testing and validation of perception and planning algorithms through development and use of simulation and analysis of data from real-world experiments, including evolving the data analysis tooling</li>\n<li>Collaborating with cross-functional teams, including software engineers, mechanical engineers, and systems engineers, to ensure effective system integration and testing</li>\n<li>Traveling to co-locate with end-users and/or other teams up to 20% of the time</li>\n</ul>\n<p>Required qualifications include:</p>\n<ul>\n<li>Bachelor&#39;s degree in Robotics, Mechatronics, Computer Science, Engineering, or a relevant field</li>\n<li>Experienced and proficient in C++ and/or Python software development</li>\n<li>Familiarity with autonomous
vehicle hardware and sensors such as radar, sonar, LIDAR, and cameras</li>\n<li>Demonstrated knowledge of at least one of: computer vision, sensor fusion, SLAM, motion planning, machine learning</li>\n<li>Experience in a senior perception or planning role for the delivery of a robotic system</li>\n<li>Capacity to act as the technical owner for a system, including stakeholder engagement, requirements definition, roadmap management, team coordination, design, implementation, sustainment, and evolution</li>\n<li>Ability to collaborate with stakeholders to define and implement robust validation and verification strategies for perception and planning modules</li>\n<li>Capacity to learn and grow individually, while mentoring junior team members effectively, building team cohesion and capacity</li>\n<li>Eligibility to obtain and maintain an active U.S. Secret security clearance</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Experience with autonomous systems in the ground, air, maritime, or space domains</li>\n<li>Experience with simulation tools and frameworks, such as Gazebo, Unity, or Unreal Engine, for algorithm validation and testing</li>\n<li>Knowledge of safety standards and certification processes for autonomous systems</li>\n<li>Familiarity with System Engineering concepts</li>\n</ul>\n<p>US Salary Range: $191,000-$253,000 USD</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_73f04783-b19","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5091916007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$191,000-$253,000 USD","x-skills-required":["C++","Python","Autonomous vehicle hardware","Sensor fusion","SLAM","Motion planning","Machine learning"],"x-skills-preferred":["Simulation
tools","Gazebo","Unity","Unreal Engine","Safety standards","Certification processes","System Engineering"],"datePosted":"2026-04-18T15:51:01.959Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Boston, Massachusetts, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Autonomous vehicle hardware, Sensor fusion, SLAM, Motion planning, Machine learning, Simulation tools, Gazebo, Unity, Unreal Engine, Safety standards, Certification processes, System Engineering","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9406d4ff-94e"},"title":"Robotics Engineer, Maritime","description":"<p>We are seeking a Robotics Engineer to contribute to the delivery of vehicle perception and planning capability integrated into our products. This includes systems analysis, sensor selection and integration, perception architecture and implementation, motion planning, health management, behavior analysis, simulation and test infrastructure, and interfaces with lower- and higher-level systems.</p>\n<p>As a Robotics Engineer, you will be responsible for implementing trusted safe navigation, collision avoidance, and situational awareness systems that balance constraints, restrictions, and requirements in a multi-stakeholder environment. You will also implement scalable sub-systems including sensor processing, perception, tracking, motion planning, health management, anomaly detection, simulation, testing fixtures, and vehicle interfaces.</p>\n<p>In addition, you will contribute to the development of existing software components across Anduril, with the aim of developing components that are reusable across multiple Anduril product lines. 
You will utilize advanced techniques in computer vision, sensor fusion, and machine learning to enhance perception and planning capabilities of Anduril vehicles.</p>\n<p>You will conduct thorough testing and validation of perception and planning algorithms through development and use of simulation and analysis of data from real-world experiments, including evolving the data analysis tooling. You will also collaborate with cross-functional teams, including software engineers, mechanical engineers, and systems engineers, to ensure effective system integration and testing.</p>\n<p>This role requires a strong background in robotics, mechatronics, computer science, or engineering, with experience in C++ and/or Python software development. You should have familiarity with autonomous vehicle hardware and sensors such as radar, sonar, LIDAR, and cameras. Demonstrated knowledge of at least one of computer vision, sensor fusion, SLAM, motion planning, or machine learning is required.</p>\n<p>Experience in a senior perception or planning role for the delivery of a robotic system is preferred. You should have the capacity to act as the technical owner for a system, including stakeholder engagement, requirements definition, roadmap management, team coordination, design, implementation, sustainment, and evolution. Ability to collaborate with stakeholders to define and implement robust validation and verification strategies for perception and planning modules is also required.</p>\n<p>Eligibility to obtain and maintain an active U.S. 
Secret security clearance is necessary for this role.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9406d4ff-94e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.andurilindustries.com/","logo":"https://logos.yubhub.co/andurilindustries.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5051580007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$191,000-$253,000 USD","x-skills-required":["C++","Python","Autonomous vehicle hardware","Radar","Sonar","LIDAR","Cameras","Computer vision","Sensor fusion","SLAM","Motion planning","Machine learning"],"x-skills-preferred":["Simulation tools and frameworks","Safety standards and certification processes for autonomous systems"],"datePosted":"2026-04-18T15:50:59.724Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Autonomous vehicle hardware, Radar, Sonar, LIDAR, Cameras, Computer vision, Sensor fusion, SLAM, Motion planning, Machine learning, Simulation tools and frameworks, Safety standards and certification processes for autonomous systems","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_7c36d505-6f8"},"title":"Robotics Engineer","description":"<p>As a Robotics Engineer at Anduril Industries, you will contribute to the delivery of vehicle perception and planning capability integrated into our products. 
This includes systems analysis, sensor selection and integration, perception architecture and implementation, motion planning, health management, behavior analysis, simulation and test infrastructure, and interfaces with lower- and higher-level systems.</p>\n<p>We expect Robotics Engineers to demonstrate end-to-end ownership of their projects, contributing as a member of a team to the rapid architecting, design, delivery, support, and evolution of next-generation autonomous platforms through their entire product life-cycle.</p>\n<p>Key responsibilities include:</p>\n<p>Implementing trusted safe navigation, collision avoidance, and situational awareness systems that balance constraints, restrictions, and requirements in a multi-stakeholder environment Implementing scalable sub-systems including sensor processing, perception, tracking, motion planning, health management, anomaly detection, simulation, testing fixtures, and vehicle interfaces Contributing to the development of existing software components across Anduril, with the aim of developing components that are re-usable across multiple Anduril product lines Utilizing advanced techniques in computer vision, sensor fusion, and machine learning to enhance perception and planning capabilities of Anduril vehicles Conducting thorough testing and validation of perception and planning algorithms through development and use of simulation and analysis of data from real-world experiments, including evolving the data analysis tooling Collaborating with cross-functional teams, including software engineers, mechanical engineers, and systems engineers, to ensure effective system integration and testing Traveling to co-locate with end-users and/or other teams up to 20% of the time</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_7c36d505-6f8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/4972426007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"The salary range for this role is an estimate based on a wide range of compensation factors, inclusive of base salary only. Actual salary offer may vary based on (but not limited to) work experience, education and/or training, critical skills, and/or business considerations.","x-skills-required":["C++","Python","Autonomous vehicle hardware and sensors","Computer vision","Sensor fusion","SLAM","Motion planning","Machine learning"],"x-skills-preferred":["Simulation tools and frameworks","Safety standards and certification processes for autonomous systems"],"datePosted":"2026-04-18T15:50:33.819Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Melbourne, Victoria, Australia"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Autonomous vehicle hardware and sensors, Computer vision, Sensor fusion, SLAM, Motion planning, Machine learning, Simulation tools and frameworks, Safety standards and certification processes for autonomous systems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_269373be-68a"},"title":"Robotics Engineer","description":"<p><strong>Job Description</strong></p>\n<p>We are seeking a skilled Robotics Engineer to join our Maritime Division. 
As a Robotics Engineer, you will contribute to the delivery of vehicle perception and planning capability integrated into our products.</p>\n<p>This includes systems analysis, sensor selection and integration, perception architecture and implementation, motion planning, health management, behaviour analysis, simulation and test infrastructure, and interfaces with lower- and higher-level systems.</p>\n<p><strong>Responsibilities</strong></p>\n<ul>\n<li>Implement trusted safe navigation, collision avoidance, and situational awareness systems that balance constraints, restrictions and requirements in a multi-stakeholder environment</li>\n<li>Implement scalable sub-systems including sensor processing, perception, tracking, motion planning, health management, anomaly detection, simulation, testing fixtures, and vehicle interfaces</li>\n<li>Contribute to the development of existing software components across Anduril, with the aim of developing components that are re-usable across multiple Anduril product lines</li>\n<li>Utilize advanced techniques in computer vision, sensor fusion, and machine learning to enhance perception and planning capabilities of Anduril vehicles</li>\n<li>Conduct thorough testing and validation of perception and planning algorithms through development and use of simulation and analysis of data from real-world experiments, including evolving the data analysis tooling</li>\n<li>Collaborate with cross-functional teams, including software engineers, mechanical engineers, and systems engineers, to ensure effective system integration and testing</li>\n</ul>\n<p><strong>Requirements</strong></p>\n<ul>\n<li>Bachelor’s degree in Robotics, Mechatronics, Computer Science, Engineering, a relevant field, or equivalent experience</li>\n<li>Experienced and proficient in C++ and/or Python software development</li>\n<li>Familiarity with autonomous vehicle hardware and sensors such as radar, sonar, LIDAR, and cameras</li>\n<li>Demonstrated knowledge of at least one 
of: computer vision, sensor fusion, SLAM, motion planning, machine learning</li>\n<li>Experience in a senior perception or planning role for the delivery of a robotic system</li>\n<li>Capacity to act as the technical owner for a system, including stakeholder engagement, requirements definition, roadmap management, team coordination, design, implementation, sustainment and evolution</li>\n<li>Ability to collaborate with stakeholders to define and implement robust validation and verification strategies for perception and planning modules</li>\n<li>Capacity to learn and grow individually, while mentoring junior team members effectively, building team cohesion and capacity</li>\n<li>Eligible to obtain and maintain an Australian Government Negative Vetting 2 security clearance (NV2)</li>\n</ul>\n<p><strong>Preferred Qualifications</strong></p>\n<ul>\n<li>Experience with autonomous systems in the ground, air, maritime or space domains</li>\n<li>Experience with simulation tools and frameworks, such as Gazebo, Unity, or Unreal Engine, for algorithm validation and testing</li>\n<li>Knowledge of safety standards and certification processes for autonomous systems</li>\n<li>Familiarity with System Engineering concepts</li>\n</ul>\n<p><strong>Salary and Benefits</strong></p>\n<p>The salary range for this role is an estimate based on a wide range of compensation factors, inclusive of base salary only. Actual salary offer may vary based on (but not limited to) work experience, education and/or training, critical skills, and/or business considerations. Highly competitive equity grants are included in the majority of full-time offers; and are considered part of Anduril&#39;s total compensation package. 
Additionally, Anduril offers top-tier benefits for full-time employees, including:</p>\n<ul>\n<li>Healthcare Benefits - US Roles: Comprehensive medical, dental, and vision plans at little to no cost to you.</li>\n<li>UK &amp; AUS Roles: We cover full cost of medical insurance premiums for you and your dependents.</li>\n<li>IE Roles: We offer an annual contribution toward your private health insurance for you and your dependents.</li>\n<li>Income Protection: Anduril covers life and disability insurance for all employees.</li>\n<li>Generous time off: Highly competitive PTO plans with a holiday hiatus in December.</li>\n<li>Caregiver &amp; Wellness Leave is available to care for family members, bond with a new baby, or address your own medical needs.</li>\n<li>Family Planning &amp; Parenting Support: Coverage for fertility treatments (e.g., IVF, preservation), adoption, and gestational carriers, along with resources to support you and your partner from planning to parenting.</li>\n<li>Mental Health Resources: Access free mental health resources 24/7, including therapy and life coaching.</li>\n<li>Additional work-life services, such as legal and financial support, are also available.</li>\n<li>Professional Development: Annual reimbursement for professional development.</li>\n<li>Commuter Benefits: Company-funded commuter benefits based on your region.</li>\n<li>Relocation Assistance: Available depending on role eligibility.</li>\n<li>Retirement Savings Plan - US Roles: Traditional 401(k), Roth, and after-tax (mega backdoor Roth) options.</li>\n<li>UK &amp; IE Roles: Pension plan with employer match.</li>\n<li>AUS Roles: Superannuation plan.</li>\n</ul>\n<p><strong>Protecting Yourself from Recruitment Scams</strong></p>\n<p>Anduril is committed to maintaining the highest standards of integrity and transparency in our recruitment processes.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_269373be-68a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/4961600007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["C++","Python","Autonomous vehicle hardware","Sensor fusion","SLAM","Motion planning","Machine learning"],"x-skills-preferred":["Simulation tools and frameworks","Safety standards and certification processes","System Engineering concepts"],"datePosted":"2026-04-18T15:50:19.825Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Sydney, New South Wales, Australia"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Autonomous vehicle hardware, Sensor fusion, SLAM, Motion planning, Machine learning, Simulation tools and frameworks, Safety standards and certification processes, System Engineering concepts"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4cbfcaea-677"},"title":"Project Delivery Lead, Battlespace","description":"<p>Anduril Industries is seeking a Project Delivery Lead to oversee the development, execution, and delivery of radar software and algorithms for critical programs. 
The successful candidate will have a strong technical background in software development, particularly in the domain of radar signal processing, sensor fusion, and tracking algorithms, and experience managing complex projects with multiple sub-teams and many stakeholders.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Developing and maintaining a detailed project and execution plan, including feature development, testing and validation, deployment, and monitoring.</li>\n<li>Establishing clear objectives and deliverables for the software that align with program needs.</li>\n<li>Maintaining a deep understanding of underlying algorithms and software interfaces for radar processing, target tracking, detection, classification, and sensor fusion to ensure that the relevant services integrate effectively with the larger system.</li>\n<li>Staying updated on the latest techniques to work with subject matter experts in incorporating relevant innovations into the project plan.</li>\n<li>Ensuring comprehensive documentation is maintained for the project, including technical specifications, user manuals, and project reports.</li>\n<li>Overseeing the planning and execution of the software release process, ensuring that deployment is carried out smoothly and efficiently.</li>\n<li>Serving as the primary point of contact for all project-related communications, internally across teams and externally with stakeholders.</li>\n</ul>\n<p>The ideal candidate will have 5+ years of experience and a Bachelor&#39;s or Master&#39;s degree in Computer Science, Electrical Engineering, or a related technical field. 
They will also have extensive project management experience, preferably in software development for radar systems or advanced signal processing algorithms.</p>\n<p>Salary: $111,000-$147,000 USD per year.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_4cbfcaea-677","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.andurilindustries.com/","logo":"https://logos.yubhub.co/andurilindustries.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5108263007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$111,000-$147,000 USD per year","x-skills-required":["Radar signal processing","Sensor fusion","Tracking algorithms","AI/ML technology","JIRA","Confluence","GitHub"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:50:08.625Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Fort Collins, Colorado, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Radar signal processing, Sensor fusion, Tracking algorithms, AI/ML technology, JIRA, Confluence, GitHub","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":111000,"maxValue":147000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_08a5f496-732"},"title":"Robotics Software Engineer","description":"<p>As a Robotics Software Engineer on our Tactical Recon &amp; Strike team, you&#39;ll be at the forefront of cutting-edge autonomous systems development. 
You&#39;ll tackle diverse challenges in autonomy, systems integration, robotics, and networking, making critical engineering decisions that directly impact mission success.</p>\n<p>Your role will be pivotal in ensuring Anduril&#39;s products work seamlessly together to achieve a variety of crucial outcomes. You&#39;ll develop innovative solutions for complex robotics problems, balance pragmatic engineering trade-offs with mission-critical requirements, and collaborate across teams to integrate software with hardware systems.</p>\n<p>Contributing to the entire product lifecycle, from concept to deployment, you&#39;ll rapidly prototype and iterate on software solutions. We&#39;re looking for someone who thrives in a fast-paced environment and isn&#39;t afraid to tackle ambiguous problems. Your &#39;Whatever It Takes&#39; mindset will be key in executing tasks efficiently, scalably, and pragmatically, always keeping the mission at the forefront of your work.</p>\n<p>This role offers the opportunity to make a significant impact on next-generation defence technology, working with state-of-the-art robotics and autonomous systems. You&#39;ll be part of a team that values innovation, quick iteration, and delivering high-quality solutions that meet real-world needs.</p>\n<p>Must be eligible to obtain and maintain an active U.S. Secret security clearance. 
This position will be located at our office in Atlanta, GA (relocation benefits provided.)</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Develop and maintain core robotics libraries, including frame transformations, targeting, and guidance systems, that will be utilized across all Anduril robotics platforms</li>\n</ul>\n<ul>\n<li>Lead the development and implementation of major features for our products, such as designing and building Software-in-the-Loop simulators for advanced systems like Altius</li>\n</ul>\n<ul>\n<li>Optimise performance of existing products, primarily focused on our Altius Drone product line</li>\n</ul>\n<ul>\n<li>Collaborate closely with hardware and manufacturing teams throughout the product development lifecycle, providing timely feedback to influence and enhance final hardware designs</li>\n</ul>\n<ul>\n<li>Troubleshoot and resolve complex issues in deployed systems, ensuring optimal performance in the field</li>\n</ul>\n<ul>\n<li>Contribute to the design and implementation of multi-agent coordination systems for UAVs</li>\n</ul>\n<ul>\n<li>Participate in the full software development lifecycle, from concept and design through testing and deployment</li>\n</ul>\n<ul>\n<li>Stay current with emerging technologies and industry trends, recommending and implementing innovations to improve our products and processes</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li>Bachelor&#39;s degree in Robotics, Computer Science, or related field</li>\n</ul>\n<ul>\n<li>3+ years of professional software development experience</li>\n</ul>\n<ul>\n<li>Strong proficiency in C++ or Rust, with experience in Linux development environments</li>\n</ul>\n<ul>\n<li>Demonstrated expertise in data structures, algorithms, concurrency, and code optimisation</li>\n</ul>\n<ul>\n<li>Proven experience troubleshooting and analysing remotely deployed software systems</li>\n</ul>\n<ul>\n<li>Hands-on experience working with and testing electrical and mechanical 
systems</li>\n</ul>\n<ul>\n<li>Ability to collaborate effectively with cross-functional teams, including hardware and manufacturing</li>\n</ul>\n<ul>\n<li>Strong problem-solving skills and a &#39;Whatever It Takes&#39; mindset</li>\n</ul>\n<ul>\n<li>Excellent communication skills, both written and verbal</li>\n</ul>\n<ul>\n<li>Eligible to obtain and maintain an active U.S. Secret security clearance</li>\n</ul>\n<ul>\n<li>Willingness to relocate to Atlanta, GA</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Master&#39;s or Ph.D. in a relevant field (e.g., Robotics, Computer Science, Electrical Engineering)</li>\n</ul>\n<ul>\n<li>Expertise in one or more advanced robotics areas: motion planning, perception, localisation, mapping, or controls</li>\n</ul>\n<ul>\n<li>Experience with performance optimisation and metrics for complex robotic systems</li>\n</ul>\n<ul>\n<li>Proficiency in Python, Rust, and/or Go, in addition to C++</li>\n</ul>\n<ul>\n<li>Hands-on experience programming for embedded systems and physical devices</li>\n</ul>\n<ul>\n<li>Background in multi-agent coordination, particularly with UAVs</li>\n</ul>\n<ul>\n<li>Demonstrated ability to solve complex frame transformation problems (e.g., target localisation, multi-degree-of-freedom robotic arms)</li>\n</ul>\n<ul>\n<li>Experience with real-time operating systems and distributed computing</li>\n</ul>\n<ul>\n<li>Familiarity with machine learning and AI applications in robotics</li>\n</ul>\n<ul>\n<li>Knowledge of sensor fusion techniques and implementation</li>\n</ul>\n<ul>\n<li>Understanding of aerodynamics and flight dynamics as applied to UAV systems</li>\n</ul>\n<ul>\n<li>Experience with simulation environments for robotics testing and development</li>\n</ul>\n<ul>\n<li>Track record of contributions to open-source robotics projects or relevant publications</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_08a5f496-732","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5078772007","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$165,000-$218,000 USD","x-skills-required":["C++","Rust","Linux development environments","Data structures","Algorithms","Concurrency","Code optimisation","Troubleshooting","Analysis","Electrical and mechanical systems","Collaboration","Problem-solving","Communication"],"x-skills-preferred":["Python","Go","Embedded systems","Physical devices","Multi-agent coordination","Motion planning","Perception","Localisation","Mapping","Controls","Performance optimisation","Real-time operating systems","Distributed computing","Machine learning","AI applications","Sensor fusion","Aerodynamics","Flight dynamics","Simulation environments"],"datePosted":"2026-04-18T15:49:37.735Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Atlanta, Georgia, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Rust, Linux development environments, Data structures, Algorithms, Concurrency, Code optimisation, Troubleshooting, Analysis, Electrical and mechanical systems, Collaboration, Problem-solving, Communication, Python, Go, Embedded systems, Physical devices, Multi-agent coordination, Motion planning, Perception, Localisation, Mapping, Controls, Performance optimisation, Real-time operating systems, Distributed computing, Machine learning, AI applications, Sensor fusion, Aerodynamics, Flight dynamics, Simulation 
environments","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":165000,"maxValue":218000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_817ba638-cb6"},"title":"Mission Operations Engineer","description":"<p>Job Title: Mission Operations Engineer</p>\n<p>We are seeking a highly motivated and experienced Mission Operations Engineer to join our team at Anduril Industries. As a Mission Operations Engineer, you will be responsible for managing the execution, growth, and success of our client accounts. You will work across product, engineering, sales, and logistics teams to develop, plan, and deploy Anduril products in support of client missions.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Serve as a customer-facing Account Manager, working directly with key strategic partners to develop and deploy Anduril&#39;s ALTIUS UAS and other TRS products.</li>\n<li>Own the execution of contract deliverables, including developing and implementing measures of performance and effectiveness, writing technical reports, and creating user engagement roadmaps.</li>\n<li>Support engagement with all service components of the US DOD, local government organizations, and other partner nation&#39;s defense forces as required to integrate Anduril products laterally for increased adoption.</li>\n<li>Orchestrate the deployment of Anduril products, leading the planning and deployment of Anduril&#39;s flagship products in support of client missions.</li>\n<li>Grow your business through client success, meeting commercial targets that drive top line growth.</li>\n<li>Shape our products to meet mission needs, working side-by-side with clients to understand their specific mission challenges and representing the client as you work with Anduril engineers to shape products that solve critical national and international security 
challenges.</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Mission First Mindset: We put the needs of the mission and our clients first and understand that the US and its allies have no preordained right to victory in any future conflict.</li>\n<li>Ownership Mentality: We&#39;re looking for owners; those who have a natural bias to assume responsibility, a healthy dose of skepticism, and contribute to a culture of performance.</li>\n<li>Technical Aptitude and Intellectual Curiosity: We are first and foremost a technology company, working at the leading edge of capabilities like machine learning, autonomy, distributed networking, and multi-modal sensor fusion.</li>\n<li>Structured Thinker and Problem Solver: Leading technical programs requires clear communication and well thought-out plans.</li>\n</ul>\n<p>What You&#39;ll Need:</p>\n<ul>\n<li>Bachelor&#39;s degree in a relevant field such as engineering, computer science, or a related field.</li>\n<li>5+ years of experience in a related field, preferably in a defense or aerospace industry.</li>\n<li>Strong technical skills, including experience with machine learning, autonomy, and distributed networking.</li>\n<li>Excellent communication and problem-solving skills.</li>\n<li>Ability to work in a fast-paced environment and prioritize multiple tasks and projects.</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Master&#39;s degree in a relevant field.</li>\n<li>Experience with Anduril&#39;s products and services.</li>\n<li>Familiarity with the US DOD and other partner nation&#39;s defense forces.</li>\n<li>Experience working in a defense or aerospace industry.</li>\n</ul>\n<p>Salary Range: Not specified</p>\n<p>Required Skills:</p>\n<ul>\n<li>Mission Operations</li>\n<li>Customer Service</li>\n<li>Contract Management</li>\n<li>Technical Writing</li>\n<li>User Engagement</li>\n<li>Product Development</li>\n<li>Project Management</li>\n<li>Communication</li>\n<li>Problem-Solving</li>\n</ul>\n<p>Preferred 
Skills:</p>\n<ul>\n<li>Machine Learning</li>\n<li>Autonomy</li>\n<li>Distributed Networking</li>\n<li>Multi-Modal Sensor Fusion</li>\n<li>Anduril Products and Services</li>\n<li>US DOD and Partner Nation&#39;s Defense Forces</li>\n<li>Defense and Aerospace Industry</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_817ba638-cb6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5100663007","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Mission Operations","Customer Service","Contract Management","Technical Writing","User Engagement","Product Development","Project Management","Communication","Problem-Solving"],"x-skills-preferred":["Machine Learning","Autonomy","Distributed Networking","Multi-Modal Sensor Fusion","Anduril Products and Services","US DOD and Partner Nation's Defense Forces","Defense and Aerospace Industry"],"datePosted":"2026-04-18T15:48:01.924Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Atlanta, Georgia, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Mission Operations, Customer Service, Contract Management, Technical Writing, User Engagement, Product Development, Project Management, Communication, Problem-Solving, Machine Learning, Autonomy, Distributed Networking, Multi-Modal Sensor Fusion, Anduril Products and Services, US DOD and Partner Nation's Defense Forces, Defense and Aerospace 
Industry"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3d163d40-284"},"title":"Mission Operations Engineer, Connected Warfare (Active Clearance)","description":"<p>Orchestrate the deployment of Anduril products.</p>\n<p>Lead the planning and deployment of Anduril&#39;s flagship hardware and software products in support of customer missions.</p>\n<p>Through collaboration with our engineering, logistics, and technical operations teams, Mission Operators are in charge of deploying our products to the field, training end users, and ensuring the high performance of our products.</p>\n<p>We are obsessed about engagement and keeping our finger on the pulse of how customers interact with our products.</p>\n<p>This helps us improve the user experience and evolve their operational tactics and techniques.</p>\n<p>And if something isn&#39;t quite right, we work with product teams to quickly address the issue on behalf of our customer.</p>\n<p>Grow your business through customer success.</p>\n<p>In addition to the operational and executive relationships you own, you will be responsible for meeting commercial targets that drive top line growth.</p>\n<p>This requires establishing committed and trust-based partnerships with our customers to not only help them solve their immediate problems, but also to foresee future opportunities where Anduril&#39;s technology can make them more effective.</p>\n<p>Understanding Anduril&#39;s value proposition and articulating how we can have an impact on a customer&#39;s problem set, from the executive level to the end user, is critical to positioning Anduril for the future.</p>\n<p>Shape our products to meet mission needs.</p>\n<p>Work side-by-side with our customers to understand their specific mission challenges and represent the customer as you work with Anduril engineers to shape products that solve critical national and international security 
challenges.</p>\n<p>Anduril engineers rely on Mission Operators to understand the lay of the land and bring a perspective that informs the product development process.</p>\n<p>Collaborate across nearly every Anduril team.</p>\n<p>Successfully deploying our complex hardware and software products requires collaborating with nearly every internal Anduril team - business development, engineering, product, logistics, finance, legal, contracting, technical operations, and many more.</p>\n<p>You&#39;ll be responsible for ensuring each party is engaged and informed, and will therefore become well-versed in what it takes to really bring the best of Anduril to our customers.</p>\n<p>Mission First Mindset.</p>\n<p>We put the needs of the mission and our customers first and understand that the US and its allies have no preordained right to victory in any future conflict.</p>\n<p>We actively seek out opportunities to better understand our customers&#39; mission needs, with a goal to build the best possible products for the warfighter and radically evolve our national and international defense capabilities.</p>\n<p>Ownership Mentality.</p>\n<p>We&#39;re looking for owners; those who have a natural bias to assume responsibility, a healthy dose of skepticism, and contribute to a culture of performance.</p>\n<p>We empower our Mission Operators to quickly assume high levels of responsibility and entrust them to actively own their account and shape its success.</p>\n<p>Our Mission Operators are just as comfortable helping to shape a proposal or pitch a new customer as they are working with engineers on new features or deploying products in the field.</p>\n<p>Technical Aptitude and Intellectual Curiosity.</p>\n<p>We are first and foremost a technology company, working at the leading edge of capabilities like machine learning, autonomy, distributed networking, and multi-modal sensor fusion.</p>\n<p>Do you have a natural desire to see beyond simple cause and effect relationships to 
really understand how complex systems operate?</p>\n<p>Do you actively seek out opportunities to educate yourself?</p>\n<p>These capabilities form the foundation of our product offerings, so we are looking for those who are comfortable communicating technical concepts and can articulate them in the context of the customer&#39;s mission.</p>\n<p>Structured Thinker and Problem Solver.</p>\n<p>Leading technical programs requires clear communication and well thought-out plans.</p>\n<p>We are looking for assertive communicators who bring a thoughtful perspective and honest dialogue.</p>\n<p>A solution-driven approach, the ability to distill complexity into its essential, digestible components, and the ability to supervise execution are critical to ensure mission success.</p>\n<p>High Value Team Member.</p>\n<p>Do you enjoy working as part of a team?</p>\n<p>At Anduril, the path to success weaves through the organization - account teams, engineering, logistics, technical operations, and more.</p>\n<p>It requires humility, an eagerness to learn, and empathy toward your fellow team member.</p>\n<p>We assume best intentions and empathize with customers just as we do with our colleagues.</p>\n<p>Can you develop rapport at both the user and executive levels?</p>\n<p>Do you enjoy sharing successes?</p>\n<p>We are a low ego group that promotes teamwork and collaboration to bring about success.</p>\n<p>360 Degree Leader.</p>\n<p>We are a flat, non-hierarchical organization with a matrix structure that is dynamic and evolving.</p>\n<p>We expect you to be self-aware enough to recognize your position of unstated leadership within the company.</p>\n<p>You should have a demonstrated ability to lead and influence people through both formal and informal constructs and in situations where there is lack of clarity across teams.</p>\n<p>Finally, we need people who identify gaps and breakdowns and solve them immediately, rather than point them out and step back.</p>\n<p>You should also 
seek 360 feedback to help inform your growth trajectory as a leader.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3d163d40-284","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.andurilindustries.com/","logo":"https://logos.yubhub.co/andurilindustries.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/4159543007","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Machine Learning","Autonomy","Distributed Networking","Multi-modal Sensor Fusion","Customer Success","Product Development","Technical Operations","Business Development","Engineering","Logistics","Finance","Legal","Contracting"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:47:47.886Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Machine Learning, Autonomy, Distributed Networking, Multi-modal Sensor Fusion, Customer Success, Product Development, Technical Operations, Business Development, Engineering, Logistics, Finance, Legal, Contracting"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_a8eb1f8d-988"},"title":"Mission Operations Engineer","description":"<p>We are seeking a skilled Mission Operations Engineer to join our team. As a Mission Operations Engineer, you will be responsible for managing the execution, growth, and success of our customer accounts. 
You will work closely with product, engineering, sales, and logistics teams to develop, plan, and deploy Anduril products in support of customer missions.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Serving as a project manager/on-site lead working directly with US military, DHS, and allied customers to deliver Anduril&#39;s radar and integrated sensor capabilities across multiple locations</li>\n<li>Leading a small team of engineers through site standups from initial radar integration through operational acceptance while interfacing with strategic end-users throughout the process</li>\n<li>Monitoring radar system performance, connectivity, integration with mesh networks, and fusion with other sensors (EO/IR, RF) and C2 systems at distributed sites and with partner organisations</li>\n<li>Gathering, triaging, and prioritising user feedback on radar detection performance, tracking accuracy, classification capabilities, issues, and errors found during fielding, testing, exercises, and normal use within operational schedules</li>\n<li>Providing input to account and engineering teams on radar features, detection algorithms, and UI/UX improvements that enhance or greatly improve efficiency of operator workflows</li>\n<li>Supporting engagement with other units, commands, and organisations for increased deployment and installation of our radar systems across military installations, critical infrastructure sites, and forward operating locations</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>5+ years of experience and a Bachelor&#39;s degree in electrical engineering, electronics engineering, computer engineering or related field</li>\n<li>Willingness to travel 50% - 75%</li>\n<li>Eligibility to obtain and maintain an active US Secret security clearance</li>\n</ul>\n<p>Preferred qualifications include familiarity and experience working with materiel leadership or mission partners within Army Air &amp; Missile Defense, USMC Ground Based Air Defence, SOCOM, DHS 
CBP, or allied defence organisations and working knowledge of the counter-UAS and air surveillance competitive landscape.</p>","url":"https://yubhub.co/jobs/job_a8eb1f8d-988","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5056328007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$98,000-$129,000 USD","x-skills-required":["electrical engineering","electronics engineering","computer engineering","radar systems","sensor fusion","AI-powered technology"],"x-skills-preferred":["materiel leadership","mission partners","counter-UAS","air surveillance"],"datePosted":"2026-04-18T15:46:59.104Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Broomfield, Colorado, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"electrical engineering, electronics engineering, computer engineering, radar systems, sensor fusion, AI-powered technology, materiel leadership, mission partners, counter-UAS, air surveillance","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":98000,"maxValue":129000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8fa7eb38-b7e"},"title":"Guidance, Navigation and Control (GNC) Engineer -  Tactical Reconnaissance & Strike","description":"<p>We are seeking a Guidance, Navigation and Control (GNC) Engineer to join our Tactical Recon &amp; Strike team. 
As a GNC Engineer, you will be responsible for developing guidance algorithms and state estimators for group 1-3 UAV platforms. You will work closely with the flight software and computer vision team to define requirements, develop GNC software and validate functionality in simulation on aggressive timelines.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Develop guidance algorithms (terminal and midcourse) and state estimators for group 1-3 UAV platforms</li>\n<li>Work closely with the flight software and computer vision team to define requirements, develop GNC software and validate functionality in simulation on aggressive timelines</li>\n<li>Review test data captured from the flight control system, subsystems, and other test instrumentation to verify vehicle performance, evaluate GNC algorithm/controller behaviour, or debug incidents during testing or from fielded assets</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Bachelor&#39;s degree in Robotics, Mechanical Engineering, Electrical Engineering, Aerospace Engineering, or a related field with a focus on dynamic systems and control</li>\n<li>3+ years professional experience in a GNC role, preferably focusing on missile or weapons systems</li>\n<li>Experience with tactical guidance algorithms and target state estimation development</li>\n<li>Experience with state estimation and filtering (Kalman filtering, sensor fusion, complementary filters, etc.)</li>\n<li>Experience in modelling and simulation of linear and nonlinear dynamic systems and model linearisation</li>\n<li>Experience coding in Matlab and Simulink</li>\n<li>C/C++ Proficiency</li>\n<li>Eligible to obtain and maintain an active U.S. 
Secret security clearance</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Master&#39;s Degree or PhD in Robotics, Mechanical Engineering, Electrical Engineering, Aerospace Engineering, or a related field with a focus on dynamic systems and control</li>\n<li>5+ years professional experience in a GNC role, preferably focusing on missile or weapons systems</li>\n<li>Experience with flight test and debugging GNC on UAV platforms</li>\n<li>Experience with GNC design and analysis for fixed wing, rotary wing aircraft and/or tactical missile systems</li>\n<li>Experience with seeker integration</li>\n<li>Experience in one or more of the following: VIO, TERCOM, gimbal mount models, detection and tracking, LWIR/EO sensors, RADAR, target motion models</li>\n<li>Experience with Simulink Embedded Coder</li>\n</ul>","url":"https://yubhub.co/jobs/job_8fa7eb38-b7e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.andurilindustries.com/","logo":"https://logos.yubhub.co/andurilindustries.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5060107007","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$166,000-$220,000 USD","x-skills-required":["Matlab","Simulink","C/C++","Kalman filtering","Sensor fusion","Complementary filters","Model linearisation","Tactical guidance algorithms","Target state estimation development","State estimation and filtering"],"x-skills-preferred":["VIO","TERCOM","Gimbal mount models","Detection and tracking","LWIR/EO sensors","RADAR","Target motion models","Simulink Embedded Coder"],"datePosted":"2026-04-18T15:43:51.785Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United 
States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Matlab, Simulink, C/C++, Kalman filtering, Sensor fusion, Complementary filters, Model linearisation, Tactical guidance algorithms, Target state estimation development, State estimation and filtering, VIO, TERCOM, Gimbal mount models, Detection and tracking, LWIR/EO sensors, RADAR, Target motion models, Simulink Embedded Coder","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":166000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_baf3d89c-84b"},"title":"Senior Manager, Perception","description":"<p>As a member of the HMS Perception team, you will conduct software development at the intersection of classical state estimation techniques, sensor fusion, artificial intelligence, machine learning, and machine perception. You will develop cutting-edge technology onto real hardware that provides robust and accurate estimates of vehicle pose and surroundings for real missions.</p>\n<p>Shield AI is pushing the envelope by applying advanced AI solutions to real hardware systems. An ideal candidate should aspire to be a part of this industry-changing team developing and deploying advanced technology that can truly make an impact.</p>\n<p>We are seeking a skilled and motivated leader with 10+ years of experience to manage a technical team supporting the development, integration, and testing of perception algorithms for advanced aerospace, defense, and robotics systems. 
In this role, you will contribute to implementing and integrating innovative perception solutions while collaborating with a multidisciplinary team of engineers to meet challenging operational requirements.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution. Balance hands-on technical oversight with performance optimization, innovation, and clear stakeholder communication.</li>\n<li>Write production quality software in C++</li>\n<li>Produce an Assured Position, Navigation, and Timing (A-PNT) system to enable reliable autonomy in GNSS-degraded or denied environments</li>\n<li>Extend and specialize Shield AI’s state-of-the-art state estimation framework for new sensors, platforms, and missions</li>\n<li>Write test code to validate your software with simulated and real-world data</li>\n<li>Collaborate with hardware and test teams to validate algorithms/code on aerial platforms</li>\n<li>Write analyzers to ingest data and produce statistics to validate code quality</li>\n<li>Enhance sensor models within a high-fidelity simulation environment</li>\n<li>Work in a fast-paced, collaborative, continuous development environment, enhancing analysis and benchmarking capabilities</li>\n</ul>","url":"https://yubhub.co/jobs/job_baf3d89c-84b","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield 
AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/501e3703-1a63-4773-b961-6029e5fb71d6","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$229,233 - $343,849 a year","x-skills-required":["C++","Sensor fusion","Artificial intelligence","Machine learning","Machine perception","Kalman Filter","Factor Graphs","Computer Vision","OpenCV","Unix environments"],"x-skills-preferred":["Robotics technologies","Unmanned system technologies","High-fidelity simulation","Sensor modeling"],"datePosted":"2026-04-17T13:05:41.027Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Sensor fusion, Artificial intelligence, Machine learning, Machine perception, Kalman Filter, Factor Graphs, Computer Vision, OpenCV, Unix environments, Robotics technologies, Unmanned system technologies, High-fidelity simulation, Sensor modeling","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":229233,"maxValue":343849,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_415fa450-752"},"title":"Senior Manager, Software - Autonomous Aircraft Integration","description":"<p>This position is ideal for an individual who thrives on solving complex integration challenges that span hardware, software, and systems engineering. 
As a Senior Manager, Software - Autonomous Aircraft Integration, you will lead technical teams and support direct projects integrating autonomy solutions for defense platforms.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Leading teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>\n<li>Integrating autonomy software onto unmanned aircraft systems, ensuring seamless operation across onboard compute, sensors, and control interfaces.</li>\n<li>Owning the build, configuration, and validation process for flight-ready systems; coordinating hardware/software compatibility and mission readiness.</li>\n<li>Traveling to test sites and supporting live flight operations, including safety checks, system bring-up, and troubleshooting under time-critical constraints.</li>\n<li>Diagnosing and resolving integration issues across complex autonomy software stacks and embedded systems in lab and field environments.</li>\n<li>Managing data collection during missions and post-test analysis, working with autonomy engineers to refine behaviors and identify improvements.</li>\n<li>Collaborating with autonomy, GNC, systems, and test teams to ensure mission-critical functionality is delivered on time and validated thoroughly.</li>\n</ul>\n<p>Required qualifications include:</p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience.</li>\n<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience.</li>\n<li>7+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D.</li>\n<li>2+ years of people leadership experience.</li>\n<li>Proficiency in programming languages such as C++ and Python, and familiarity with real-time 
operating systems (RTOS).</li>\n<li>Proficiency in Linux-based development and experience working with embedded systems, shell scripting, and system diagnostics.</li>\n<li>Knowledge of sensor integration, sensor fusion, and middleware frameworks (e.g., ROS, DDS).</li>\n<li>Hands-on experience supporting flight demos or live exercises.</li>\n<li>Experience with simulation tools and environments (e.g., AFSIM, NGTS) for testing and validation.</li>\n<li>Strong problem-solving skills, with the ability to troubleshoot and optimize system performance.</li>\n<li>Excellent communication and teamwork skills, with the ability to work effectively in a collaborative, multidisciplinary environment.</li>\n<li>Ability to obtain a SECRET clearance.</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Direct experience supporting unmanned aerial systems or similar flight test campaigns.</li>\n<li>Familiarity with autonomy stacks, flight control systems, or GNC pipelines.</li>\n<li>Competence in sensor integration, electronics debugging, or avionics bring-up.</li>\n<li>Proficiency in developing automation tools for system testing, logging, and data parsing.</li>\n<li>Comfortable interfacing with DoD stakeholders during field events or technical reviews.</li>\n<li>Experience with UCI and OMS Standards.</li>\n</ul>","url":"https://yubhub.co/jobs/job_415fa450-752","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/53d404c6-d2cd-4b97-934f-7b17b2a76768","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$229,233 - $343,849 a year","x-skills-required":["C++","Python","Real-time operating systems (RTOS)","Linux-based development","Embedded 
systems","Shell scripting","System diagnostics","Sensor integration","Sensor fusion","Middleware frameworks (e.g., ROS, DDS)","Flight demos","Live exercises","Simulation tools and environments (e.g., AFSIM, NGTS)"],"x-skills-preferred":["Autonomy stacks","Flight control systems","GNC pipelines","Electronics debugging","Avionics bring-up","Automation tools","System testing","Logging","Data parsing","UCI and OMS Standards"],"datePosted":"2026-04-17T13:04:50.550Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Real-time operating systems (RTOS), Linux-based development, Embedded systems, Shell scripting, System diagnostics, Sensor integration, Sensor fusion, Middleware frameworks (e.g., ROS, DDS), Flight demos, Live exercises, Simulation tools and environments (e.g., AFSIM, NGTS), Autonomy stacks, Flight control systems, GNC pipelines, Electronics debugging, Avionics bring-up, Automation tools, System testing, Logging, Data parsing, UCI and OMS Standards","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":229233,"maxValue":343849,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_98550091-2de"},"title":"Staff Engineer, State Estimation","description":"<p>As a Staff State Estimation Engineer, you will play a critical role on the GNC team, contributing to the development, optimisation, and deployment of advanced sensor fusion and navigation algorithms for autonomous UAV operations in dynamic and contested environments.</p>\n<p>Your work will support the transition of cutting-edge research into fielded capabilities, helping Shield AI deliver precision navigation solutions for mission-critical 
applications.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop and implement real-time state estimation algorithms including inertial navigation, sensor fusion, and alternative navigation methods for GPS-denied or degraded environments.</li>\n<li>Integrate data from IMUs, GNSS receivers, visual odometry, magnetometers, barometers, and radar into robust estimation frameworks.</li>\n<li>Design sensor processing pipelines focused on accuracy, robustness, and system-level fault tolerance.</li>\n<li>Collaborate with autonomy, software, and hardware teams to ensure end-to-end integration of navigation and PNT systems.</li>\n<li>Conduct simulation, lab testing, and field trials to evaluate algorithm performance under real-world conditions.</li>\n<li>Stay current on advancements in state estimation and navigation technologies and help adapt new innovations into deployable solutions.</li>\n</ul>\n<p><strong>Qualifications:</strong></p>\n<ul>\n<li>Typically requires a minimum of 7 years of relevant experience with a bachelor’s degree; or 6 years with a master’s degree; or 4 years with a PhD; or equivalent practical experience.</li>\n<li>Experience developing and deploying real-time navigation or sensor fusion algorithms using IMUs, GPS, or other sensors.</li>\n<li>Strong understanding of filtering and estimation techniques (e.g., Kalman filters, EKF, UKF, particle filters).</li>\n<li>Proficient in C++11 or newer in real-time environments.</li>\n<li>Comfortable working in Linux, with experience using standard command-line tools and scripting.</li>\n<li>Strong written and verbal communication skills with a collaborative mindset.</li>\n<li>Demonstrated success working in fast-paced development cycles and delivering high-quality results.</li>\n</ul>\n<p><strong>Salary:</strong></p>\n<p>$187,531 - $281,297 a year</p>
","url":"https://yubhub.co/jobs/job_98550091-2de","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/f8849287-b9ff-4c3e-a37f-be20e39c597b","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$187,531 - $281,297 a year","x-skills-required":["state estimation","sensor fusion","inertial navigation","Kalman filters","C++11","Linux"],"x-skills-preferred":["visual odometry","computer vision","CUDA","hardware acceleration"],"datePosted":"2026-04-17T13:04:28.903Z","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"state estimation, sensor fusion, inertial navigation, Kalman filters, C++11, Linux, visual odometry, computer vision, CUDA, hardware acceleration","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":187531,"maxValue":281297,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5f911dd8-860"},"title":"Senior Staff Engineer, Software - Perception","description":"<p>This role is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. 
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop advanced perception algorithms: Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks: Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities: Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs: Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance: Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration: Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings: Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and 
mission readiness.</li>\n<li>Collaborate with hardware and sensor teams: Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing: Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement: Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>\n<li>Ability to obtain a SECRET clearance</li>\n</ul>\n<p><strong>Preferences:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks</li>\n<li>Experience deploying perception software on 
SWaP-constrained platforms</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments</li>\n<li>Understanding of sensing challenges in denied or degraded conditions</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms</li>\n</ul>","url":"https://yubhub.co/jobs/job_5f911dd8-860","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/5cf8609e-ce9a-47e9-8956-00dae756e406","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$220,800 - $331,200 a year","x-skills-required":["algorithm development","sensor fusion","state estimation","Kalman Filters","multi-target tracking","deep learning-based detection models","probabilistic or rule-based approaches","SLAM","visual-inertial odometry","sensor-fused localization","version control","debugging","test-driven development"],"x-skills-preferred":["hands-on integration with airborne sensing systems","ML frameworks such as PyTorch or Tensorflow","perception software deployment on SWaP-constrained platforms","validating perception systems during flight test events or operational environments","sensing challenges in denied or degraded conditions","perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:03:35.432Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"algorithm development, sensor fusion, state estimation, Kalman Filters, multi-target tracking, deep learning-based detection 
models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, version control, debugging, test-driven development, hands-on integration with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, perception software deployment on SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":220800,"maxValue":331200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_006babdb-38a"},"title":"Principal Engineer, State Estimation","description":"<p>We are looking for an experienced state estimation engineer to design, develop, and support the deployment of safety-critical navigation solutions for aerospace platforms. This role requires deep expertise in state estimation theory, extensive practical experience implementing ownship navigation systems in the aerospace domain, and strong software development skills for production-quality navigation code.</p>\n<p><strong>Navigation System Architecture &amp; Design:</strong></p>\n<p>Establish navigation performance requirements and error budgets for safety-critical applications. Support decomposition of navigation requirements into allocations for sensors, estimation algorithms, and software components. Design detailed software architecture for state estimation implementations, including module interfaces and data flow.</p>\n<p><strong>Algorithm Development &amp; Implementation:</strong></p>\n<p>Design and implement Extended Kalman Filter algorithms for navigation applications, with broad understanding of state estimation theory and alternative filtering approaches. 
Implement tightly-coupled and loosely-coupled GNSS/INS integration algorithms. Integrate diverse sensing modalities (vision, RF, celestial, etc.) into a multi-sensor fusion framework for GPS-degraded environments. Develop fault detection, isolation, and recovery (FDIR) strategies for navigation systems. Implement integrity monitoring and protection level calculations for safety-critical operations.</p>\n<p><strong>Certification &amp; Verification:</strong></p>\n<p>Develop verification and validation test plans for navigation algorithms. Conduct performance analysis including Monte Carlo simulation, covariance analysis, and flight test data evaluation. Document navigation system design, requirements allocation, and compliance evidence. Support safety assessment activities including failure modes and effects analysis.</p>\n<p><strong>Technical Leadership:</strong></p>\n<p>Provide technical guidance on navigation architecture and state estimation approaches. Support trade studies evaluating navigation sensor suites and fusion strategies. 
Mentor junior engineers on state estimation theory and implementation.</p>\n<p><strong>Required qualifications:</strong></p>\n<ul>\n<li>MS or PhD in Computer Science, Software Engineering, Electrical Engineering, Aerospace Engineering, Mechanical Engineering, Applied Mathematics, or related field</li>\n<li>15+ years of experience developing state estimation algorithms for aerospace navigation applications</li>\n<li>Demonstrated experience implementing GNSS/INS integration solutions</li>\n<li>Experience with safety-critical system development and certification processes</li>\n<li>Deep understanding of state estimation theory</li>\n<li>Experience implementing multi-sensor fusion algorithms in production systems</li>\n<li>Strong background in inertial navigation, GNSS positioning, and sensor error modeling</li>\n<li>Strong programming skills in C/C++ and Python/MATLAB for algorithm development and analysis</li>\n<li>Understanding of integrity monitoring, protection levels, and safety assessment methods</li>\n<li>Experience with requirements management and verification/validation processes for certifiable systems</li>\n<li>Understanding of GPS/GNSS signal structure, error sources, and performance characteristics</li>\n<li>Knowledge of IMU error models, calibration, and Allan variance analysis</li>\n<li>Familiarity with alternative navigation sensors (camera, RF ranging, celestial, etc.)</li>\n<li>Understanding of navigation performance metrics (accuracy, integrity, continuity, availability)</li>\n</ul>\n<p><strong>Preferred qualifications:</strong></p>\n<ul>\n<li>PhD in relevant field with focus on state estimation or navigation</li>\n<li>Direct involvement in certified navigation system development from requirements through flight test</li>\n<li>Experience with specialized navigation approaches (vision-aided navigation, terrain-referenced navigation, celestial navigation, etc.)</li>\n<li>Publications or patents in navigation or state estimation</li>\n<li>Experience 
with GPS/GNSS jamming and spoofing mitigation techniques</li>\n<li>Clearance eligible or active security clearance</li>\n</ul>\n<p>The salary for this position is $270,000 - $400,000 a year.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_006babdb-38a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/4ccdcce2-ce09-4180-aba4-01f3e405e0e5","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$270,000 - $400,000 a year","x-skills-required":["state estimation theory","GNSS/INS integration","safety-critical system development","multi-sensor fusion algorithms","inertial navigation","GPS/GNSS signal structure","IMU error models","alternative navigation sensors","navigation performance metrics"],"x-skills-preferred":["PhD in relevant field","certified navigation system development","specialized navigation approaches","publications or patents","GPS/GNSS jamming and spoofing mitigation techniques"],"datePosted":"2026-04-17T13:03:22.503Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"state estimation theory, GNSS/INS integration, safety-critical system development, multi-sensor fusion algorithms, inertial navigation, GPS/GNSS signal structure, IMU error models, alternative navigation sensors, navigation performance metrics, PhD in relevant field, certified navigation system development, specialized navigation approaches, publications or patents, GPS/GNSS jamming and spoofing mitigation 
techniques","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":270000,"maxValue":400000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_c174eeee-910"},"title":"Staff Engineer, Software - Autonomous Aircraft Integration","description":"<p>This position is ideal for an individual who thrives on solving complex integration challenges that span hardware, software, and systems engineering. As a Staff Engineer, Software - Autonomous Aircraft Integration, you will be skilled at deploying autonomy solutions onto unmanned platforms, preparing systems for flight, and troubleshooting mission-critical issues in both lab and field environments.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Flight Integration Engineers are essential to bridging the gap between R&amp;D and deployment, ensuring that autonomous systems function reliably and effectively when and where they are needed most.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li><strong>System Integration &amp; Deployment</strong>: Integrate autonomy software onto unmanned aircraft systems, ensuring seamless operation across onboard compute, sensors, and control interfaces.</li>\n</ul>\n<ul>\n<li><strong>Pre-flight Preparation</strong>: Own the build, configuration, and validation process for flight-ready systems; coordinate hardware/software compatibility and mission readiness.</li>\n</ul>\n<ul>\n<li><strong>On-site Flight Test Support</strong>: Travel to test sites and support live flight operations, including safety checks, system bring-up, and troubleshooting under time-critical constraints.</li>\n</ul>\n<ul>\n<li><strong>Hardware/Software Debugging</strong>: Diagnose and resolve integration issues across complex autonomy software stacks and embedded systems in lab and field environments.</li>\n</ul>\n<ul>\n<li><strong>Flight Data Capture &amp; Analysis</strong>: Manage data collection during missions and post-test analysis, working with autonomy engineers to refine behaviors and identify improvements.</li>\n</ul>\n<ul>\n<li><strong>Collaboration Across Teams</strong>: Work closely with autonomy, GNC, systems, and test teams to ensure mission-critical functionality is delivered on time and validated thoroughly.</li>\n</ul>\n<ul>\n<li><strong>Continuous Improvement</strong>: Build tools and processes to improve integration timelines, flight test reliability, and team efficiency across deployment cycles.</li>\n</ul>\n<ul>\n<li><strong>Support Certification and Compliance</strong>: Assist with documentation and system-level validation required for certification, airworthiness, and compliance in defense-relevant environments.</li>\n</ul>\n<ul>\n<li><strong>Travel Requirement</strong>: Members of this team typically travel around 30-40% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n</ul>\n<ul>\n<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>\n</ul>\n<ul>\n<li>Proficiency in programming languages such as C++ and Python, and familiarity with real-time operating systems (RTOS)</li>\n</ul>\n<ul>\n<li>Proficiency in Linux-based development and experience working with embedded systems, shell scripting, and system diagnostics</li>\n</ul>\n<ul>\n<li>Knowledge of sensor integration, sensor fusion, and middleware frameworks (e.g., ROS, DDS)</li>\n</ul>\n<ul>\n<li>Hands-on experience supporting flight 
demos or live exercises</li>\n</ul>\n<ul>\n<li>Experience with simulation tools and environments (e.g., AFSIM, NGTS) for testing and validation</li>\n</ul>\n<ul>\n<li>Strong problem-solving skills, with the ability to troubleshoot and optimize system performance</li>\n</ul>\n<ul>\n<li>Excellent communication and teamwork skills, with the ability to work effectively in a collaborative, multidisciplinary environment</li>\n</ul>\n<ul>\n<li>Ability to obtain a SECRET clearance</li>\n</ul>\n<p><strong>Preferred Qualifications:</strong></p>\n<ul>\n<li>Direct experience supporting unmanned aerial systems or similar flight test campaigns</li>\n</ul>\n<ul>\n<li>Familiarity with autonomy stacks, flight control systems, or GNC pipelines</li>\n</ul>\n<ul>\n<li>Competence in sensor integration, electronics debugging, or avionics bring-up</li>\n</ul>\n<ul>\n<li>Proficiency in developing automation tools for system testing, logging, and data parsing</li>\n</ul>\n<ul>\n<li>Comfortable interfacing with DoD stakeholders during field events or technical reviews</li>\n</ul>\n<ul>\n<li>Experience with UCI and OMS Standards</li>\n</ul>\n<p>$182,720 - $274,080 a year</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_c174eeee-910","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/6265ee65-8136-41b5-9279-97f9a4b1d2f6","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$182,720 - $274,080 a year","x-skills-required":["C++","Python","Real-time operating systems (RTOS)","Linux-based development","Embedded systems","Shell scripting","System diagnostics","Sensor integration","Sensor fusion","Middleware frameworks (e.g., ROS, DDS)","Simulation tools and environments 
(e.g., AFSIM, NGTS)"],"x-skills-preferred":["Direct experience supporting unmanned aerial systems or similar flight test campaigns","Familiarity with autonomy stacks, flight control systems, or GNC pipelines","Competence in sensor integration, electronics debugging, or avionics bring-up","Proficiency in developing automation tools for system testing, logging, and data parsing","Comfortable interfacing with DoD stakeholders during field events or technical reviews","Experience with UCI and OMS Standards"],"datePosted":"2026-04-17T13:03:18.853Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC / Boston, MA / San Diego, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Real-time operating systems (RTOS), Linux-based development, Embedded systems, Shell scripting, System diagnostics, Sensor integration, Sensor fusion, Middleware frameworks (e.g., ROS, DDS), Simulation tools and environments (e.g., AFSIM, NGTS), Direct experience supporting unmanned aerial systems or similar flight test campaigns, Familiarity with autonomy stacks, flight control systems, or GNC pipelines, Competence in sensor integration, electronics debugging, or avionics bring-up, Proficiency in developing automation tools for system testing, logging, and data parsing, Comfortable interfacing with DoD stakeholders during field events or technical reviews, Experience with UCI and OMS Standards","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":182720,"maxValue":274080,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_947bfaeb-58e"},"title":"Senior Manager, Autonomy Integration (SD/DC)","description":"<p>We are seeking a skilled and motivated Autonomy Integration Engineer with 10+ years of experience to manage a technical team 
supporting the development, integration, and testing of autonomy algorithms for advanced aerospace, defense, and robotics systems.</p>\n<p>This position offers an exciting opportunity to apply your technical expertise in autonomy and system integration while growing your skills in a fast-paced and collaborative environment. You will work on cutting-edge projects that advance the capabilities of autonomous systems and ensure their seamless integration into real-world platforms.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>\n<li>Assist in developing autonomy algorithms for path planning, decision-making, perception, and control to meet system performance and mission requirements.</li>\n<li>Support the integration of autonomy algorithms with platform hardware and software, including sensors, control systems, and communication interfaces.</li>\n<li>Conduct simulation-based testing (HIL/SIL) and participate in live system tests to validate autonomy functionality and identify areas for improvement.</li>\n<li>Work closely with software developers, systems engineers, and testing teams to ensure alignment of autonomy capabilities with overall system architecture and mission goals.</li>\n<li>Debug and optimize autonomy systems to enhance performance, reliability, and adaptability.</li>\n<li>Contribute to technical documentation, test reports, and presentations for stakeholders and team members.</li>\n<li>Stay current on emerging trends and technologies in robotics, autonomy, and AI to incorporate best practices into development efforts.</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>Bachelor’s Degree in Computer Science, Electrical Engineering, Aerospace Engineering, Robotics, or a related field (Master’s preferred).</li>\n<li>10+ years of experience developing and integrating 
autonomy algorithms in robotics, aerospace, or defense systems.</li>\n<li>2+ years managing technical teams.</li>\n<li>Proficiency in programming languages such as C++ and Python, and familiarity with real-time operating systems (RTOS).</li>\n<li>Knowledge of sensor integration, sensor fusion, and middleware frameworks (e.g., ROS, DDS).</li>\n<li>Experience with simulation tools and environments (e.g., AFSIM, NGTS) for testing and validation.</li>\n<li>Strong problem-solving skills, with the ability to troubleshoot and optimize system performance.</li>\n<li>Excellent communication and teamwork skills, with the ability to work effectively in a collaborative, multidisciplinary environment.</li>\n<li>Ability to obtain a SECRET clearance.</li>\n</ul>\n<p><strong>Preferred Qualifications:</strong></p>\n<ul>\n<li>Familiarity with open architecture standards (e.g., OMS/UCI or FACE) and regulatory standards (e.g., DO-178C).</li>\n<li>Active SECRET Clearance.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_947bfaeb-58e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/5f3cd249-0401-4a28-b830-55ed0a2a3222","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$229,233 - $343,849 a year","x-skills-required":["C++","Python","Real-time Operating Systems (RTOS)","Sensor Integration","Sensor Fusion","Middleware Frameworks (ROS, DDS)","Simulation Tools (AFSIM, NGTS)"],"x-skills-preferred":["Open Architecture Standards (OMS/UCI or FACE)","Regulatory Standards (DO-178C)"],"datePosted":"2026-04-17T13:03:11.069Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC / San Diego, 
California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Real-time Operating Systems (RTOS), Sensor Integration, Sensor Fusion, Middleware Frameworks (ROS, DDS), Simulation Tools (AFSIM, NGTS), Open Architecture Standards (OMS/UCI or FACE), Regulatory Standards (DO-178C)","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":229233,"maxValue":343849,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_bed4759c-578"},"title":"Staff Engineer, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. 
Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop advanced perception algorithms: Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks: Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities: Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs: Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance: Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration: Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings: Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams: Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing: Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement: Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Required Qualifications:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>\n<li>Ability to obtain a SECRET clearance</li>\n</ul>\n<p><strong>Preferred Qualifications:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems</li>\n<li>Experience with ML frameworks such as PyTorch or TensorFlow, particularly for vision-based object detection or classification tasks</li>\n<li>Experience deploying perception software on SWaP-constrained platforms</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments</li>\n<li>Understanding of sensing challenges in denied or degraded conditions</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms</li>\n</ul>\n<p>$182,720 - $274,080 a 
year</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_bed4759c-578","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/8739c509-b6ea-4640-bcc1-c8b5b1de31b2","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$182,720 - $274,080 a year","x-skills-required":["real-time object detection","sensor fusion","state estimation algorithms","EO/IR cameras","radars","IMUs","Kalman Filters","multi-target tracking","deep learning-based detection models","probabilistic or rule-based approaches","SLAM","visual-inertial odometry","sensor-fused localization","Interface Control Documents","hardware integration specs","version control","debugging","test-driven development"],"x-skills-preferred":["hands-on integration or algorithm development with airborne sensing systems","ML frameworks such as PyTorch or Tensorflow","vision-based object detection or classification tasks","SWaP-constrained platforms","validating perception systems during flight test events or operational environments","sensing challenges in denied or degraded conditions","perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:02:45.901Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"real-time object detection, sensor fusion, state estimation algorithms, EO/IR cameras, radars, IMUs, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, 
Interface Control Documents, hardware integration specs, version control, debugging, test-driven development, hands-on integration or algorithm development with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, vision-based object detection or classification tasks, SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":182720,"maxValue":274080,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_239f5e85-c2a"},"title":"Senior Engineer, GNC - Sensor Integration","description":"<p>We are seeking a Sensor Integration Engineer to support selection, characterisation, calibration, and integration of multi-modal navigation and localisation sensors for aerospace applications. 
This role operates at the intersection of electrical, mechanical, and software engineering, requiring the ability to work across disciplines and directly contribute to cross-functional engineering efforts.</p>\n<p><strong>Key Responsibilities:</strong></p>\n<ul>\n<li>Support sensor integration activities from specification through production</li>\n<li>Develop and execute sensor characterisation and calibration procedures for IMUs, GNSS receivers, cameras, and other navigation sensors</li>\n<li>Contribute to sensor trade studies and selection processes</li>\n</ul>\n<p><strong>Collaboration:</strong></p>\n<ul>\n<li>Work closely with electrical engineering, mechanical engineering, and software engineering teams on electrical interfaces, power distribution, and signal conditioning; mounting solutions, alignment, and thermal/vibration considerations; and software interfaces, drivers, data acquisition, and calibration implementation</li>\n<li>Participate in design reviews across EE/ME/SW disciplines, providing interdisciplinary perspective and coordination</li>\n</ul>\n<p><strong>Technical Contributions:</strong></p>\n<ul>\n<li>Develop and execute test plans for sensor performance characterisation and validation</li>\n<li>Document sensor characteristics, trade studies, and integration requirements</li>\n<li>Develop or modify calibration scripts and data processing tools</li>\n<li>Support development of calibration procedures and tooling</li>\n<li>Debug issues spanning hardware-firmware-software boundaries</li>\n<li>Review and contribute to schematics, CAD models, and code as necessary</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS in Electrical Engineering, Mechanical Engineering, Aerospace Engineering, Physics, or related field</li>\n<li>1-4 years of relevant experience (new grads with strong cross-functional internship/project experience will be considered)</li>\n<li>Demonstrated ability to work across at least two of: electrical, mechanical, software 
domains</li>\n</ul>\n<p><strong>Desirable Qualifications:</strong></p>\n<ul>\n<li>Experience with navigation or localisation sensors (IMUs, GNSS, cameras, altimeters, etc.)</li>\n<li>Exposure to sensor calibration or characterisation activities</li>\n<li>Basic proficiency with engineering tools such as Altium, NX, and Teamcenter</li>\n<li>Familiarity with embedded systems or real-time software</li>\n<li>Understanding of coordinate transformations or sensor fusion concepts</li>\n<li>Experience with automated testing or data acquisition systems</li>\n<li>Knowledge of aerospace environmental testing or standards</li>\n</ul>\n<p><strong>Offer Package:</strong></p>\n<ul>\n<li>Pay within range listed ($110,000 - $170,000 a year) + Bonus + Benefits + Equity</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_239f5e85-c2a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/6d3cf300-d22e-47bb-b190-e376bd8715f1","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$110,000 - $170,000 a year","x-skills-required":["sensor measurement principles and error sources","electrical schematics and/or mechanical drawings","scripting or programming for data analysis (Python, MATLAB, or similar)","sensor communication protocols (SPI, I2C, UART, CAN, etc.)","lab equipment and sensor testing"],"x-skills-preferred":["navigation or localisation sensors (IMUs, GNSS, cameras, altimeters, etc.)","sensor calibration or characterisation activities","engineering tools such as Altium, NX, and Teamcenter","embedded systems or real-time software","coordinate transformations or sensor fusion concepts","automated testing or data acquisition systems","aerospace 
environmental testing or standards"],"datePosted":"2026-04-17T13:02:24.238Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"sensor measurement principles and error sources, electrical schematics and/or mechanical drawings, scripting or programming for data analysis (Python, MATLAB, or similar), sensor communication protocols (SPI, I2C, UART, CAN, etc.), lab equipment and sensor testing, navigation or localisation sensors (IMUs, GNSS, cameras, altimeters, etc.), sensor calibration or characterisation activities, engineering tools such as Altium, NX, and Teamcenter, embedded systems or real-time software, coordinate transformations or sensor fusion concepts, automated testing or data acquisition systems, aerospace environmental testing or standards","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":110000,"maxValue":170000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d29c97ab-a9a"},"title":"Senior Engineer, Software - Autonomous Aircraft Integration","description":"<p>This position is ideal for an individual who thrives on solving complex integration challenges that span hardware, software, and systems engineering. 
As a Senior Engineer, Software - Autonomous Aircraft Integration, you will be skilled at deploying autonomy solutions onto unmanned platforms, preparing systems for flight, and troubleshooting mission-critical issues in both lab and field environments.</p>\n<p>The role is highly dynamic, requiring hands-on experience, strong systems thinking, and the ability to operate effectively in fast-paced, real-world test conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Flight Integration Engineers are essential to bridging the gap between R&amp;D and deployment, ensuring that autonomous systems function reliably and effectively when and where they are needed most.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>System Integration &amp; Deployment , Integrate autonomy software onto unmanned aircraft systems, ensuring seamless operation across onboard compute, sensors, and control interfaces.</li>\n</ul>\n<ul>\n<li>Pre-flight Preparation , Own the build, configuration, and validation process for flight-ready systems; coordinate hardware/software compatibility and mission readiness.</li>\n</ul>\n<ul>\n<li>On-site Flight Test Support , Travel to test sites and support live flight operations, including safety checks, system bring-up, and troubleshooting under time-critical constraints.</li>\n</ul>\n<ul>\n<li>Hardware/Software Debugging , Diagnose and resolve integration issues across complex autonomy software stacks and embedded systems in lab and field environments.</li>\n</ul>\n<ul>\n<li>Flight Data Capture &amp; Analysis , Manage data collection during missions and post-test analysis, working with autonomy engineers to refine behaviors and identify improvements.</li>\n</ul>\n<ul>\n<li>Collaboration Across Teams , Work closely with autonomy, GNC, systems, and test teams to ensure mission-critical functionality is delivered on 
time and validated thoroughly.</li>\n<li>Continuous Improvement: Build tools and processes to improve integration timelines, flight test reliability, and team efficiency across deployment cycles.</li>\n<li>Support Certification and Compliance: Assist with documentation and system-level validation required for certification, airworthiness, and compliance in defense-relevant environments.</li>\n<li>Travel Requirement: Members of this team typically travel around 30-40% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years and a Master’s degree; or 2 years with a PhD; or equivalent work experience</li>\n<li>Proficiency in programming languages such as C++ and Python, and familiarity with real-time operating systems (RTOS)</li>\n<li>Proficiency in Linux-based development and experience working with embedded systems, shell scripting, and system diagnostics</li>\n<li>Knowledge of sensor integration, sensor fusion, and middleware frameworks (e.g., ROS, DDS)</li>\n<li>Hands-on experience supporting flight demos or live exercises</li>\n<li>Experience with simulation tools and environments (e.g., AFSIM, NGTS) for testing and validation</li>\n<li>Strong problem-solving skills, with the ability to troubleshoot and optimize system performance</li>\n<li>Excellent communication and teamwork skills, with the ability to work effectively in a collaborative, multidisciplinary environment</li>\n<li>Ability to obtain a SECRET clearance</li>\n</ul>\n<p><strong>Preferred 
Qualifications:</strong></p>\n<ul>\n<li>Direct experience supporting unmanned aerial systems or similar flight test campaigns</li>\n<li>Familiarity with autonomy stacks, flight control systems, or GNC pipelines</li>\n<li>Competence in sensor integration, electronics debugging, or avionics bring-up</li>\n<li>Proficiency in developing automation tools for system testing, logging, and data parsing</li>\n<li>Comfortable interfacing with DoD stakeholders during field events or technical reviews</li>\n<li>Experience with UCI and OMS Standards</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d29c97ab-a9a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/3daaf9c5-164e-4a8e-abc9-b475a38522c3","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$160,000 - $240,000 a year","x-skills-required":["C++","Python","Real-time operating systems (RTOS)","Linux-based development","Embedded systems","Shell scripting","System diagnostics","Sensor integration","Sensor fusion","Middleware frameworks (e.g., ROS, DDS)","Simulation tools and environments (e.g., AFSIM, NGTS)"],"x-skills-preferred":["Direct experience supporting unmanned aerial systems or similar flight test campaigns","Familiarity with autonomy stacks, flight control systems, or GNC pipelines","Competence in sensor integration, electronics debugging, or avionics bring-up","Proficiency in developing automation tools for system testing, logging, and data parsing","Comfortable interfacing with DoD stakeholders during field events or technical reviews","Experience with UCI and OMS 
Standards"],"datePosted":"2026-04-17T13:02:07.941Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC / Boston, MA / San Diego, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Real-time operating systems (RTOS), Linux-based development, Embedded systems, Shell scripting, System diagnostics, Sensor integration, Sensor fusion, Middleware frameworks (e.g., ROS, DDS), Simulation tools and environments (e.g., AFSIM, NGTS), Direct experience supporting unmanned aerial systems or similar flight test campaigns, Familiarity with autonomy stacks, flight control systems, or GNC pipelines, Competence in sensor integration, electronics debugging, or avionics bring-up, Proficiency in developing automation tools for system testing, logging, and data parsing, Comfortable interfacing with DoD stakeholders during field events or technical reviews, Experience with UCI and OMS Standards","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":160000,"maxValue":240000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_4bc0441f-89d"},"title":"Senior Staff Engineer, Software - Autonomous Aircraft Integration","description":"<p>This position is ideal for an individual who thrives on solving complex integration challenges that span hardware, software, and systems engineering. As a Senior Staff Engineer, Software - Autonomous Aircraft Integration, you will be skilled at deploying autonomy solutions onto unmanned platforms, preparing systems for flight, and troubleshooting mission-critical issues in both lab and field environments.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. 
Our Flight Integration Engineers are essential to bridging the gap between R&amp;D and deployment, ensuring that autonomous systems function reliably and effectively when and where they are needed most.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>System Integration &amp; Deployment: Integrate autonomy software onto unmanned aircraft systems, ensuring seamless operation across onboard compute, sensors, and control interfaces.</li>\n<li>Pre-flight Preparation: Own the build, configuration, and validation process for flight-ready systems; coordinate hardware/software compatibility and mission readiness.</li>\n<li>On-site Flight Test Support: Travel to test sites and support live flight operations, including safety checks, system bring-up, and troubleshooting under time-critical constraints.</li>\n<li>Hardware/Software Debugging: Diagnose and resolve integration issues across complex autonomy software stacks and embedded systems in lab and field environments.</li>\n<li>Flight Data Capture &amp; Analysis: Manage data collection during missions and post-test analysis, working with autonomy engineers to refine behaviors and identify improvements.</li>\n<li>Collaboration Across Teams: Work closely with autonomy, GNC, systems, and test teams to ensure mission-critical functionality is delivered on time and validated thoroughly.</li>\n<li>Continuous Improvement: Build tools and processes to improve integration timelines, flight test reliability, and team efficiency across deployment cycles.</li>\n<li>Support Certification and Compliance: Assist with documentation and system-level validation required for certification, airworthiness, and compliance in defense-relevant environments.</li>\n<li>Travel Requirement: Members of this team typically travel around 30-40% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical 
Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience</li>\n<li>Proficiency in programming languages such as C++ and Python, and familiarity with real-time operating systems (RTOS)</li>\n<li>Proficiency in Linux-based development and experience working with embedded systems, shell scripting, and system diagnostics</li>\n<li>Knowledge of sensor integration, sensor fusion, and middleware frameworks (e.g., ROS, DDS)</li>\n<li>Hands-on experience supporting flight demos or live exercises</li>\n<li>Experience with simulation tools and environments (e.g., AFSIM, NGTS) for testing and validation</li>\n<li>Strong problem-solving skills, with the ability to troubleshoot and optimize system performance</li>\n<li>Excellent communication and teamwork skills, with the ability to work effectively in a collaborative, multidisciplinary environment</li>\n<li>Ability to obtain a SECRET clearance</li>\n</ul>\n<p><strong>Preferences:</strong></p>\n<ul>\n<li>Direct experience supporting unmanned aerial systems or similar flight test campaigns</li>\n<li>Familiarity with autonomy stacks, flight control systems, or GNC pipelines</li>\n<li>Competence in sensor integration, electronics debugging, or avionics bring-up</li>\n<li>Proficiency in developing automation tools for system testing, logging, and data parsing</li>\n<li>Comfortable interfacing with DoD stakeholders during field events or technical reviews</li>\n<li>Experience with UCI and OMS Standards</li>\n</ul>","url":"https://yubhub.co/jobs/job_4bc0441f-89d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield 
AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/25011392-094f-482c-b007-f307fb8c4f9f","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$220,800 - $331,200 a year","x-skills-required":["C++","Python","Real-time operating systems (RTOS)","Linux-based development","Embedded systems","Shell scripting","System diagnostics","Sensor integration","Sensor fusion","Middleware frameworks (e.g., ROS, DDS)","Simulation tools and environments (e.g., AFSIM, NGTS)"],"x-skills-preferred":["Direct experience supporting unmanned aerial systems or similar flight test campaigns","Familiarity with autonomy stacks, flight control systems, or GNC pipelines","Competence in sensor integration, electronics debugging, or avionics bring-up","Proficiency in developing automation tools for system testing, logging, and data parsing","Comfortable interfacing with DoD stakeholders during field events or technical reviews","Experience with UCI and OMS Standards"],"datePosted":"2026-04-17T13:01:44.268Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC / Boston, MA / San Diego, California"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Real-time operating systems (RTOS), Linux-based development, Embedded systems, Shell scripting, System diagnostics, Sensor integration, Sensor fusion, Middleware frameworks (e.g., ROS, DDS), Simulation tools and environments (e.g., AFSIM, NGTS), Direct experience supporting unmanned aerial systems or similar flight test campaigns, Familiarity with autonomy stacks, flight control systems, or GNC pipelines, Competence in sensor integration, electronics debugging, or avionics bring-up, Proficiency in developing automation tools for system testing, logging, and data parsing, Comfortable interfacing with DoD stakeholders 
during field events or technical reviews, Experience with UCI and OMS Standards","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":220800,"maxValue":331200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_850c077a-c6e"},"title":"Software Engineer, Generalist","description":"<p>We are seeking a Software Engineer, Generalist to play a pivotal role in the design, development, and implementation of software systems for our autonomous surface vessels (ASVs).</p>\n<p>You will work closely with cross-functional teams to ensure the seamless integration of software components into our ASV platform.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Collaborate with hardware engineers, robotics engineers, and other software engineers across the tech stack to design, develop, and deploy software solutions for autonomous surface vessels</li>\n<li>Participate in all phases of the software development lifecycle, including requirements gathering, design, implementation, testing, deployment, and maintenance</li>\n<li>Develop robust, scalable, and maintainable software systems that meet the unique challenges of autonomous maritime operations</li>\n<li>Implement algorithms for perception, navigation, path planning, and control to enable autonomous behavior in ASVs</li>\n<li>Optimise software performance and reliability to meet stringent DoD requirements and operational standards</li>\n<li>Conduct thorough testing and validation of software components to ensure functionality, accuracy, and safety</li>\n<li>Stay current with emerging technologies and industry trends in autonomous systems, robotics, and maritime technology</li>\n</ul>\n<p>Qualifications:</p>\n<ul>\n<li>Bachelor&#39;s degree in Computer Science, Software Engineering, or a related field</li>\n<li>Proven experience in software development, with a focus on autonomous systems, robotics, or 
related fields</li>\n<li>Proficiency in programming languages such as C++, Python, or Java, with a strong emphasis on object-oriented design and development</li>\n<li>Experience with software development tools and frameworks commonly used in robotics and autonomous systems (e.g., ROS, OpenCV, TensorFlow, etc.)</li>\n<li>Familiarity with sensor fusion techniques, SLAM algorithms, and other technologies relevant to autonomous navigation and perception</li>\n<li>Strong problem-solving skills and the ability to work effectively in a fast-paced environment</li>\n<li>Excellent communication skills and the ability to clearly articulate technical concepts</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Medical Insurance: Comprehensive health insurance plans covering a range of services</li>\n<li>Dental and Vision Insurance: Coverage for routine dental check-ups, orthodontics, and vision care</li>\n<li>Saronic pays 100% of the premium for employees and 80% for dependents</li>\n<li>Time Off: Generous PTO and Holidays</li>\n<li>Parental Leave: Paid maternity and paternity leave to support new parents</li>\n<li>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses</li>\n<li>Retirement Plan: 401(k) plan</li>\n<li>Stock Options: Equity options to give employees a stake in the company’s success</li>\n<li>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage</li>\n<li>Additional Perks: Free lunch benefit and unlimited free drinks and snacks in the office</li>\n</ul>\n<p>Physical Demands:</p>\n<ul>\n<li>Prolonged periods of sitting at a desk and working on a computer.</li>\n<li>Occasional standing and walking within the office.</li>\n<li>Manual dexterity to operate a computer keyboard, mouse, and other office equipment.</li>\n<li>Visual acuity to read screens, documents, and reports.</li>\n<li>Occasional reaching, bending, or stooping to access file drawers, cabinets, or office supplies.</li>\n<li>Lifting 
and carrying items up to 20 pounds occasionally (e.g., office supplies, packages).</li>\n</ul>\n<p>Additional Information:</p>\n<p>This role requires access to export-controlled information or items that require “U.S. Person” status. As defined by U.S. law, individuals who are any one of the following are considered to be a “U.S. Person”: (1) U.S. citizens, (2) legal permanent residents (a.k.a. green card holders), and (3) certain protected classes of asylees and refugees, as defined in 8 U.S.C. 1324b(a)(3).</p>","url":"https://yubhub.co/jobs/job_850c077a-c6e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Saronic Technologies","sameAs":"https://www.saronictechnologies.com/","logo":"https://logos.yubhub.co/saronictechnologies.com.png"},"x-apply-url":"https://jobs.lever.co/saronic/77ef1f0f-5ba5-46b5-aca6-d38730790e97","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["C++","Python","Java","ROS","OpenCV","TensorFlow","sensor fusion techniques","SLAM algorithms"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:59:03.450Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Java, ROS, OpenCV, TensorFlow, sensor fusion techniques, SLAM algorithms"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_55e8fefe-652"},"title":"Senior Perception & Autonomy Engineer","description":"<p>We are seeking a Senior Perception &amp; Autonomy Engineer to play a pivotal role in designing, developing, and implementing perception systems for our autonomous surface vessels.</p>\n<p>Our team is focused on making boats go and 
perform tasks with no human involvement. This job is available at multiple levels, including entry, senior, and staff.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Develop algorithms and models which allow boats to sense and navigate</li>\n<li>Develop metrics which allow quantitative analysis of improvements and regressions in boat performance</li>\n<li>Analyze and work with large data systems to enable model training and evaluation</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Strong programming fundamentals</li>\n<li>Extensive programming experience and demonstrated ability to work on large systems</li>\n<li>Computing Fundamentals</li>\n<li>A general understanding of operating systems and/or similar large-scale systems</li>\n<li>An understanding of basic computer architecture</li>\n<li>A demonstrated willingness to learn and pivot based on new information</li>\n</ul>\n<p>Useful Skills:</p>\n<ul>\n<li>Familiarity with deep learning frameworks (e.g., TensorFlow, PyTorch)</li>\n<li>Understanding of various filters and their applications</li>\n<li>Proficiency in Rust</li>\n<li>Experience with maritime or autonomous vehicle projects</li>\n<li>Experience with signals processing or sensor fusion</li>\n<li>Experience with low latency inference and tracking pipelines</li>\n<li>Experience with path planning algorithms</li>\n<li>Experience training and deploying multi modal models</li>\n<li>Experience with various sensors including radar, cameras, and lidar</li>\n<li>Experience developing and optimizing deployed ML systems</li>\n</ul>\n<p>Physical Demands:</p>\n<ul>\n<li>Prolonged periods of sitting at a desk and working on a computer</li>\n<li>Occasional standing and walking within the office</li>\n<li>Manual dexterity to operate a computer keyboard, mouse, and other office equipment</li>\n<li>Visual acuity to read screens, documents, and reports</li>\n<li>Occasional reaching, bending, or stooping to access file drawers, cabinets, or office supplies</li>\n<li>Lifting and 
carrying items up to 20 pounds occasionally (e.g., office supplies, packages)</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Medical Insurance: Comprehensive health insurance plans covering a range of services</li>\n<li>Dental and Vision Insurance: Coverage for routine dental check-ups, orthodontics, and vision care</li>\n<li>Saronic pays 100% of the premium for employees and 80% for dependents</li>\n<li>Time Off: Generous PTO and Holidays</li>\n<li>Parental Leave: Paid maternity and paternity leave to support new parents</li>\n<li>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses</li>\n<li>Retirement Plan: 401(k) plan</li>\n<li>Stock Options: Equity options to give employees a stake in the company’s success</li>\n<li>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage</li>\n<li>Additional Perks: Free lunch benefit and unlimited free drinks and snacks in the office</li>\n</ul>\n<p>Additional Information:</p>\n<p>This role requires access to export-controlled information or items that require “U.S. Person” status. As defined by U.S. law, individuals who are any one of the following are considered to be a “U.S. Person”: (1) U.S. citizens, (2) legal permanent residents (a.k.a. green card holders), and (3) certain protected classes of asylees and refugees, as defined in 8 U.S.C. 
1324b(a)(3).</p>","url":"https://yubhub.co/jobs/job_55e8fefe-652","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Saronic Technologies","sameAs":"https://www.saronictechnologies.com/","logo":"https://logos.yubhub.co/saronictechnologies.com.png"},"x-apply-url":"https://jobs.lever.co/saronic/d770926d-1d32-40d2-a43d-7fc4c6fd9350","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["strong programming fundamentals","extensive programming experience","computing fundamentals","familiarity with deep learning frameworks","proficiency in Rust"],"x-skills-preferred":["experience with maritime or autonomous vehicle projects","experience with signals processing or sensor fusion","experience with low latency inference and tracking pipelines","experience with path planning algorithms","experience training and deploying multi modal models"],"datePosted":"2026-04-17T12:57:31.628Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"strong programming fundamentals, extensive programming experience, computing fundamentals, familiarity with deep learning frameworks, proficiency in Rust, experience with maritime or autonomous vehicle projects, experience with signals processing or sensor fusion, experience with low latency inference and tracking pipelines, experience with path planning algorithms, experience training and deploying multi modal models"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d9cd7bb0-a0d"},"title":"Senior Electrical Engineer","description":"<p>Saronic Technologies is seeking a Senior Electrical Engineer to drive 
research and development efforts in navigation and positioning technologies for autonomous marine systems.</p>\n<p>This individual will be responsible for evaluating, benchmarking, and integrating cutting-edge navigation and sensor technologies across Saronic&#39;s fleet of autonomous vessels.</p>\n<p>Key responsibilities include leading R&amp;D initiatives, designing and executing benchmarking tests, performing trade studies, and developing system architectures.</p>\n<p>The ideal candidate brings deep experience in electrical systems design, system-level trade studies, and technology evaluation, and thrives at the intersection of hardware innovation and applied research.</p>\n<p>Responsibilities:</p>\n<ul>\n<li><p>Lead R&amp;D initiatives focused on next-generation navigation and positioning solutions for autonomous vessels.</p>\n</li>\n<li><p>Design and execute benchmarking tests of commercial and prototype systems (GNSS, INS/IMU, radar, LiDAR, Doppler velocity logs, etc.), including lab, dockside, and sea-trial environments.</p>\n</li>\n<li><p>Perform trade studies and comparative analyses of current and emerging market offerings to inform system selection, integration paths, and performance improvements.</p>\n</li>\n<li><p>Develop system architectures and interface specifications for integrating navigation sensors with vessel control, autonomy, and data processing subsystems.</p>\n</li>\n<li><p>Prototype, test, and validate new electrical configurations supporting advanced sensing and positioning.</p>\n</li>\n<li><p>Author technical reports and white papers summarizing benchmarking results, performance metrics, and recommendations.</p>\n</li>\n<li><p>Collaborate with autonomy, software, and mechanical teams to ensure seamless sensor fusion and data pipeline integrity.</p>\n</li>\n<li><p>Mentor junior engineers and contribute to internal R&amp;D roadmaps for navigation system evolution.</p>\n</li>\n<li><p>Contribute to IP generation, system qualification, and 
design standardization for future vessel platforms.</p>\n</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li><p>B.S. or M.S. in Electrical Engineering, Computer Engineering, or related discipline.</p>\n</li>\n<li><p>7+ years of professional experience in electrical systems engineering, preferably within marine, aerospace, automotive or defense domains.</p>\n</li>\n<li><p>Demonstrated expertise with navigation and positioning systems: GNSS, INS/IMU, radar, LiDAR, or sensor fusion.</p>\n</li>\n<li><p>Proven experience in system-level benchmarking, R&amp;D experimentation, and trade study development.</p>\n</li>\n<li><p>Strong understanding of signal integrity, timing synchronization, and sensor data acquisition.</p>\n</li>\n<li><p>Proficiency with schematic and wiring design tools (e.g., Altium, AutoCAD Electrical, or SolidWorks Electrical).</p>\n</li>\n<li><p>Experience with communication interfaces such as CAN, Ethernet, and serial protocols.</p>\n</li>\n<li><p>Excellent technical writing and presentation skills, capable of producing concise, data-driven R&amp;D documentation.</p>\n</li>\n<li><p>Ability to operate effectively in both lab and field environments, including vessel trials.</p>\n</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li><p>Experience in autonomous or unmanned system development, particularly maritime platforms.</p>\n</li>\n<li><p>Familiarity with sensor fusion algorithms and real-time data processing pipelines.</p>\n</li>\n<li><p>Working knowledge of relevant industry standards (ABS, USCG, MIL-STD, IEC).</p>\n</li>\n<li><p>Background in environmental testing, EMI/EMC compliance, and system reliability analysis.</p>\n</li>\n<li><p>Experience supporting hardware-in-the-loop (HITL) or model-based system simulations.</p>\n</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li><p>Medical Insurance: Comprehensive health insurance plans covering a range of services</p>\n</li>\n<li><p>Dental and Vision Insurance: Coverage for routine dental check-ups, 
orthodontics, and vision care</p>\n</li>\n<li><p>Saronic pays 100% of the premium for employees and 80% for dependents</p>\n</li>\n<li><p>Time Off: Generous PTO and Holidays</p>\n</li>\n<li><p>Parental Leave: Paid maternity and paternity leave to support new parents</p>\n</li>\n<li><p>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses</p>\n</li>\n<li><p>Retirement Plan: 401(k) plan</p>\n</li>\n<li><p>Stock Options: Equity options to give employees a stake in the company’s success</p>\n</li>\n<li><p>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage</p>\n</li>\n<li><p>Additional Perks: Free lunch benefit and unlimited free drinks and snacks in the office</p>\n</li>\n</ul>\n<p>Physical Demands:</p>\n<ul>\n<li><p>Prolonged periods of sitting at a desk and working on a computer.</p>\n</li>\n<li><p>Occasional standing and walking within the office.</p>\n</li>\n<li><p>Manual dexterity to operate a computer keyboard, mouse, and other office equipment.</p>\n</li>\n<li><p>Visual acuity to read screens, documents, and reports.</p>\n</li>\n<li><p>Occasional reaching, bending, or stooping to access file drawers, cabinets, or office supplies.</p>\n</li>\n<li><p>Lifting and carrying items up to 20 pounds occasionally (e.g., office supplies, packages).</p>\n</li>\n</ul>\n<p>Additional Information:</p>\n<p>This role requires access to export-controlled information or items that require “U.S. Person” status. As defined by U.S. law, individuals who are any one of the following are considered to be a “U.S. Person”: (1) U.S. citizens, (2) legal permanent residents (a.k.a. green card holders), and (3) certain protected classes of asylees and refugees, as defined in 8 U.S.C. 
1324b(a)(3).</p>","url":"https://yubhub.co/jobs/job_d9cd7bb0-a0d","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Saronic Technologies","sameAs":"https://www.saronictechnologies.com/","logo":"https://logos.yubhub.co/saronictechnologies.com.png"},"x-apply-url":"https://jobs.lever.co/saronic/8459b73d-38da-40bc-9ec7-0c5430f8be50","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Electrical Systems Design","System-Level Trade Studies","Navigation and Positioning Systems","Sensor Fusion","Signal Integrity","Timing Synchronization","Sensor Data Acquisition","Communication Interfaces","Technical Writing and Presentation"],"x-skills-preferred":["Autonomous or Unmanned System Development","Sensor Fusion Algorithms","Real-Time Data Processing Pipelines","Industry Standards","Environmental Testing","EMI/EMC Compliance","System Reliability Analysis","Hardware-in-the-Loop Simulations"],"datePosted":"2026-04-17T12:57:31.271Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Electrical Systems Design, System-Level Trade Studies, Navigation and Positioning Systems, Sensor Fusion, Signal Integrity, Timing Synchronization, Sensor Data Acquisition, Communication Interfaces, Technical Writing and Presentation, Autonomous or Unmanned System Development, Sensor Fusion Algorithms, Real-Time Data Processing Pipelines, Industry Standards, Environmental Testing, EMI/EMC Compliance, System Reliability Analysis, Hardware-in-the-Loop Simulations"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_e2d1fa71-cef"},"title":"Senior Electrical 
Engineer","description":"<p>Saronic Technologies is seeking a Senior Electrical Engineer with expertise in maritime acoustic sensor systems to join our Electrical Engineering - Advanced Development group.</p>\n<p>This role focuses on the research, evaluation, and development of underwater acoustic sensing technologies - including sonar transducers, hydrophones, acoustic modems, and signal processing subsystems - supporting navigation, detection, and situational awareness capabilities for autonomous surface vessels.</p>\n<p>The ideal candidate has extensive experience in acoustic system design, integration, and testing within marine or defense environments, and a strong background in hardware evaluation, data analysis, and system trade studies.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li><p>Lead R&amp;D initiatives in maritime acoustic sensing and underwater signal systems for autonomous vessel applications.</p>\n</li>\n<li><p>Evaluate and benchmark acoustic sensors and subsystems (e.g., sonar arrays, hydrophones, acoustic modems, Doppler sensors) across commercial and defense-grade offerings.</p>\n</li>\n<li><p>Design and execute experiments for acoustic performance testing, calibration, and environmental validation in laboratory and sea-trial conditions.</p>\n</li>\n<li><p>Develop and analyze trade studies comparing acoustic hardware and architectures to inform system selection, performance optimization, and integration paths.</p>\n</li>\n<li><p>Design and document system architectures for acoustic data acquisition, amplification, filtering, and digital conversion.</p>\n</li>\n<li><p>Collaborate with autonomy, software, and mechanical teams to integrate acoustic systems into vessel control and perception frameworks.</p>\n</li>\n<li><p>Develop and validate signal conditioning and analog front-end circuitry for low-noise, high-fidelity acoustic measurements.</p>\n</li>\n<li><p>Author detailed R&amp;D documentation, including test reports, design specifications, and 
performance assessments.</p>\n</li>\n<li><p>Contribute to system safety, reliability, and environmental qualification efforts, including compliance with applicable maritime and defense standards.</p>\n</li>\n<li><p>Mentor junior engineers and contribute to Saronic’s acoustic sensing roadmap and experimental methodologies.</p>\n</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li><p>B.S. or M.S. in Electrical Engineering, Acoustics, or related discipline.</p>\n</li>\n<li><p>7+ years of experience in acoustic system design, integration, or testing - preferably in commercial marine, defense, or undersea applications.</p>\n</li>\n<li><p>Expertise with underwater acoustics, including sonar, hydrophones, and acoustic communication systems.</p>\n</li>\n<li><p>Experience in hardware benchmarking, R&amp;D testing, and performance analysis of acoustic sensors and electronics.</p>\n</li>\n<li><p>Strong understanding of signal processing concepts, analog front-end design, and data acquisition systems.</p>\n</li>\n<li><p>Familiarity with marine environmental factors affecting acoustic propagation (temperature, salinity, depth, and noise).</p>\n</li>\n<li><p>Proficiency in schematic design and documentation tools (e.g., Altium, AutoCAD Electrical, or SolidWorks Electrical).</p>\n</li>\n<li><p>Hands-on experience with lab instrumentation (oscilloscopes, spectrum analyzers, data loggers, acoustic test equipment).</p>\n</li>\n<li><p>Strong technical writing and presentation skills, with ability to produce rigorous R&amp;D and evaluation documentation.</p>\n</li>\n<li><p>Comfortable working in laboratory, dockside, and sea-trial environments.</p>\n</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li><p>Experience developing or integrating active and passive sonar systems for maritime platforms.</p>\n</li>\n<li><p>Familiarity with acoustic transducer design, impedance matching, and waterproofing methods.</p>\n</li>\n<li><p>Knowledge of marine and defense standards 
(MIL-STD-167, MIL-STD-461, IEC 60529, etc.).</p>\n</li>\n<li><p>Experience with acoustic modeling software (e.g., COMSOL, MATLAB, BELLHOP, or equivalent).</p>\n</li>\n<li><p>Background in sensor fusion and data analysis pipelines for multi-sensor acoustic systems.</p>\n</li>\n<li><p>Experience supporting hardware-in-the-loop (HITL) or real-time signal processing systems.</p>\n</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li><p>Medical Insurance: Comprehensive health insurance plans covering a range of services</p>\n</li>\n<li><p>Dental and Vision Insurance: Coverage for routine dental check-ups, orthodontics, and vision care</p>\n</li>\n<li><p>Saronic pays 100% of the premium for employees and 80% for dependents</p>\n</li>\n<li><p>Time Off: Generous PTO and Holidays</p>\n</li>\n<li><p>Parental Leave: Paid maternity and paternity leave to support new parents</p>\n</li>\n<li><p>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses</p>\n</li>\n<li><p>Retirement Plan: 401(k) plan</p>\n</li>\n<li><p>Stock Options: Equity options to give employees a stake in the company’s success</p>\n</li>\n<li><p>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage</p>\n</li>\n<li><p>Additional Perks: Free lunch benefit and unlimited free drinks and snacks in the office</p>\n</li>\n</ul>\n<p>Physical Requirements:</p>\n<ul>\n<li><p>Prolonged periods of sitting at a desk and working on a computer.</p>\n</li>\n<li><p>Occasional standing and walking within the office.</p>\n</li>\n<li><p>Manual dexterity to operate a computer keyboard, mouse, and other office equipment.</p>\n</li>\n<li><p>Visual acuity to read screens, documents, and reports.</p>\n</li>\n<li><p>Occasional reaching, bending, or stooping to access file drawers, cabinets, or office supplies.</p>\n</li>\n<li><p>Lifting and carrying items up to 20 pounds occasionally (e.g., office supplies, packages).</p>\n</li>\n</ul>\n<p>Additional 
Information:</p>\n<ul>\n<li>This role requires access to export-controlled information or items that require “U.S. Person” status. As defined by U.S. law, individuals who are any one of the following are considered to be a “U.S. Person”: (1) U.S. citizens, (2) legal permanent residents (a.k.a. green card holders), and (3) certain protected classes of asylees and refugees, as defined in 8 U.S.C. 1324b(a)(3).</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_e2d1fa71-cef","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Saronic Technologies","sameAs":"https://www.saronictechnologies.com/","logo":"https://logos.yubhub.co/saronictechnologies.com.png"},"x-apply-url":"https://jobs.lever.co/saronic/d0f3e7d8-26c7-4b88-932a-3fb258e00d37","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Electrical Engineering","Acoustic System Design","Integration and Testing","Hardware Evaluation","Data Analysis","Signal Processing","Analog Front-end Design","Data Acquisition Systems","Marine Environmental Factors","Schematic Design and Documentation Tools","Lab Instrumentation","Technical Writing and Presentation"],"x-skills-preferred":["Active and Passive Sonar Systems","Acoustic Transducer Design","Impedance Matching and Waterproofing Methods","Marine and Defense Standards","Acoustic Modeling Software","Sensor Fusion and Data Analysis Pipelines","Hardware-in-the-Loop (HITL) or Real-time Signal Processing Systems"],"datePosted":"2026-04-17T12:57:09.848Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Electrical Engineering, Acoustic System Design, Integration and Testing, Hardware Evaluation, Data Analysis, 
Signal Processing, Analog Front-end Design, Data Acquisition Systems, Marine Environmental Factors, Schematic Design and Documentation Tools, Lab Instrumentation, Technical Writing and Presentation, Active and Passive Sonar Systems, Acoustic Transducer Design, Impedance Matching and Waterproofing Methods, Marine and Defense Standards, Acoustic Modeling Software, Sensor Fusion and Data Analysis Pipelines, Hardware-in-the-Loop (HITL) or Real-time Signal Processing Systems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_edaaa5b1-6da"},"title":"Perception Engineer","description":"<p>We are seeking a Perception Engineer to play a pivotal role in designing, developing, and implementing perception systems for our autonomous surface vessels.</p>\n<p>Our team is focused on making boats go and perform tasks with no human involvement. This job is available at multiple levels, including entry, senior, and staff.</p>\n<p>The successful candidate will develop algorithms and models which allow boats to sense and navigate, as well as develop metrics which allow quantitative analysis of improvements and regressions in boat performance.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Develop algorithms and models which allow boats to sense and navigate</li>\n<li>Develop metrics which allow quantitative analysis of improvements and regressions in boat performance</li>\n<li>Analyze and work with large data systems to enable model training and evaluation</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Strong programming fundamentals</li>\n<li>Extensive programming experience and demonstrated ability to work on large systems</li>\n<li>Computing fundamentals</li>\n<li>A general understanding of operating systems and/or similar large-scale systems</li>\n<li>An understanding of basic computer architecture</li>\n<li>A demonstrated willingness to learn and pivot based on new information</li>\n</ul>\n<p>Useful 
Skills:</p>\n<ul>\n<li>Familiarity with deep learning frameworks (e.g., TensorFlow, PyTorch)</li>\n<li>Understanding of various filters and their applications</li>\n<li>Proficiency in Rust</li>\n<li>Experience with maritime or autonomous vehicle projects</li>\n<li>Experience with signal processing or sensor fusion</li>\n<li>Experience with low-latency inference and tracking pipelines</li>\n<li>Experience with path planning algorithms</li>\n<li>Experience training and deploying multimodal models</li>\n<li>Experience with various sensors including radar, cameras, and lidar</li>\n<li>Experience developing and optimizing deployed ML systems</li>\n</ul>\n<p>Physical Demands:</p>\n<ul>\n<li>Prolonged periods of sitting at a desk and working on a computer</li>\n<li>Occasional standing and walking within the office</li>\n<li>Manual dexterity to operate a computer keyboard, mouse, and other office equipment</li>\n<li>Visual acuity to read screens, documents, and reports</li>\n<li>Occasional reaching, bending, or stooping to access file drawers, cabinets, or office supplies</li>\n<li>Lifting and carrying items up to 20 pounds occasionally (e.g., office supplies, packages)</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Medical Insurance: Comprehensive health insurance plans covering a range of services</li>\n<li>Dental and Vision Insurance: Coverage for routine dental check-ups, orthodontics, and vision care</li>\n<li>Saronic pays 100% of the premium for employees and 80% for dependents</li>\n<li>Time Off: Generous PTO and Holidays</li>\n<li>Parental Leave: Paid maternity and paternity leave to support new parents</li>\n<li>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses</li>\n<li>Retirement Plan: 401(k) plan</li>\n<li>Stock Options: Equity options to give employees a stake in the company’s success</li>\n<li>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage</li>\n<li>Additional Perks: 
Free lunch benefit and unlimited free drinks and snacks in the office</li>\n</ul>\n<p>Additional Information:</p>\n<p>This role requires access to export-controlled information or items that require “U.S. Person” status. As defined by U.S. law, individuals who are any one of the following are considered to be a “U.S. Person”: (1) U.S. citizens, (2) legal permanent residents (a.k.a. green card holders), and (3) certain protected classes of asylees and refugees, as defined in 8 U.S.C. 1324b(a)(3).</p>","url":"https://yubhub.co/jobs/job_edaaa5b1-6da","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Saronic Technologies","sameAs":"https://www.saronictechnologies.com/","logo":"https://logos.yubhub.co/saronictechnologies.com.png"},"x-apply-url":"https://jobs.lever.co/saronic/30af5320-d158-4127-969f-de7ee92504ce","x-work-arrangement":"onsite","x-experience-level":"entry|senior|staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Strong programming fundamentals","Extensive programming experience and demonstrated ability to work on large systems","Computing Fundamentals","Understanding of basic computer architecture","Familiarity with deep learning frameworks (e.g., TensorFlow, PyTorch)","Proficiency in Rust","Experience with maritime or autonomous vehicle projects","Experience with signals processing or sensor fusion","Experience with low latency inference and tracking pipelines","Experience with path planning algorithms","Experience training and deploying multi modal models","Experience with various sensors including radar, cameras, and lidar","Experience developing and optimizing deployed ML 
systems"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:56:56.687Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Strong programming fundamentals, Extensive programming experience and demonstrated ability to work on large systems, Computing Fundamentals, Understanding of basic computer architecture, Familiarity with deep learning frameworks (e.g., TensorFlow, PyTorch), Proficiency in Rust, Experience with maritime or autonomous vehicle projects, Experience with signals processing or sensor fusion, Experience with low latency inference and tracking pipelines, Experience with path planning algorithms, Experience training and deploying multi modal models, Experience with various sensors including radar, cameras, and lidar, Experience developing and optimizing deployed ML systems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_8b82d370-9f7"},"title":"Open Application","description":"<p>At Varjo, we are pioneers in the immersive computing revolution. Our mixed reality solutions redefine realism, creating virtual experiences that match the authenticity of the real world. We are not just a company; we are a team of talents from around the globe, where diversity fuels innovation and drives results.</p>\n<p>Join a multicultural team where English is our daily working language, providing an inclusive and collaborative atmosphere. At Varjo, we believe in the power of different experiences, backgrounds, and ideas coming together to shape the future of immersive computing.</p>\n<p>We are seeking the best and brightest to join us on this exhilarating journey. As we continue to set new standards in technology, we invite you to be a part of our vision for the future. 
When we are done, computers will look nothing like what they do right now.</p>\n<p>Areas and Technologies We Work With:</p>\n<ul>\n<li>C++/C Programming</li>\n<li>Embedded C/C++</li>\n<li>SLAM and Computer Vision (image processing, object recognition and detection)</li>\n<li>Algorithm Design and Optimization</li>\n<li>GPU/CPU Programming</li>\n<li>Unity and Unreal Development</li>\n<li>Sensor Fusion</li>\n<li>ROS (Robot Operating System)</li>\n<li>ADAS (Advanced Driver Assistance Systems)</li>\n<li>3D Reconstruction</li>\n<li>CUDA</li>\n<li>Optics and Cameras</li>\n<li>Audio and Video Streaming</li>\n</ul>\n<p>Apply Now\nSubmit an open application, including your CV, a link to your LinkedIn profile, and details of projects that make you particularly proud. If you have connections at Varjo, feel free to drop some names. Join Varjo and play a crucial role in shaping the future of immersive computing.</p>","url":"https://yubhub.co/jobs/job_8b82d370-9f7","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Varjo","sameAs":"https://apply.workable.com","logo":"https://logos.yubhub.co/j.com.png"},"x-apply-url":"https://apply.workable.com/j/B64F9C1C64","x-work-arrangement":"onsite","x-experience-level":"entry|mid|senior|staff|executive","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["C++/C Programming","Embedded C/C++","SLAM and Computer Vision","Algorithm Design and Optimization","GPU/CPU Programming","Unity and Unreal Development","Sensor Fusion","ROS (Robot Operating System)","ADAS (Advanced Driver Assistance Systems)","3D Reconstruction","CUDA","Optics and Cameras","Audio and Video 
Streaming"],"x-skills-preferred":[],"datePosted":"2026-03-08T17:55:55.964Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Helsinki"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++/C Programming, Embedded C/C++, SLAM and Computer Vision, Algorithm Design and Optimization, GPU/CPU Programming, Unity and Unreal Development, Sensor Fusion, ROS (Robot Operating System), ADAS (Advanced Driver Assistance Systems), 3D Reconstruction, CUDA, Optics and Cameras, Audio and Video Streaming"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_440a65d7-eed"},"title":"Software Engineer - Sensing, Consumer Products","description":"<p><strong>Software Engineer - Sensing, Consumer Products</strong></p>\n<p><strong>Location</strong></p>\n<p>San Francisco</p>\n<p><strong>Employment Type</strong></p>\n<p>Full time</p>\n<p><strong>Department</strong></p>\n<p>Consumer Products</p>\n<p><strong>Compensation</strong></p>\n<ul>\n<li>$325K • Offers Equity</li>\n</ul>\n<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. 
In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>\n<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>\n<li>401(k) retirement plan with employer match</li>\n<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>\n<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>\n<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>\n<li>Mental health and wellness support</li>\n<li>Employer-paid basic life and disability coverage</li>\n<li>Annual learning and development stipend to fuel your professional growth</li>\n<li>Daily meals in our offices, and meal delivery credits as eligible</li>\n<li>Relocation support for eligible employees</li>\n<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>\n</ul>\n<p><strong>About the Team</strong></p>\n<p>Consumer Products Research prototypes the future of computing: we explore new modalities, interaction patterns, and system behaviors, then do the engineering required to make those ideas real in rigorous prototypes. The Neosensing team sits at the intersection of sensing, edge algorithms, and systems engineering. 
We build the end-to-end software that turns new signals into dependable capabilities—collection tooling and protocols, algorithm integration and evaluation hooks, and on-device loops that stay stable under real-world variability. We care deeply about software quality and iteration speed: clean interfaces, debuggability, observability, and performance under tight device constraints.</p>\n<p><strong>About the Role</strong></p>\n<p>As a Software Engineer on Consumer Products Research, you’ll sit at the boundary between algorithm development and shippable systems. You’ll work closely with algorithm engineers to translate prototypes into clean interfaces, reliable pipelines, and efficient on-device implementations—with strong attention to performance, observability, and real-world failure modes.</p>\n<p>This is a software role first: we’re looking for someone who loves writing great code every day, takes pride in engineering craft, and is comfortable going deep enough into the algorithmic details to make the system work end-to-end.</p>\n<p><strong>This role is based in San Francisco, CA. 
We use a hybrid work model of four days in the office per week and offer relocation assistance to new employees.</strong></p>\n<p><strong>In this role, you will:</strong></p>\n<ul>\n<li>Build and ship production software for sensing algorithms, translating algorithm prototypes into reliable end-to-end systems.</li>\n<li>Implement and own key parts of the Python shipping pipeline (integration surfaces, evaluation hooks, and quality/performance guardrails).</li>\n<li>Develop embedded/on-device software in an RTOS environment (e.g., Zephyr) and deploy models to device runtimes and hardware accelerators.</li>\n<li>Optimize real-time on-device perception loops (e.g., detection/tracking-style pipelines) for stability, latency, power, and memory constraints.</li>\n<li>Create data collection + instrumentation tooling to bring up new sensing modalities and accelerate iteration from prototype → dataset → model → device.</li>\n<li>Partner cross-functionally (algorithms, human data, firmware/hardware) to debug, profile, and harden systems against real-world variability.</li>\n</ul>\n<p><strong>You might thrive in this role if you:</strong></p>\n<ul>\n<li>Love writing great software and want your work to sit close to novel sensing and edge algorithms.</li>\n<li>Understand algorithm behavior well enough to integrate, debug, and evaluate it—even if you’re not the primary model inventor.</li>\n<li>Have shipped production Python systems and care about clean interfaces, tests, and long-term maintainability.</li>\n<li>Enjoy embedded/on-device work and can debug across hardware, firmware, and higher-level application layers.</li>\n<li>Care about performance engineering and know how to profile and optimize under tight device constraints.</li>\n<li>Take ownership end-to-end and thrive in ambiguous, fast-moving, zero-to-one 
environments.</li>\n</ul>\n<p><strong>Bonus:</strong></p>\n<ul>\n<li>Zephyr (or similar RTOS) experience.</li>\n<li>On-device ML deployment (NPU/GPU/DSP) and accelerator-aware profiling/optimization.</li>\n<li>Background in multimodal sensing, sensor fusion, or on-device perception.</li>\n<li>Experience building data collection systems and human-in-the-loop workflows (protocols, QA, metadata)</li>\n</ul>\n<p><strong>About OpenAI</strong></p>\n<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences of our users and the broader community.</p>","url":"https://yubhub.co/jobs/job_440a65d7-eed","directApply":true,"hiringOrganization":{"@type":"Organization","name":"OpenAI","sameAs":"https://jobs.ashbyhq.com","logo":"https://logos.yubhub.co/openai.com.png"},"x-apply-url":"https://jobs.ashbyhq.com/openai/f6dfb6c0-44af-4512-af8c-967b8bb12867","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$325K • Offers Equity","x-skills-required":["Python","Zephyr","RTOS","Embedded/on-device software development","Data collection and instrumentation tooling","Algorithm integration and evaluation","Clean interfaces and long-term maintainability","Performance engineering and profiling/optimization"],"x-skills-preferred":["Zephyr (or similar RTOS) experience","On-device ML deployment (NPU/GPU/DSP) and accelerator-aware profiling/optimization","Background in 
multimodal sensing, sensor fusion, or on-device perception","Experience building data collection systems and human-in-the-loop workflows (protocols, QA, metadata)"],"datePosted":"2026-03-06T18:23:18.008Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Zephyr, RTOS, Embedded/on-device software development, Data collection and instrumentation tooling, Algorithm integration and evaluation, Clean interfaces and long-term maintainability, Performance engineering and profiling/optimization, Zephyr (or similar RTOS) experience, On-device ML deployment (NPU/GPU/DSP) and accelerator-aware profiling/optimization, Background in multimodal sensing, sensor fusion, or on-device perception, Experience building data collection systems and human-in-the-loop workflows (protocols, QA, metadata)","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":325000,"maxValue":325000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_238b9e04-44a"},"title":"Lead Engineer, ADAS HiL Test (f/m/d)","description":"<p>Opening. 
This role exists to develop and maintain HiL test platforms, integrating sensor simulations, vehicle dynamics models, and real-time system debugging.</p>\n<p><strong>What you&#39;ll do</strong></p>\n<p>Construct ADAS simulation scenarios, develop automation scripts, and identify algorithm or system defects in simulations.</p>\n<ul>\n<li>Construct ADAS simulation scenarios (covering L2 and L2++ functionalities) using tools like Carmaker or others, including vehicle dynamics modeling, driver models, and complex traffic environments</li>\n<li>Develop and maintain HiL (Hardware-in-the-Loop) test platforms, integrating sensor simulations (e.g., camera/radar signal injection), vehicle dynamics models, and real-time system debugging</li>\n<li>Design test cases based on functional specifications and regulatory standards (e.g., x-NCAP, i-VISTA), execute SIL/HIL simulations, and deliver test reports while tracking issue resolution</li>\n<li>Develop automation scripts (Python/CAPL) to improve efficiency and integrate with CI/CD pipelines</li>\n<li>Identify algorithm or system defects in simulations, collaborate with R&amp;D to reproduce and resolve issues</li>\n</ul>\n<p><strong>What you need</strong></p>\n<ul>\n<li>Proficiency in VTD, Carmaker or other Simulink tools, with vehicle dynamics modeling experience</li>\n<li>Programming Skill: Strong skills in Python/C++/MATLAB for scripting and tool development</li>\n<li>Automotive Knowledge: Deep understanding of ADAS principles, sensor fusion (camera/radar/LiDAR), and communication protocols (CAN/in-vehicle Ethernet)</li>\n<li>Standards: Familiarity with x-NCAP and ADAS function regulations</li>\n</ul>","url":"https://yubhub.co/jobs/job_238b9e04-44a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Porsche Engineering Group 
GmbH","sameAs":"https://jobs.porsche.com","logo":"https://logos.yubhub.co/jobs.porsche.com.png"},"x-apply-url":"https://jobs.porsche.com/index.php?ac=jobad&id=19530","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Proficiency in VTD, Carmaker or other Simulink tools","Vehicle dynamics modeling experience","Programming Skill: Strong skills in Python/C++/MATLAB for scripting and tool development","Automotive Knowledge: Deep understanding of ADAS principles, sensor fusion (camera/radar/LiDAR), and communication protocols (CAN/in-vehicle Ethernet)","Standards: Familiarity with x-NCAP, and ADAS function regulations"],"x-skills-preferred":["Experience in applying deep learning to simulation testing (e.g., synthetic data generation)","Cloud-based simulation or large-scale parallel testing expertise","Knowledge of AUTOSAR architecture"],"datePosted":"2026-02-04T13:15:11.695Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Shanghai"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Automotive","skills":"Proficiency in VTD, Carmaker or other Simulink tools, Vehicle dynamics modeling experience, Programming Skill: Strong skills in Python/C++/MATLAB for scripting and tool development, Automotive Knowledge: Deep understanding of ADAS principles, sensor fusion (camera/radar/LiDAR), and communication protocols (CAN/in-vehicle Ethernet), Standards: Familiarity with x-NCAP, and ADAS function regulations, Experience in applying deep learning to simulation testing (e.g., synthetic data generation), Cloud-based simulation or large-scale parallel testing expertise, Knowledge of AUTOSAR architecture"}]}