{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/lidar"},"x-facet":{"type":"skill","slug":"lidar","display":"Lidar","count":11},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_dab43521-cfa"},"title":"Software Engineer, Robotics & Autonomous Systems","description":"<p>In this role, you&#39;ll be a key contributor building production systems for robotics data collection, model training pipelines, and evaluation infrastructure. 
You&#39;ll have the opportunity to own critical parts of our robotics platform, work directly with cutting-edge robotics and AV customers, and shape the future of embodied AI systems.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Owning and architecting large-scale data processing pipelines for robotics and autonomous vehicle datasets</li>\n<li>Building ML training and fine-tuning pipelines using Scale&#39;s robotics data</li>\n<li>Working across backend (Python, Node.js, C++) and frontend (React, TypeScript) stacks to build end-to-end solutions</li>\n<li>Developing tools and systems for robotics data collection, teleoperation, and model evaluation</li>\n<li>Interacting directly with robotics and AV stakeholders to understand their technical needs and drive product development</li>\n<li>Building real-time systems for robotic control, sensor fusion, and perception pipelines</li>\n<li>Designing comprehensive monitoring and evaluation frameworks for robotics models and data quality</li>\n<li>Collaborating with ML engineers and researchers to bring robotics research into production</li>\n<li>Delivering features at high velocity while maintaining system reliability and performance</li>\n</ul>\n<p>Ideally, you have:</p>\n<ul>\n<li>3+ years of software engineering experience in robotics, autonomous vehicles, or related fields</li>\n<li>Strong programming skills in Python and TypeScript/Node.js for production systems</li>\n<li>Experience with React and modern frontend development for 3D interfaces</li>\n<li>Practical experience with robotics frameworks (ROS/ROS2), simulation environments, or AV systems</li>\n<li>Understanding of distributed systems, workflow orchestration, and cloud infrastructure (AWS, Temporal, Kubernetes, Docker)</li>\n<li>Experience with databases (MongoDB, PostgreSQL) and data processing at scale</li>\n<li>Track record of working with cross-functional teams including ML engineers, researchers, and customers</li>\n<li>Strong communication 
skills and ability to operate with high autonomy</li>\n</ul>\n<p>Nice to have:</p>\n<ul>\n<li>Experience with C++</li>\n<li>Experience with robotics hardware platforms (robotic arms, mobile robots, perception systems) with a focus on time synchronization</li>\n<li>Background in computer vision, SLAM, motion planning, or imitation learning</li>\n<li>Familiarity with autonomous vehicle data, lidar technologies, or 3D data processing</li>\n<li>Experience with ML model deployment and serving frameworks</li>\n<li>Knowledge of teleoperation systems (ALOHA, UMI, hand tracking) or VR interfaces</li>\n<li>Experience with workflow orchestration systems (Temporal, Airflow)</li>\n<li>Published research or open-source contributions in robotics or autonomous systems</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_dab43521-cfa","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale","sameAs":"https://www.scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4618065005","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$180,000-$225,000 USD","x-skills-required":["Python","TypeScript","Node.js","React","C++","ROS/ROS2","simulation environments","AV systems","distributed systems","workflow orchestration","cloud infrastructure","databases","data processing"],"x-skills-preferred":["robotics hardware platforms","computer vision","SLAM","motion planning","imitation learning","autonomous vehicle data","lidar technologies","3D data processing","ML model deployment","serving frameworks","teleoperation systems","VR interfaces","workflow orchestration systems"],"datePosted":"2026-04-18T15:59:33.174Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco, 
CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, TypeScript, Node.js, React, C++, ROS/ROS2, simulation environments, AV systems, distributed systems, workflow orchestration, cloud infrastructure, databases, data processing, robotics hardware platforms, computer vision, SLAM, motion planning, imitation learning, autonomous vehicle data, lidar technologies, 3D data processing, ML model deployment, serving frameworks, teleoperation systems, VR interfaces, workflow orchestration systems","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":180000,"maxValue":225000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d6fc00c5-564"},"title":"Software Engineer, Robotics","description":"<p>We&#39;re seeking a skilled Software Engineer to join our Robotics business unit, focused on solving the data bottleneck in Physical AI across Robotics, Autonomous Vehicles, and Computer Vision. As a key contributor, you&#39;ll own and architect large-scale data processing pipelines, build ML training and fine-tuning pipelines, and develop tools and real-time systems for robotics data collection, teleoperation, model evaluation, data curation, and data annotation.</p>\n<p>In this role, you&#39;ll interact directly with robotics and AV stakeholders to understand their technical needs and drive product development. You&#39;ll also design comprehensive monitoring and evaluation frameworks for robotics models and data quality, and collaborate with ML engineers and researchers to bring robotics research into production.</p>\n<p>To succeed, you&#39;ll need at least 6 years of high-proficiency software engineering experience, with a strong background in complex systems and the ability to independently research, analyze, and unblock hard technical problems. 
You should have strong programming skills in Python and TypeScript/Node.js for production systems, experience with React and modern frontend development for 3D interfaces, and concurrent and real-time systems expertise.</p>\n<p>We&#39;re looking for someone who can deliver features at high velocity while maintaining system reliability and performance, and has a track record of working with cross-functional teams including ML engineers, researchers, and customers. Strong communication skills and the ability to operate with high autonomy are essential.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_d6fc00c5-564","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4612282005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","TypeScript/Node.js","React","Concurrent and real-time systems","Distributed systems","Workflow orchestration","Cloud infrastructure","Databases","Data processing at large scale"],"x-skills-preferred":["C++","Robotics hardware platforms","Computer vision","SLAM","Motion planning","Imitation learning","Autonomous vehicle data","Lidar technologies","3D data processing","ML model deployment and serving frameworks","Teleoperation systems","VR interfaces","Workflow orchestration systems"],"datePosted":"2026-04-18T15:59:19.712Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Argentina; Uruguay"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, TypeScript/Node.js, React, Concurrent and real-time systems, Distributed systems, Workflow orchestration, Cloud 
infrastructure, Databases, Data processing at large scale, C++, Robotics hardware platforms, Computer vision, SLAM, Motion planning, Imitation learning, Autonomous vehicle data, Lidar technologies, 3D data processing, ML model deployment and serving frameworks, Teleoperation systems, VR interfaces, Workflow orchestration systems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ef6605f2-fe0"},"title":"Software Engineer, Robotics","description":"<p>We&#39;re looking for a skilled Software Engineer to join our Robotics business unit. As a key contributor, you&#39;ll build production systems for robotics data collection, model training pipelines, and evaluation infrastructure. You&#39;ll have the opportunity to own critical parts of our robotics platform, work directly with cutting-edge robotics and AV customers, and shape the future of embodied AI systems.</p>\n<p>Your responsibilities will include:</p>\n<ul>\n<li>Owning and architecting large-scale data processing pipelines for robotics and autonomous vehicle datasets</li>\n<li>Building ML training and fine-tuning pipelines using Scale&#39;s robotics data</li>\n<li>Working across backend (Python, Node.js, C++) and frontend (React, TypeScript) stacks to build end-to-end solutions</li>\n<li>Developing tools and real-time systems for robotics data collection, teleoperation, model evaluation, data curation, and data annotation</li>\n<li>Interacting directly with robotics and AV stakeholders to understand their technical needs and drive product development</li>\n<li>Designing comprehensive monitoring and evaluation frameworks for robotics models and data quality</li>\n</ul>\n<p>Ideal candidates will have:</p>\n<ul>\n<li>3+ years of high-proficiency software engineering experience, with a strong background in complex systems and the ability to independently research, analyze, and unblock hard technical problems</li>\n<li>Strong programming skills in 
Python and TypeScript/Node.js for production systems</li>\n<li>Experience with React and modern frontend development for 3D interfaces</li>\n<li>Experience with concurrent and real-time systems, with special attention to timing constraints</li>\n<li>Understanding of distributed systems, workflow orchestration, and cloud infrastructure (AWS, Temporal, Kubernetes, Docker)</li>\n<li>Experience with databases (MongoDB, PostgreSQL) and data processing at large scale</li>\n<li>Track record of working with cross-functional teams including ML engineers, researchers, and customers</li>\n<li>Strong communication skills and ability to operate with high autonomy</li>\n</ul>\n<p>Nice to have:</p>\n<ul>\n<li>Experience with C++</li>\n<li>Experience with robotics hardware platforms (robotic arms, mobile robots, perception systems) with a focus on time synchronization</li>\n<li>Background in computer vision, SLAM, motion planning, or imitation learning</li>\n<li>Familiarity with autonomous vehicle data, lidar technologies, or 3D data processing</li>\n<li>Experience with ML model deployment and serving frameworks</li>\n<li>Knowledge of teleoperation systems (ALOHA, UMI, hand tracking) or VR interfaces</li>\n<li>Experience with workflow orchestration systems (Temporal, Airflow)</li>\n<li>Published research or open-source contributions in robotics or autonomous systems</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ef6605f2-fe0","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Scale","sameAs":"https://scale.com/","logo":"https://logos.yubhub.co/scale.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/scaleai/jobs/4655050005","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Python","TypeScript","Node.js","C++","React","Distributed systems","Workflow 
orchestration","Cloud infrastructure","Databases","Data processing"],"x-skills-preferred":["Robotics hardware platforms","Computer vision","SLAM","Motion planning","Imitation learning","Autonomous vehicle data","Lidar technologies","3D data processing","ML model deployment","Serving frameworks","Teleoperation systems","VR interfaces","Workflow orchestration systems"],"datePosted":"2026-04-18T15:58:47.535Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Mexico City, MX"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, TypeScript, Node.js, C++, React, Distributed systems, Workflow orchestration, Cloud infrastructure, Databases, Data processing, Robotics hardware platforms, Computer vision, SLAM, Motion planning, Imitation learning, Autonomous vehicle data, Lidar technologies, 3D data processing, ML model deployment, Serving frameworks, Teleoperation systems, VR interfaces, Workflow orchestration systems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ce50312d-7f5"},"title":"Optical Systems Engineer","description":"<p>Anduril Industries is a defence technology company with a mission to transform U.S. and allied military capabilities with advanced technology. We are seeking a Systems Engineer to join our Imaging team, which develops state-of-the-art imaging systems across both hardware and software, deployed to tackle the most significant security challenges of America and its allies.</p>\n<p>The successful candidate will be responsible for defining active EO/IR system architectures, performing radiometric performance modeling, and laboratory, ground, and flight testing. 
They will also model atmospheric and environmental effects for land, sea, and air targets, and leverage their analyses to inform system-level architectural trades.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Defining system architectures via detailed trade studies and analyses</li>\n<li>Adapting or deriving physics-based models of sensor system performance and deriving requirements based on those models</li>\n<li>Taking ownership over the development of prototype sensor systems in a lab environment</li>\n<li>Supporting both laboratory-based and field-based experiments and demonstrations</li>\n<li>Engaging technically with the hardware, software, and mission teams to define, integrate, and deliver high-performance systems to our customers</li>\n<li>Validating models against laboratory and real-world data</li>\n<li>Coordinating and communicating with both external &amp; internal stakeholders with varying degrees of technical background</li>\n</ul>\n<p>Required qualifications include:</p>\n<ul>\n<li>Model development and validation in one or more domains relevant to remote sensing</li>\n<li>Ability to decompose complex systems into functional parts, with clearly defined interfaces and requirements for each</li>\n<li>Experience in system design and development from initial concept through test and delivery to customer</li>\n<li>Strong laboratory skills (rapid prototyping, design of experiments) and familiarity with opto-electrical test and measurement equipment</li>\n<li>Willingness to travel to test sites, as needed</li>\n<li>Must be able to obtain and hold a U.S. 
security clearance</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Knowledge of current and next-generation advanced sensing systems and platforms</li>\n<li>Active and passive sensing domain experience (e.g., LiDAR, EO/IR, Radar)</li>\n<li>Experience with laser systems, including opto-mechanical scanning and beam steering techniques</li>\n<li>Knowledge of electro-mechanical control systems, including their performance requirements and verification methodologies</li>\n<li>Experience with LiDAR/Radar signal processing techniques, and knowledge of their implementation on real-time embedded platforms (FPGA, MCU, etc.)</li>\n<li>Experience with signature and/or scenario modeling/software packages (MODTRAN, SPIRITS, AFSIM)</li>\n</ul>\n<p>Salary: $166,000 - $220,000 USD per year, plus highly competitive equity grants and top-tier benefits for full-time employees, including comprehensive medical, dental, and vision plans, income protection, generous time off, family planning and parenting support, mental health resources, professional development, commuter benefits, relocation assistance, and a retirement savings plan.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ce50312d-7f5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5037784007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$166,000 - $220,000 USD per year","x-skills-required":["Model development and validation","System design and development","Laboratory skills","Opto-electrical test and measurement equipment","U.S. 
security clearance"],"x-skills-preferred":["Current and next-generation advanced sensing systems and platforms","Active and passive sensing domain experience","Laser systems","Electro-mechanical control systems","LiDAR/Radar signal processing techniques"],"datePosted":"2026-04-18T15:58:12.732Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Lexington, Massachusetts, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Model development and validation, System design and development, Laboratory skills, Opto-electrical test and measurement equipment, U.S. security clearance, Current and next-generation advanced sensing systems and platforms, Active and passive sensing domain experience, Laser systems, Electro-mechanical control systems, LiDAR/Radar signal processing techniques","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":166000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_eccf1031-6f3"},"title":"Senior Computer Vision Engineer, Space","description":"<p>We are seeking a Senior Computer Vision Engineer to join our rapidly growing team in Washington DC. 
The ideal candidate will have a strong background in computer vision and machine learning, with experience in developing and implementing computer vision algorithms for various spacecraft efforts in all orbital regimes.</p>\n<p>The Senior Computer Vision Engineer will be responsible for proposing and prototyping innovative solutions to solve real-world problems, developing and maintaining core libraries and runtime applications, integrating classical and geometric methods in computer vision with ML methods, and working with space vehicle CV software and hardware subsystems for various spacecraft efforts in all orbital regimes.</p>\n<p>The successful candidate will have a Master&#39;s or Ph.D. in Machine Learning, Robotics, or Computer Science, with a strong background in computer vision and machine learning. They will also have experience in one or more of the following: object detection, object tracking, instance segmentation, semantic segmentation, semantic change detection, natural feature tracking (NFT), visual odometry, SLAM, multi-view geometry, structure from motion, 3D geometry, discriminative correlation filters, stereo, neural 3D reconstruction, multi-band sensor processing, RGB-D and LIDAR sensor fusion.</p>\n<p>The Senior Computer Vision Engineer will work closely with related teams, including Sensors, GNC, Avionics, Systems, Flight Software, Mission Operations, and Ground Software, to develop and implement computer vision algorithms for various spacecraft efforts in all orbital regimes.</p>\n<p>The ideal candidate will have excellent communication and organizational skills, including documentation and training material, and will be able to work effectively in a fast-paced environment with tight deadlines.</p>\n<p>The salary range for this role is $191,000-$253,000 USD, and highly competitive equity grants are included in the majority of full-time offers.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_eccf1031-6f3","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.andurilindustries.com/","logo":"https://logos.yubhub.co/andurilindustries.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5016343007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$191,000-$253,000 USD","x-skills-required":["Machine Learning","Robotics","Computer Science","Computer Vision","Object Detection","Object Tracking","Instance Segmentation","Semantic Segmentation","Semantic Change Detection","Natural Feature Tracking (NFT)","Visual Odometry","SLAM","Multi-view Geometry","Structure from Motion","3D Geometry","Discriminative Correlation Filters","Stereo","Neural 3D Reconstruction","Multi-band Sensor Processing","RGB-D and LIDAR Sensor Fusion"],"x-skills-preferred":["Matlab","Simulink","Python","Go","C++","Linux systems","OpenCV","NFT"],"datePosted":"2026-04-18T15:55:25.839Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, District of Columbia, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Machine Learning, Robotics, Computer Science, Computer Vision, Object Detection, Object Tracking, Instance Segmentation, Semantic Segmentation, Semantic Change Detection, Natural Feature Tracking (NFT), Visual Odometry, SLAM, Multi-view Geometry, Structure from Motion, 3D Geometry, Discriminative Correlation Filters, Stereo, Neural 3D Reconstruction, Multi-band Sensor Processing, RGB-D and LIDAR Sensor Fusion, Matlab, Simulink, Python, Go, C++, Linux systems, OpenCV, 
NFT","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_cb57c7a1-7e6"},"title":"Senior Computer Vision Engineer, Space","description":"<p>We are looking for a Senior Computer Vision Engineer to join our rapidly growing team in Costa Mesa, CA. In this role, you will be responsible for working on and understanding the design of all perception subsystems to include but not limited to hardware sensors and advanced processing platforms, navigation algorithms, flight software implementation, subsystem integration &amp; test (I&amp;T), and vehicle I&amp;T.</p>\n<p>The computer vision engineering team will work closely with related teams, including Sensors, GNC, Avionics, Systems, Flight Software, Mission Operations, and Ground Software. The Computer Vision Engineer will work on algorithm design, truth and physics modeling, scene rendering, simulation and analysis for a wide variety of spacecraft and space missions to include but not limited to LEO, MEO, GEO, Reentry, and RPOD (Rendezvous Proximity Operations and Docking).</p>\n<p>The CV Engineer will help lead successful implementation, validation and CV operations of Anduril’s fleet of space vehicles. This role is directly tied to ongoing, funded programs within Anduril’s Space Business Line. The programs require building and fielding resilient, software-defined spacecraft systems across numerous mission threads. We work with mission partners and customers to deploy reliable and robust capabilities on operationally-relevant fielding timelines to meet complex challenges across the DOD and IC.</p>\n<p>The position requires a strong background in computer vision, machine learning, and software development, with experience in developing and implementing computer vision algorithms for space-based applications. 
The ideal candidate will have a deep understanding of computer vision principles, including object detection, tracking, and recognition, as well as experience with software development in languages such as C++ and Python.</p>\n<p>Key Responsibilities:</p>\n<ul>\n<li>Propose and prototype innovative solutions to solve real-world problems, leveraging the latest state-of-the-art techniques in the field</li>\n<li>Develop and maintain core libraries and runtime applications</li>\n<li>Integrate classical and geometric methods in computer vision with ML methods</li>\n<li>Work on space vehicle CV software and hardware subsystems for various spacecraft efforts in all orbital regimes and work closely with partners for successful implementation</li>\n<li>Develop modern, software-defined approaches to autonomous spacecraft operations with maneuvering capabilities to successfully accomplish mission objectives</li>\n<li>Develop appropriate test plans and procedures to validate the CV system during ground checkout, on-orbit commissioning and operations</li>\n<li>Collaborate across multiple teams to plan, build, and test complex functionality</li>\n<li>Coordinate with end-users, other operators and customers to turn needs into features while balancing user experience with engineering constraints</li>\n<li>Support challenging schedules during ground testing, launch windows and on-orbit operations of the spacecraft systems</li>\n<li>Design of flight software and firmware, algorithms, and simulation products</li>\n<li>Development of space vehicle autonomy tools for dynamic space operations</li>\n<li>Test process development and execution</li>\n<li>Define automated fault detection and responses</li>\n<li>Provide hardware-in-the-loop and Monte Carlo simulation capabilities</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li>MS or PhD in Machine Learning, Robotics or Computer Science, Image Science with emphasis on Computer Vision</li>\n<li>BS in Computer Science, Machine Learning, 
Electrical Engineering, or related field</li>\n<li>Advanced professional experience developing and benchmarking ML algorithms on large-scale datasets</li>\n<li>High proficiency in C++ development in a Linux environment</li>\n<li>Experience in one or more of the following: Object Detection, Object Tracking, Instance Segmentation, Semantic Segmentation, Semantic Change Detection, Natural Feature Tracking (NFT)</li>\n<li>Experience in one or more of the following: Visual Odometry, SLAM, Multi-view Geometry, Structure from Motion, 3D Geometry, Discriminative Correlation Filters, Stereo, Neural 3D Reconstruction, Multi-band sensor processing, RGB-D and LIDAR Sensor Fusion</li>\n<li>Ability to quickly understand and navigate complex systems and detailed requirements</li>\n<li>Familiarity with terminal guidance, rendezvous proximity operations and docking, orbital mechanics with propulsive spacecraft, and/or spacecraft/missile GNC</li>\n<li>Clear communication and organizational skills including documentation and training material</li>\n<li>Currently possesses and is able to maintain an active U.S. 
Top Secret security clearance</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Experience with Matlab, Simulink, Python, Go, C++ and/or Linux systems</li>\n<li>A desire to work on critical software and hardware designs in the space domain</li>\n<li>Strong preference for candidates with experience in computer vision, vision navigation, image processing, feature tracking, SLAM, OpenCV, and NFT</li>\n<li>Experience testing CV subsystems in laboratory environments that mimic the space environmental constraints</li>\n<li>Experience with orbital mechanics and resident space object tracking capabilities</li>\n<li>Experience conducting spacecraft operations and satellite command and control with an emphasis on system reliability and uptime</li>\n<li>Experience with testing/validation leveraging FlatSats, Hardware-in-the-Loop testbeds and digital spacecraft simulators through nominal and fault scenarios</li>\n<li>Experience with computer vision and perception algorithms to support GNC operations</li>\n<li>Experience developing 3-DOF simplified and 6-DOF high-fidelity dynamics simulation models used for GNC systems analysis and validation</li>\n<li>Exposure to US satellite operations policy and constraints for relevant mission threads in all orbits</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_cb57c7a1-7e6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5016340007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$191,000-$253,000 USD","x-skills-required":["Machine Learning","Robotics","Computer Science","Image Science","C++","Python","Object Detection","Object Tracking","Instance 
Segmentation","Semantic Segmentation","Semantic Change Detection","Natural Feature Tracking (NFT)","Visual Odometry","SLAM","Multi-view Geometry","Structure from Motion","3D Geometry","Discriminative Correlation Filters","Stereo","Neural 3D Reconstruction","Multi-band sensor processing","RGB-D and LIDAR Sensor Fusion"],"x-skills-preferred":["Matlab","Simulink","Go","Linux systems","Computer vision","Vision navigation","Image processing","Feature tracking","OpenCV","NFT"],"datePosted":"2026-04-18T15:52:59.001Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Machine Learning, Robotics, Computer Science, Image Science, C++, Python, Object Detection, Object Tracking, Instance Segmentation, Semantic Segmentation, Semantic Change Detection, Natural Feature Tracking (NFT), Visual Odometry, SLAM, Multi-view Geometry, Structure from Motion, 3D Geometry, Discriminative Correlation Filters, Stereo, Neural 3D Reconstruction, Multi-band sensor processing, RGB-D and LIDAR Sensor Fusion, Matlab, Simulink, Go, Linux systems, Computer vision, Vision navigation, Image processing, Feature tracking, OpenCV, NFT","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9406d4ff-94e"},"title":"Robotics Engineer, Maritime","description":"<p>We are seeking a Robotics Engineer to contribute to the delivery of vehicle perception and planning capability integrated into our products. 
This includes systems analysis, sensor selection and integration, perception architecture and implementation, motion planning, health management, behavior analysis, simulation and test infrastructure, and interfaces with lower- and higher-level systems.</p>\n<p>As a Robotics Engineer, you will be responsible for implementing trusted safe navigation, collision avoidance, and situational awareness systems that balance constraints, restrictions, and requirements in a multi-stakeholder environment. You will also implement scalable sub-systems including sensor processing, perception, tracking, motion planning, health management, anomaly detection, simulation, testing fixtures, and vehicle interfaces.</p>\n<p>In addition, you will contribute to the development of existing software components across Anduril, with the aim of developing components that are reusable across multiple Anduril product lines. You will utilize advanced techniques in computer vision, sensor fusion, and machine learning to enhance perception and planning capabilities of Anduril vehicles.</p>\n<p>You will conduct thorough testing and validation of perception and planning algorithms through development and use of simulation and analysis of data from real-world experiments, including evolving the data analysis tooling. You will also collaborate with cross-functional teams, including software engineers, mechanical engineers, and systems engineers, to ensure effective system integration and testing.</p>\n<p>This role requires a strong background in robotics, mechatronics, computer science, or engineering, with experience in C++ and/or Python software development. You should have familiarity with autonomous vehicle hardware and sensors such as radar, sonar, LIDAR, and cameras. 
Demonstrated knowledge of at least one of computer vision, sensor fusion, SLAM, motion planning, or machine learning is required.</p>\n<p>Experience in a senior perception or planning role for the delivery of a robotic system is preferred. You should have the capacity to act as the technical owner for a system, including stakeholder engagement, requirements definition, roadmap management, team coordination, design, implementation, sustainment, and evolution. Ability to collaborate with stakeholders to define and implement robust validation and verification strategies for perception and planning modules is also required.</p>\n<p>Eligibility to obtain and maintain an active U.S. Secret security clearance is necessary for this role.</p>","url":"https://yubhub.co/jobs/job_9406d4ff-94e","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.andurilindustries.com/","logo":"https://logos.yubhub.co/andurilindustries.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5051580007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$191,000-$253,000 USD","x-skills-required":["C++","Python","Autonomous vehicle hardware","Radar","Sonar","LIDAR","Cameras","Computer vision","Sensor fusion","SLAM","Motion planning","Machine learning"],"x-skills-preferred":["Simulation tools and frameworks","Safety standards and certification processes for autonomous systems"],"datePosted":"2026-04-18T15:50:59.724Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"C++, Python, Autonomous vehicle hardware, Radar, Sonar, LIDAR, Cameras, Computer vision, 
Sensor fusion, SLAM, Motion planning, Machine learning, Simulation tools and frameworks, Safety standards and certification processes for autonomous systems","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_126e36d8-668"},"title":"Perception Engineering Intern","description":"<p>We are seeking a perception engineer with a strong background in computer vision to join our rapidly growing team in Costa Mesa, CA. In this role, you will be at the forefront of developing advanced perception systems for complex autonomous aerial platforms.</p>\n<p>Your expertise in computer vision algorithms, combined with your understanding of robotics principles, will be crucial in solving a wide variety of challenges involving visual perception, SLAM, motion planning, controls, and state estimation. This role requires not only technical expertise in computer vision and robotics but also the ability to make pragmatic engineering tradeoffs, considering the unique constraints of aerial platforms.</p>\n<p>Your work will directly contribute to the seamless integration of Anduril&#39;s products, achieving critical outcomes in autonomous operations. 
This position demands strong systems-level knowledge and experience, as you&#39;ll be working at the intersection of computer vision, robotics, and autonomous systems.</p>\n<p>If you are passionate about pushing the boundaries of computer vision in robotics, possess a &#39;Whatever It Takes&#39; mindset, and can execute in an expedient, scalable, and pragmatic way while keeping the mission top-of-mind and making sound engineering decisions, then this role is for you.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Work at the intersection of 3D perception and computer vision, developing robust algorithms that power real-time decision-making for autonomous aerial systems.</li>\n<li>Develop and implement advanced structure from motion and SLAM algorithms to create accurate 3D models from multiple camera inputs in real time.</li>\n<li>Integrate perception outputs with path planning algorithms to enable autonomous navigation in complex, unstructured environments.</li>\n<li>Design experiments and data collection efforts, and curate training/evaluation sets to develop insights for both internal purposes and customers.</li>\n<li>Collaborate closely with robotics, software, and hardware teams to integrate perception algorithms into autonomous aerial systems.</li>\n<li>Work with vendors and government stakeholders to advance the state of the art in perception and world modeling for autonomous aerial systems.</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li>BS in Robotics, Computer Science, Mechatronics, Electrical Engineering, Mechanical Engineering, or a related field.</li>\n<li>Strong knowledge of 3D computer vision concepts, including multi-view geometry, camera models, photogrammetry, depth estimation, and 3D reconstruction techniques.</li>\n<li>Fluency in standard domain libraries (numpy, opencv, pytorch, etc.).</li>\n<li>Proven understanding of data structures, algorithms, concurrency, and code optimization.</li>\n<li>Experience working with Python, PyTorch, or C++ programming languages.</li>\n<li>Experience deploying software to end customers, internal or external.</li>\n<li>Must be willing to travel 25%.</li>\n<li>Eligible to obtain an active U.S. Secret security clearance.</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>MS or PhD in Robotics, Computer Science, Engineering, or a related field.</li>\n<li>Experience with perception systems for aerial robotics or other highly dynamic platforms.</li>\n<li>Experience with real-world sensor integrations, including LiDAR, RGB-D cameras, IR cameras, stereo cameras, or TOF cameras.</li>\n<li>Experience with GPU/CUDA programming for accelerated computer vision processing.</li>\n<li>Knowledge of path planning algorithms and their integration with perception systems in dynamic environments.</li>\n</ul>","url":"https://yubhub.co/jobs/job_126e36d8-668","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/4830032007","x-work-arrangement":"onsite","x-experience-level":"entry","x-job-type":"internship","x-salary-range":null,"x-skills-required":["computer vision","robotics","Python","PyTorch","C++","numpy","opencv","data structures","algorithms","concurrency","code optimization"],"x-skills-preferred":["perception systems","aerial robotics","LiDAR","RGB-D cameras","IR cameras","stereo cameras","TOF cameras","GPU","CUDA"],"datePosted":"2026-04-18T15:48:07.380Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, 
California, United States"}},"employmentType":"INTERN","occupationalCategory":"Engineering","industry":"Technology","skills":"computer vision, robotics, Python, PyTorch, C++, numpy, opencv, data structures, algorithms, concurrency, code optimization, perception systems, aerial robotics, LiDAR, RGB-D cameras, IR cameras, stereo cameras, TOF cameras, GPU, CUDA"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_74277a57-9bf"},"title":"AI Sorcerer","description":"<p>JOB TITLE: AI Sorcerer | LOCATION: Costa Mesa, California, United States | DEPARTMENT: Air Dominance &amp; Strike : Mission Autonomy Engineering : Mission Software Engineering</p>\n<p>Anduril Industries is a defense technology company with a mission to transform U.S. and allied military capabilities with advanced technology. By bringing the expertise, technology, and business model of the 21st century’s most innovative companies to the defense industry, Anduril is changing how military systems are designed, built, and sold.</p>\n<p>As the world enters an era of strategic competition, Anduril is committed to bringing cutting-edge autonomy, AI, computer vision, sensor fusion, and networking technology to the military in months, not years. Anduril is developing and deploying an integrated system to make UxVs highly effective through the entire mission lifecycle. From fleet management to mission planning, live execution, and integrations into other DOD systems, Lattice is enabling the military to fight at scale with robots across all domains.</p>\n<p>Your job is to design, implement, and evolve the means by which Agents interact with classical software systems. You will contribute to test and eval infrastructure, deployment software, agent tool calling, prompt engineering, and other agent techniques in order to achieve best-in-class performance with frontier and edge deployed models. 
You will increase the performance and quality of agentic interaction patterns, and you will expand the surface area by which Agents can interact with the overall Lattice system.</p>\n<p>Required Qualifications:</p>\n<ul>\n<li>Demonstrated deep knowledge of the implementation and use of LLMs</li>\n<li>Experience integrating agents into classical systems</li>\n<li>Strong programming skills in Python or other core languages (Java, Go, etc.)</li>\n<li>PhD or Master’s degree in Computer Science, Robotics, Machine Learning, or a related field, or equivalent practical experience.</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Experience with simulation or real-world validation for autonomous vehicles is highly desirable</li>\n<li>Novel application track record and experience including first-author publications, participation in peer-reviewed conferences, contribution to open-source projects, and demonstrated contribution to the ML and AI community.</li>\n<li>Experience in multi-modal sensor data processing (e.g., cameras, LiDAR, radar).</li>\n<li>Familiarity with ML Ops best practices, including model versioning and reproducible research pipelines.</li>\n<li>Familiarity with C/C++ is a plus.</li>\n<li>General software engineering experience solving motion planning or related robotics problems.</li>\n</ul>\n<p>US Salary Range $191,000-$253,000 USD</p>\n<p>The salary range for this role is an estimate based on a wide range of compensation factors, inclusive of base salary only. Actual salary offer may vary based on (but not limited to) work experience, education and/or training, critical skills, and/or business considerations. 
Highly competitive equity grants are included in the majority of full time offers; and are considered part of Anduril&#39;s total compensation package.</p>\n<p>Additional Benefits:</p>\n<ul>\n<li>Healthcare Benefits - US Roles: Comprehensive medical, dental, and vision plans at little to no cost to you.</li>\n<li>UK &amp; AUS Roles: We cover full cost of medical insurance premiums for you and your dependents.</li>\n<li>IE Roles: We offer an annual contribution toward your private health insurance for you and your dependents.</li>\n<li>Income Protection: Anduril covers life and disability insurance for all employees.</li>\n<li>Generous time off: Highly competitive PTO plans with a holiday hiatus in December.</li>\n<li>Caregiver &amp; Wellness Leave is available to care for family members, bond with a new baby, or address your own medical needs.</li>\n<li>Family Planning &amp; Parenting Support: Coverage for fertility treatments (e.g., IVF, preservation), adoption, and gestational carriers, along with resources to support you and your partner from planning to parenting.</li>\n<li>Mental Health Resources: Access free mental health resources 24/7, including therapy and life coaching.</li>\n<li>Additional work-life services, such as legal and financial support, are also available.</li>\n<li>Professional Development: Annual reimbursement for professional development.</li>\n<li>Commuter Benefits: Company-funded commuter benefits based on your region.</li>\n<li>Relocation Assistance: Available depending on role eligibility.</li>\n<li>Retirement Savings Plan - US Roles: Traditional 401(k), Roth, and after-tax (mega backdoor Roth) options.</li>\n<li>UK &amp; IE Roles: Pension plan with employer match.</li>\n<li>AUS Roles: Superannuation plan.</li>\n</ul>\n<p>Protecting Yourself from Recruitment Scams Anduril is committed to maintaining the integrity of our Talent acquisition process and the security of our candidates. 
We&#39;ve observed a rise in sophisticated phishing and fraudulent schemes where individuals impersonate Anduril representatives, luring job seekers with false interviews or job offers. These scammers often attempt to extract payment or sensitive personal information.</p>\n<p>To ensure your safety and help you navigate your job search with confidence, please keep the following critical points in mind:</p>\n<ul>\n<li>No Financial Requests: Anduril will never solicit payment or demand personal financial details (such as banking information, credit card numbers, or social security numbers) at any stage of our hiring process. Our legitimate recruitment is entirely free for candidates.</li>\n<li>Please always verify communications:</li>\n<li>Direct from Anduril: If you receive an email from one of our recruiters, it will only come from an @anduril.com address.</li>\n<li>Via Agency Partner: If contacted by a recruiting agency for an Anduril role, their email will clearly identify their agency.</li>\n<li>Exercise Caution with Unsolicited Outreach: If you receive any communication that appears suspicious, contains grammatical errors, or makes unusual requests, do not engage.</li>\n<li>Always confirm the sender&#39;s email domain is @anduril.com before providing any personal information or clicking on links.</li>\n<li>What to Do If You Suspect Fraud: Should you encounter any questionable or fraudulent outreach claiming to be from Anduril, please report it immediately to contact@anduril.com.</li>\n</ul>\n<p>Data Privacy: To view Anduril&#39;s candidate data privacy policy, please visit https://anduril.com/applicant-privacy-notice.</p>","url":"https://yubhub.co/jobs/job_74277a57-9bf","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://anduril.com","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5101597007","x-work-arrangement":"onsite","x-experience-level":null,"x-job-type":"full-time","x-salary-range":"$191,000-$253,000 USD","x-skills-required":["LLMs","Python","Java","Go","Computer Science","Robotics","Machine Learning"],"x-skills-preferred":["Simulation","Real-world validation","Autonomous vehicles","Multi-modal sensor data processing","Cameras","LiDAR","Radar","ML Ops","Model versioning","Reproducible research pipelines","C/C++","Motion planning","Robotics problems"],"datePosted":"2026-04-18T15:45:45.554Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"LLMs, Python, Java, Go, Computer Science, Robotics, Machine Learning, Simulation, Real-world validation, Autonomous vehicles, Multi-modal sensor data processing, Cameras, LiDAR, Radar, ML Ops, Model versioning, Reproducible research pipelines, C/C++, Motion planning, Robotics problems","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_6170c199-145"},"title":"Generative AI Integration Engineer","description":"<p>Anduril Industries is seeking a Generative AI Integration Engineer to join their team. As a key member of the Mission Autonomy Engineering team, you will design, implement, and evolve the means by which Agents interact with classical software systems. 
Your primary responsibility will be to contribute to test and eval infrastructure, deployment software, agent tool calling, prompt engineering, and other agent techniques in order to achieve best-in-class performance with frontier and edge deployed models.</p>\n<p>The ideal candidate will have a strong background in computer science, robotics, machine learning, or a related field, and experience integrating agents into classical systems. They will also have a proven track record of novel application and experience including first-author publications, participation in peer-reviewed conferences, contribution to open-source projects, and demonstrated contribution to the ML and AI community.</p>\n<p>In addition to your technical expertise, you will be a strong communicator and collaborator, able to work effectively with cross-functional teams to drive results. You will also be comfortable working in a fast-paced environment and adapting to changing priorities.</p>\n<p>If you are a motivated and experienced engineer looking to join a dynamic team and contribute to cutting-edge technology, we encourage you to apply.</p>","url":"https://yubhub.co/jobs/job_6170c199-145","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://anduril.com","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/4792948007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$191,000-$253,000 USD","x-skills-required":["Deep knowledge of the implementation and use of LLMs","Experience integrating agents into classical systems","Strong programming skills in Python or other core languages (Java, Go etc)","PhD or Master's degree in Computer Science, Robotics, Machine Learning, or a related field, or 
equivalent practical experience"],"x-skills-preferred":["Experience with simulation or real-world validation for autonomous vehicles","Novel application track record and experience including first author publications, participation in peer reviewed conferences, contribution to open source projects, and demonstrated contribution to the ML and AI community","Experience in multi-modal sensor data processing (e.g., cameras, LiDAR, radar)","Familiarity with ML Ops best practices, including model versioning and reproducible research pipelines","Familiarity with C/C++"],"datePosted":"2026-04-18T15:42:55.696Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Deep knowledge of the implementation and use of LLMs, Experience integrating agents into classical systems, Strong programming skills in Python or other core languages (Java, Go etc), PhD or Master's degree in Computer Science, Robotics, Machine Learning, or a related field, or equivalent practical experience, Experience with simulation or real-world validation for autonomous vehicles, Novel application track record and experience including first author publications, participation in peer reviewed conferences, contribution to open source projects, and demonstrated contribution to the ML and AI community, Experience in multi-modal sensor data processing (e.g., cameras, LiDAR, radar), Familiarity with ML Ops best practices, including model versioning and reproducible research pipelines, Familiarity with C/C++","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":191000,"maxValue":253000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_edaaa5b1-6da"},"title":"Perception Engineer","description":"<p>We are 
seeking a Perception Engineer to play a pivotal role in designing, developing, and implementing perception systems for our autonomous surface vessels.</p>\n<p>Our team is focused on making boats go and perform tasks with no human involvement. This job is available at multiple levels, including entry, senior, and staff.</p>\n<p>The successful candidate will develop algorithms and models which allow boats to sense and navigate, as well as develop metrics which allow quantitative analysis of improvements and regressions in boat performance.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Develop algorithms and models which allow boats to sense and navigate</li>\n<li>Develop metrics which allow quantitative analysis of improvements and regressions in boat performance</li>\n<li>Analyze and work with large data systems to enable model training and evaluation</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Strong programming fundamentals</li>\n<li>Extensive programming experience and demonstrated ability to work on large systems</li>\n<li>Computing Fundamentals</li>\n<li>A general understanding of operating systems and/or similar large-scale systems</li>\n<li>An understanding of basic computer architecture</li>\n<li>A demonstrated willingness to learn and pivot based on new information</li>\n</ul>\n<p>Useful Skills:</p>\n<ul>\n<li>Familiarity with deep learning frameworks (e.g., TensorFlow, PyTorch)</li>\n<li>Understanding of various filters and their applications</li>\n<li>Proficiency in Rust</li>\n<li>Experience with maritime or autonomous vehicle projects</li>\n<li>Experience with signals processing or sensor fusion</li>\n<li>Experience with low latency inference and tracking pipelines</li>\n<li>Experience with path planning algorithms</li>\n<li>Experience training and deploying multi-modal models</li>\n<li>Experience with various sensors including radar, cameras, and lidar</li>\n<li>Experience developing and optimizing deployed ML systems</li>\n</ul>\n<p>Physical Demands:</p>\n<ul>\n<li>Prolonged periods of sitting at a desk and working on a computer</li>\n<li>Occasional standing and walking within the office</li>\n<li>Manual dexterity to operate a computer keyboard, mouse, and other office equipment</li>\n<li>Visual acuity to read screens, documents, and reports</li>\n<li>Occasional reaching, bending, or stooping to access file drawers, cabinets, or office supplies</li>\n<li>Lifting and carrying items up to 20 pounds occasionally (e.g., office supplies, packages)</li>\n</ul>\n<p>Benefits:</p>\n<ul>\n<li>Medical Insurance: Comprehensive health insurance plans covering a range of services</li>\n<li>Dental and Vision Insurance: Coverage for routine dental check-ups, orthodontics, and vision care</li>\n<li>Saronic pays 100% of the premium for employees and 80% for dependents</li>\n<li>Time Off: Generous PTO and Holidays</li>\n<li>Parental Leave: Paid maternity and paternity leave to support new parents</li>\n<li>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses</li>\n<li>Retirement Plan: 401(k) plan</li>\n<li>Stock Options: Equity options to give employees a stake in the company’s success</li>\n<li>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage</li>\n<li>Additional Perks: Free lunch benefit and unlimited free drinks and snacks in the office</li>\n</ul>\n<p>Additional Information:</p>\n<p>This role requires access to export-controlled information or items that require “U.S. Person” status. As defined by U.S. law, individuals who are any one of the following are considered to be a “U.S. Person”: (1) U.S. citizens, (2) legal permanent residents (a.k.a. green card holders), and (3) certain protected classes of asylees and refugees, as defined in 8 U.S.C. 
1324b(a)(3).</p>","url":"https://yubhub.co/jobs/job_edaaa5b1-6da","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Saronic Technologies","sameAs":"https://www.saronictechnologies.com/","logo":"https://logos.yubhub.co/saronictechnologies.com.png"},"x-apply-url":"https://jobs.lever.co/saronic/30af5320-d158-4127-969f-de7ee92504ce","x-work-arrangement":"onsite","x-experience-level":"entry|senior|staff","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Strong programming fundamentals","Extensive programming experience and demonstrated ability to work on large systems","Computing Fundamentals","Understanding of basic computer architecture","Familiarity with deep learning frameworks (e.g., TensorFlow, PyTorch)","Proficiency in Rust","Experience with maritime or autonomous vehicle projects","Experience with signals processing or sensor fusion","Experience with low latency inference and tracking pipelines","Experience with path planning algorithms","Experience training and deploying multi-modal models","Experience with various sensors including radar, cameras, and lidar","Experience developing and optimizing deployed ML systems"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:56:56.687Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"London"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Strong programming fundamentals, Extensive programming experience and demonstrated ability to work on large systems, Computing Fundamentals, Understanding of basic computer architecture, Familiarity with deep learning frameworks (e.g., TensorFlow, PyTorch), Proficiency in Rust, Experience with maritime or autonomous vehicle projects, Experience with signals processing or sensor fusion, Experience with low latency inference 
and tracking pipelines, Experience with path planning algorithms, Experience training and deploying multi-modal models, Experience with various sensors including radar, cameras, and lidar, Experience developing and optimizing deployed ML systems"}]}