{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/sensing-systems"},"x-facet":{"type":"skill","slug":"sensing-systems","display":"Sensing Systems","count":8},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ce50312d-7f5"},"title":"Optical Systems Engineer","description":"<p>Anduril Industries is a defence technology company with a mission to transform U.S. and allied military capabilities with advanced technology. We are seeking a Systems Engineer to join our Imaging team, which develops state-of-the-art imaging systems across both hardware and software, deployed to tackle the most significant security challenges of America and its allies.</p>\n<p>The successful candidate will be responsible for defining active EO/IR system architectures, performing radiometric performance modeling, and laboratory, ground, and flight testing. 
They will also model atmospheric and environmental effects for land, sea, and air targets, and leverage their analyses to inform system-level architectural trades.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Defining system architectures via detailed trade studies and analyses</li>\n<li>Adapting or deriving physics-based models of sensor system performance and deriving requirements based on those models</li>\n<li>Taking ownership over the development of prototype sensor systems in a lab environment</li>\n<li>Supporting both laboratory-based and field-based experiments and demonstrations</li>\n<li>Engaging technically with the hardware, software, and mission teams to define, integrate, and deliver high-performance systems to our customers</li>\n<li>Validating models against laboratory and real-world data</li>\n<li>Coordinating and communicating with both external &amp; internal stakeholders with varying degrees of technical background</li>\n</ul>\n<p>Required qualifications include:</p>\n<ul>\n<li>Model development and validation in one or more domains relevant to remote sensing</li>\n<li>Ability to decompose complex systems into functional parts, with clearly defined interfaces and requirements for each</li>\n<li>Experience in system design and development from initial concept through test and delivery to customer</li>\n<li>Strong laboratory skills (rapid prototyping, design of experiments) and familiarity with opto-electrical test and measurement equipment</li>\n<li>Willingness to travel to test sites, as needed</li>\n<li>Must be able to obtain and hold a U.S. 
security clearance</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Knowledge of current and next-generation advanced sensing systems and platforms</li>\n<li>Active and passive sensing domain experience (e.g., LiDAR, EO/IR, Radar)</li>\n<li>Experience with laser systems, including opto-mechanical scanning and beam steering techniques</li>\n<li>Knowledge of electro-mechanical control systems, including their performance requirements and verification methodologies</li>\n<li>Experience with LiDAR/Radar signal processing techniques, and knowledge of their implementation on real-time embedded platforms (FPGA, MCU, etc.)</li>\n<li>Experience with signature and/or scenario modeling/software packages (MODTRAN, SPIRITS, AFSIM)</li>\n</ul>\n<p>Salary: $166,000 - $220,000 USD per year, plus highly competitive equity grants and top-tier benefits for full-time employees, including comprehensive medical, dental, and vision plans, income protection, generous time off, family planning and parenting support, mental health resources, professional development, commuter benefits, relocation assistance, and a retirement savings plan.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ce50312d-7f5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5037784007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$166,000 - $220,000 USD per year","x-skills-required":["Model development and validation","System design and development","Laboratory skills","Opto-electrical test and measurement equipment","U.S. 
security clearance"],"x-skills-preferred":["Current and next-generation advanced sensing systems and platforms","Active and passive sensing domain experience","Laser systems","Electro-mechanical control systems","LiDAR/Radar signal processing techniques"],"datePosted":"2026-04-18T15:58:12.732Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Lexington, Massachusetts, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Model development and validation, System design and development, Laboratory skills, Opto-electrical test and measurement equipment, U.S. security clearance, Current and next-generation advanced sensing systems and platforms, Active and passive sensing domain experience, Laser systems, Electro-mechanical control systems, LiDAR/Radar signal processing techniques","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":166000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1044b51e-cc6"},"title":"Senior Manager, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. 
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>\n<li>Develop advanced perception algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks by integrating data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques.</li>\n<li>Develop state estimation capabilities by designing and refining algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs.</li>\n<li>Analyze and utilize sensor ICDs to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance by tuning and evaluating perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration by working closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings by leveraging synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission 
readiness.</li>\n<li>Collaborate with hardware and sensor teams to ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing by contributing novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience.</li>\n<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience.</li>\n<li>7+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D.</li>\n<li>2+ years of people leadership experience.</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>\n<li>Ability to obtain a SECRET clearance.</li>\n</ul>\n<p><strong>Preferences:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems.</li>\n<li>Experience with ML frameworks such as PyTorch or 
Tensorflow, particularly for vision-based object detection or classification tasks.</li>\n<li>Experience deploying perception software on SWaP-constrained platforms.</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments.</li>\n<li>Understanding of sensing challenges in denied or degraded conditions.</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_1044b51e-cc6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/cebc0dd3-ffbf-4013-a2ad-ae32732cabd3","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$229,233 - $343,849 a year","x-skills-required":["BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree","10+ years of related experience","7+ years of experience in Unmanned Systems programs in the DoD or applied R&D","2+ years of people leadership experience","Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models","Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches","Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications","Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs","Proficiency with version control, debugging, and test-driven development in cross-functional teams","Ability to obtain a SECRET clearance"],"x-skills-preferred":["Hands-on integration or algorithm development 
with airborne sensing systems","Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks","Experience deploying perception software on SWaP-constrained platforms","Familiarity with validating perception systems during flight test events or operational environments","Understanding of sensing challenges in denied or degraded conditions","Exposure to perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:04:16.670Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC / San Diego, California / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, 10+ years of related experience, 7+ years of experience in Unmanned Systems programs in the DoD or applied R&D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications, Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test 
events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":229233,"maxValue":343849,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3f0b0cce-7be"},"title":"Manager, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.\nThe role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>We are seeking a skilled and motivated manager to lead technical teams and support direct projects integrating perception solutions for defense platforms.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land.\nOur Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Multidisciplinary Team Leadership – Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>\n<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, 
Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>\n<li>5+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D</li>\n<li>2+ years of people leadership experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>\n<li>Ability to obtain a SECRET clearance.</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems.</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks.</li>\n<li>Experience deploying perception software on SWaP-constrained platforms.</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments.</li>\n<li>Understanding of sensing challenges in denied or degraded conditions.</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3f0b0cce-7be","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/1120529c-2f7d-4b27-a29b-50976c49c433","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$220,441 - $330,661 a year","x-skills-required":["BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience","Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience","5+ years of experience in Unmanned Systems programs in the DoD or applied R&D","2+ years of people leadership experience","Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models."],"x-skills-preferred":["Hands-on integration or algorithm development with airborne sensing systems","Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks","Experience deploying perception software on SWaP-constrained platforms","Familiarity with validating perception systems during flight test events or operational environments","Understanding of sensing challenges in denied or degraded conditions"],"datePosted":"2026-04-17T13:04:04.648Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 7 years 
of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience, 5+ years of experience in Unmanned Systems programs in the DoD or applied R&D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models., Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":220441,"maxValue":330661,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_841c78ea-841"},"title":"Senior Engineer, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.\nThe role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<ul>\n<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_841c78ea-841","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield 
AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/d6f1d906-5c1e-4640-87f3-3e31e1b45fa6","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$160,000 - $240,000 a year","x-skills-required":["BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience","Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years and a Master’s degree; or 2 years with a PhD; or equivalent work experience","Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models","Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches","Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications","Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs","Proficiency with version control, debugging, and test-driven development in cross-functional teams","Ability to obtain a SECRET clearance"],"x-skills-preferred":["Hands-on integration or algorithm development with airborne sensing systems","Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks","Experience deploying perception software on SWaP-constrained platforms","Familiarity with validating perception systems during flight test events or operational environments","Understanding of sensing challenges in denied or degraded conditions","Exposure to perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:03:46.950Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / 
Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years and a Master’s degree; or 2 years with a PhD; or equivalent work experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications, Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":160000,"maxValue":240000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5f911dd8-860"},"title":"Senior Staff Engineer, Software - Perception","description":"<p>This role is ideal for an individual who thrives on 
building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop advanced perception algorithms , Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks , Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities , Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs , Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance , Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world 
environments.</li>\n<li>Support autonomy integration: Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings: Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams: Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing: Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement: Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>\n<li>Ability to obtain a 
SECRET clearance</li>\n</ul>\n<p><strong>Preferences:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks</li>\n<li>Experience deploying perception software on SWaP-constrained platforms</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments</li>\n<li>Understanding of sensing challenges in denied or degraded conditions</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5f911dd8-860","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/5cf8609e-ce9a-47e9-8956-00dae756e406","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$220,800 - $331,200 a year","x-skills-required":["algorithm development","sensor fusion","state estimation","Kalman Filters","multi-target tracking","deep learning-based detection models","probabilistic or rule-based approaches","SLAM","visual-inertial odometry","sensor-fused localization","version control","debugging","test-driven development"],"x-skills-preferred":["hands-on integration with airborne sensing systems","ML frameworks such as PyTorch or Tensorflow","perception software deployment on SWaP-constrained platforms","validating perception systems during flight test events or operational environments","sensing challenges in denied or degraded conditions","perception applications across air, maritime, and ground 
platforms"],"datePosted":"2026-04-17T13:03:35.432Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"algorithm development, sensor fusion, state estimation, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, version control, debugging, test-driven development, hands-on integration with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, perception software deployment on SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":220800,"maxValue":331200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_bed4759c-578"},"title":"Staff Engineer, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. 
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop advanced perception algorithms: Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks: Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities: Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs: Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance: Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration: Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings: Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and 
mission readiness.</li>\n<li>Collaborate with hardware and sensor teams: Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing: Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement: Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Required Qualifications:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>\n<li>Ability to obtain a SECRET clearance</li>\n</ul>\n<p><strong>Preferred Qualifications:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks</li>\n<li>Experience deploying perception 
software on SWaP-constrained platforms</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments</li>\n<li>Understanding of sensing challenges in denied or degraded conditions</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms</li>\n</ul>\n<p>$182,720 - $274,080 a year</p>","url":"https://yubhub.co/jobs/job_bed4759c-578","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/8739c509-b6ea-4640-bcc1-c8b5b1de31b2","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$182,720 - $274,080 a year","x-skills-required":["real-time object detection","sensor fusion","state estimation algorithms","EO/IR cameras","radars","IMUs","Kalman Filters","multi-target tracking","deep learning-based detection models","probabilistic or rule-based approaches","SLAM","visual-inertial odometry","sensor-fused localization","Interface Control Documents","hardware integration specs","version control","debugging","test-driven development"],"x-skills-preferred":["hands-on integration or algorithm development with airborne sensing systems","ML frameworks such as PyTorch or Tensorflow","vision-based object detection or classification tasks","SWaP-constrained platforms","validating perception systems during flight test events or operational environments","sensing challenges in denied or degraded conditions","perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:02:45.901Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / Boston, 
MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"real-time object detection, sensor fusion, state estimation algorithms, EO/IR cameras, radars, IMUs, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, Interface Control Documents, hardware integration specs, version control, debugging, test-driven development, hands-on integration or algorithm development with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, vision-based object detection or classification tasks, SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":182720,"maxValue":274080,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_91b0ded4-573"},"title":"Student Worker Program, Embedded Software Controls","description":"<p><strong>Job Description</strong></p>\n<p>We are the movers of the world and the makers of the future. At Ford, we’re all a part of something bigger than ourselves. Are you ready to change the way the world moves?</p>\n<p>Ford’s Electric Vehicles, Digital and Design (EVDD) team is charged with delivering the company’s vision of a fully electric transportation future. EVDD is customer-obsessed, entrepreneurial, and data-driven and is dedicated to delivering industry-leading customer experience for electric vehicle buyers and owners. You’ll join an agile team of doers pioneering our EV future by working collaboratively, staying focused on only what matters, and delivering excellence day in and day out. 
Join us to make positive change by helping build a better world where every person is free to move and pursue their dreams.</p>\n<p><strong>Responsibilities</strong></p>\n<p><strong>Develop and Deliver High-Quality C Code</strong></p>\n<p>Develop and deliver high-quality C code in a real-time embedded environment for body control features such as doors, windows, seats, latches, wipers, mirrors, interior and exterior lighting, and other small motors in the vehicle.</p>\n<p><strong>Specify, Design, and Implement Functionality</strong></p>\n<p>Specify, design, and implement functionality and behaviors of embedded subsystems, from early concept through integration and high-volume manufacturing.</p>\n<p><strong>Design Software Architecture</strong></p>\n<p>Design software architecture and drive software implementation on hardware, working closely with cross-functional teams to ensure robust system integration and validation.</p>\n<p><strong>Perform Hands-on Hardware Bring-up</strong></p>\n<p>Perform hands-on hardware bring-up, system-level debugging, and code optimization on embedded targets and vehicle-level test assets.</p>\n<p><strong>Write Component-level Tests</strong></p>\n<p>Write component-level tests and contribute to test infrastructure to ensure proper functionality, robustness, and reliability of body control features.</p>\n<p><strong>Make Performance and Optimization Trade-offs</strong></p>\n<p>Make performance and optimization trade-offs to meet product requirements, including timing, resource usage, and reliability constraints.</p>\n<p><strong>Collaborate with a Small, Fast-moving Software Team</strong></p>\n<p>Collaborate with a small, fast-moving, and passionate software team, leveraging modern software development tools and practices to build highly robust and reliable embedded systems.</p>\n<p><strong>Qualifications</strong></p>\n<p><strong>Currently Pursuing a Degree in Software Engineering</strong></p>\n<p>Currently pursuing a degree in Software 
Engineering, Mechatronics, Electrical Engineering, Computer Engineering, Systems Engineering, or a related field of study, with an expected graduation date between May 2026 and May 2027.</p>\n<p><strong>Proficiency in C</strong></p>\n<p>Proficiency in C, with the ability to write clean, efficient, and maintainable code for embedded systems.</p>\n<p><strong>Solid Understanding of Software Engineering Fundamentals</strong></p>\n<p>A solid understanding of software engineering fundamentals, including software architecture, modular design, and long-term maintainability.</p>\n<p><strong>Experience with Embedded Microprocessor Tools</strong></p>\n<p>Experience with embedded microprocessor tools (e.g., IDEs, debuggers, programmers, or similar toolchains used for embedded development).</p>\n<p><strong>Ability to Collaborate Effectively</strong></p>\n<p>Ability to collaborate effectively in a team environment and communicate complex technical concepts clearly to engineers from different disciplines.</p>\n<p><strong>Benefits</strong></p>\n<p><strong>Immediate Medical, Dental, Vision, and Prescription Drug Coverage</strong></p>\n<p>Immediate medical, dental, vision, and prescription drug coverage</p>\n<p><strong>Flexible Family Care Days</strong></p>\n<p>Flexible family care days, paid parental leave, new parent ramp-up programs, subsidized back-up child care, and more</p>\n<p><strong>Family Building Benefits</strong></p>\n<p>Family building benefits including adoption and surrogacy expense reimbursement, fertility treatments, and more</p>\n<p><strong>Vehicle Discount Program</strong></p>\n<p>Vehicle discount program for employees and family members and management leases</p>\n<p><strong>Tuition Assistance</strong></p>\n<p>Tuition assistance</p>\n<p><strong>Established and Active Employee Resource Groups</strong></p>\n<p>Established and active employee resource groups</p>\n<p><strong>Paid Time Off for Individual and Team Community Service</strong></p>\n<p>Paid time off for 
individual and team community service</p>\n<p><strong>A Generous Schedule of Paid Holidays</strong></p>\n<p>A generous schedule of paid holidays, including the week between Christmas and New Year’s Day</p>\n<p><strong>Paid Time Off and the Option to Purchase Additional Vacation Time</strong></p>\n<p>Paid time off and the option to purchase additional vacation time.</p>","url":"https://yubhub.co/jobs/job_91b0ded4-573","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Ford Motor Company","sameAs":"https://efds.fa.em5.oraclecloud.com"},"x-apply-url":"https://efds.fa.em5.oraclecloud.com/hcmUI/CandidateExperience/en/sites/CX_1/job/56474","x-work-arrangement":"hybrid","x-experience-level":"entry","x-job-type":"full-time","x-salary-range":"This position is a salary grade 5.","x-skills-required":["C","Embedded Systems","Software Engineering","Embedded Microprocessor Tools","Collaboration"],"x-skills-preferred":["MISRA C","Controls Software","Algorithm Development","Real-time Embedded Systems","Sensing Systems"],"datePosted":"2026-03-09T11:01:43.690Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Palo Alto, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Automotive","skills":"C, Embedded Systems, Software Engineering, Embedded Microprocessor Tools, Collaboration, MISRA C, Controls Software, Algorithm Development, Real-time Embedded Systems, Sensing Systems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d7d03868-78f"},"title":"Firmware Engineer, Robotics","description":"<p><strong>Job Posting</strong></p>\n<p><strong>Firmware Engineer, Robotics</strong></p>\n<p><strong>Location</strong></p>\n<p>San Francisco</p>\n<p><strong>Employment Type</strong></p>\n<p>Full 
time</p>\n<p><strong>Location Type</strong></p>\n<p>On-site</p>\n<p><strong>Department</strong></p>\n<p>Research</p>\n<p><strong>Compensation</strong></p>\n<ul>\n<li>$185K – $268K • Offers Equity</li>\n</ul>\n<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>\n<ul>\n<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>\n</ul>\n<ul>\n<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>\n</ul>\n<ul>\n<li>401(k) retirement plan with employer match</li>\n</ul>\n<ul>\n<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>\n</ul>\n<ul>\n<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>\n</ul>\n<ul>\n<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>\n</ul>\n<ul>\n<li>Mental health and wellness support</li>\n</ul>\n<ul>\n<li>Employer-paid basic life and disability coverage</li>\n</ul>\n<ul>\n<li>Annual learning and development stipend to fuel your professional growth</li>\n</ul>\n<ul>\n<li>Daily meals in our offices, and meal delivery credits as eligible</li>\n</ul>\n<ul>\n<li>Relocation support for eligible employees</li>\n</ul>\n<ul>\n<li>Additional taxable fringe benefits, such as charitable donation matching and wellness 
stipends, may also be provided.</li>\n</ul>\n<p>More details about our benefits are available to candidates during the hiring process.</p>\n<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components at any time based on individual performance, team or company results, or market conditions.</p>\n<p><strong>About the Team</strong></p>\n<p>Our Robotics team is focused on unlocking general-purpose robotics and advancing toward AGI-level intelligence in dynamic, real-world environments. Working across the full model and systems stack, we integrate cutting-edge hardware and software to explore a broad range of robotic form factors. We strive to seamlessly blend high-level AI capabilities with the physical constraints of real-world systems to improve people’s lives.</p>\n<p><strong>About the Role</strong></p>\n<p>As a Firmware Engineer on the Robotics team, you will help enable the next generation of embodied AI by developing low-level firmware that drives our robotic systems. You will join at an early phase of our firmware development, working alongside electrical, mechanical, and control systems engineers to bring up new boards, integrate novel sensors, and build foundational infrastructure for the distributed system that drives our robots.</p>\n<p>This role is hands-on and bare-metal focused. You will read datasheets and reference manuals, write startup code and peripheral drivers, and debug hardware–firmware interactions during board bring-up and deployment. Your work will span everything from simple single-purpose sensing devices to more complex, safety- and reliability-critical subsystems, with an emphasis on correctness, performance, and scalability.</p>\n<p>By working closely across disciplines, you will help ensure that firmware, hardware, and system-level assumptions align, and that new designs can be brought up, tested, and iterated on quickly. 
This role offers a unique opportunity to shape the early firmware architecture for advanced robotic systems operating in real-world environments.</p>\n<p>This role is based in San Francisco, CA, and requires in-person presence 4 days a week.</p>\n<p><strong>You might thrive in this role if you:</strong></p>\n<ul>\n<li>Have experience developing firmware for microcontrollers and enjoy working close to the hardware.</li>\n</ul>\n<ul>\n<li>Are comfortable writing bare-metal firmware, or are eager to deepen your understanding of startup code, peripheral drivers, low-level system initialization, and bootloaders.</li>\n</ul>\n<ul>\n<li>Regularly read datasheets, reference manuals, and schematics to understand how new hardware works.</li>\n</ul>\n<ul>\n<li>Have participated in board bring-up, lab debugging, or early hardware validation.</li>\n</ul>\n<ul>\n<li>Are curious about how systems fail and enjoy debugging hardware-firmware interactions using real measurement tools.</li>\n</ul>\n<ul>\n<li>Are comfortable developing in a test-driven environment as well as building testbenches or simple tooling to validate hardware and system behavior.</li>\n</ul>\n<ul>\n<li>Care about writing correct, robust firmware and improving your technical judgment through hands-on experience.</li>\n</ul>\n<p><strong>Additional, preferred qualifications:</strong></p>\n<ul>\n<li>A Bachelor’s or Master’s degree in Computer Science, Computer Engineering, Electrical Engineering, or a related field.</li>\n</ul>\n<ul>\n<li>Experience with common embedded communication protocols (e.g., SPI, I²C, UART, CAN, Ethernet, BiSS).</li>\n</ul>\n<ul>\n<li>Experience writing C++, or Rust for microcontrollers, especially in resource-constrained or bare-metal environments.</li>\n</ul>\n<ul>\n<li>Familiarity with hardware debugging tools such as JTAG/SWD, logic analyzers, oscilloscopes, or similar lab equipment.</li>\n</ul>\n<ul>\n<li>Experience with robotics, sensing systems, data acquisition, or other 
hardware-centric products.</li>\n</ul>\n<ul>\n<li>Clear written and verbal communication skills, especially when collaborating with hardware and systems engineers.</li>\n</ul>\n<p><strong>About OpenAI</strong></p>\n<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full</p>","url":"https://yubhub.co/jobs/job_d7d03868-78f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"OpenAI","sameAs":"https://jobs.ashbyhq.com","logo":"https://logos.yubhub.co/openai.com.png"},"x-apply-url":"https://jobs.ashbyhq.com/openai/3f99bfef-5b1a-48ea-aed0-2dbd57b12722","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$185K – $268K • Offers Equity","x-skills-required":["Firmware development","Microcontrollers","Embedded communication protocols","C++","Rust","Hardware debugging tools","Robotics","Sensing systems","Data acquisition"],"x-skills-preferred":["SPI","I²C","UART","CAN","Ethernet","BiSS","JTAG/SWD","Logic analyzers","Oscilloscopes"],"datePosted":"2026-03-06T18:41:50.411Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Firmware development, Microcontrollers, Embedded communication protocols, C++, Rust, Hardware debugging tools, Robotics, Sensing systems, Data acquisition, SPI, I²C, UART, CAN, 
Ethernet, BiSS, JTAG/SWD, Logic analyzers, Oscilloscopes","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":185000,"maxValue":268000,"unitText":"YEAR"}}}]}