{"version":"0.1","company":{"name":"YubHub","url":"https://yubhub.co","jobsUrl":"https://yubhub.co/jobs/skill/sensing"},"x-facet":{"type":"skill","slug":"sensing","display":"Sensing","count":17},"x-feed-size-limit":100,"x-feed-sort":"enriched_at desc","x-feed-notice":"This feed contains at most 100 jobs (the most recently enriched). For the full corpus, use the paginated /stats/by-facet endpoint or /search.","x-generator":"yubhub-xml-generator","x-rights":"Free to redistribute with attribution: \"Data by YubHub (https://yubhub.co)\"","x-schema":"Each entry in `jobs` follows https://schema.org/JobPosting. YubHub-native raw fields carry `x-` prefix.","jobs":[{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_ce50312d-7f5"},"title":"Optical Systems Engineer","description":"<p>Anduril Industries is a defence technology company with a mission to transform U.S. and allied military capabilities with advanced technology. We are seeking a Systems Engineer to join our Imaging team, which develops state-of-the-art imaging systems across both hardware and software, deployed to tackle the most significant security challenges of America and its allies.</p>\n<p>The successful candidate will be responsible for defining active EO/IR system architectures, performing radiometric performance modeling, and laboratory, ground, and flight testing. 
They will also model atmospheric and environmental effects for land, sea, and air targets, and leverage their analyses to inform system-level architectural trades.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Defining system architectures via detailed trade studies and analyses</li>\n<li>Adapting or deriving physics-based models of sensor system performance and deriving requirements based on those models</li>\n<li>Taking ownership over the development of prototype sensor systems in a lab environment</li>\n<li>Supporting both laboratory-based and field-based experiments and demonstrations</li>\n<li>Engaging technically with the hardware, software, and mission teams to define, integrate, and deliver high-performance systems to our customers</li>\n<li>Validating models against laboratory and real-world data</li>\n<li>Coordinating and communicating with both external &amp; internal stakeholders with varying degrees of technical background</li>\n</ul>\n<p>Required qualifications include:</p>\n<ul>\n<li>Model development and validation in one or more domains relevant to remote sensing</li>\n<li>Ability to decompose complex systems into functional parts, with clearly defined interfaces and requirements for each</li>\n<li>Experience in system design and development from initial concept through test and delivery to customer</li>\n<li>Strong laboratory skills (rapid prototyping, design of experiments) and familiarity with opto-electrical test and measurement equipment</li>\n<li>Willingness to travel to test sites, as needed</li>\n<li>Must be able to obtain and hold a U.S. 
security clearance</li>\n</ul>\n<p>Preferred qualifications include:</p>\n<ul>\n<li>Knowledge of current and next-generation advanced sensing systems and platforms</li>\n<li>Active and passive sensing domain experience (e.g., LiDAR, EO/IR, Radar)</li>\n<li>Experience with laser systems, including opto-mechanical scanning and beam steering techniques</li>\n<li>Knowledge of electro-mechanical control systems, including their performance requirements and verification methodologies</li>\n<li>Experience with LiDAR/Radar signal processing techniques, and knowledge of their implementation on real-time embedded platforms (FPGA, MCU, etc.)</li>\n<li>Experience with signature and/or scenario modeling/software packages (MODTRAN, SPIRITS, AFSIM)</li>\n</ul>\n<p>Salary: $166,000 - $220,000 USD per year, plus highly competitive equity grants and top-tier benefits for full-time employees, including comprehensive medical, dental, and vision plans, income protection, generous time off, family planning and parenting support, mental health resources, professional development, commuter benefits, relocation assistance, and a retirement savings plan.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_ce50312d-7f5","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5037784007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$166,000 - $220,000 USD per year","x-skills-required":["Model development and validation","System design and development","Laboratory skills","Opto-electrical test and measurement equipment","U.S. 
security clearance"],"x-skills-preferred":["Current and next-generation advanced sensing systems and platforms","Active and passive sensing domain experience","Laser systems","Electro-mechanical control systems","LiDAR/Radar signal processing techniques"],"datePosted":"2026-04-18T15:58:12.732Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Lexington, Massachusetts, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Model development and validation, System design and development, Laboratory skills, Opto-electrical test and measurement equipment, U.S. security clearance, Current and next-generation advanced sensing systems and platforms, Active and passive sensing domain experience, Laser systems, Electro-mechanical control systems, LiDAR/Radar signal processing techniques","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":166000,"maxValue":220000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_69596d11-8ad"},"title":"Field Engineer","description":"<p>Anduril Industries is seeking a Field Engineer to join the Imaging team. The successful candidate will be responsible for integrating and testing EO/IR cameras at customer facilities. This will involve integrating sensors onto aircraft, performing health and status checkouts, managing safety standards and best practices around flight lines, and debugging issues as needed.</p>\n<p>The Field Engineer will also be responsible for testing and debugging sensors that are in the build and characterization phase of production. 
They will work closely with the engineering team to identify and troubleshoot system hardware and software issues.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Planning, leading, and managing the integration and testing of high-performance imaging systems from first build through field deployment.</li>\n<li>Travelling to test sites and working with internal and external stakeholders.</li>\n<li>Performing hands-on integration of the electrical, firmware, mechanical, and software components of the system and providing feedback to the engineering team.</li>\n<li>Identifying and troubleshooting system hardware and software issues.</li>\n<li>Managing prototyping lab facility and activities, including test hardware, test processes, and procedures.</li>\n</ul>\n<p>The ideal candidate will have a strong background in system design and development, with experience in laboratory skills, such as design of experiments. They will also have a degree in a relevant technical discipline and be able to work independently with minimal oversight in a fast-paced, dynamic environment.</p>\n<p>Preferred qualifications include experience with advanced sensing domain, image processing techniques, and analytical toolchains/languages, such as MATLAB, Python, or C++.</p>","url":"https://yubhub.co/jobs/job_69596d11-8ad","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.anduril.com/","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/5105245007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$146,000-$194,000 USD","x-skills-required":["System design and development","Laboratory skills","Design of experiments","Electrical engineering","Optical engineering","Mechanical 
engineering","Thermal management"],"x-skills-preferred":["Advanced sensing domain","Image processing techniques","Analytical toolchains/languages","MATLAB","Python","C++"],"datePosted":"2026-04-18T15:57:27.700Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Lexington, Massachusetts, United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"System design and development, Laboratory skills, Design of experiments, Electrical engineering, Optical engineering, Mechanical engineering, Thermal management, Advanced sensing domain, Image processing techniques, Analytical toolchains/languages, MATLAB, Python, C++","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":146000,"maxValue":194000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5962f7f0-71a"},"title":"Mission Systems Integration Engineer, Air Dominance & Strike, Active Clearance","description":"<p>We are seeking a Mission Systems Integration Engineer to join our team. As a Mission Systems Integration Engineer, you will be responsible for lab and field integration and test of payloads, subcomponents, amplifiers, and antennas. You will leverage your expertise in electronics/RF design, testing, and troubleshooting to rapidly test and integrate both internal and third-party built RF systems. You will develop test set-ups that allow for the validation of device performance using state-of-the-art RF equipment, such as spectrum and network analyzers. You will support flight test events and large-scale, customer-facing validation exercises. 
This will require taking ownership of end-to-end system test and working across various Anduril internal teams to appropriately integrate capabilities.</p>\n<p>If you are someone who enjoys building and delivering compelling next-generation autonomous capabilities and supporting actual vehicle integration, then this role is for you.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Test RF systems (power measurements, pulse response/synchronization, network analysis, environmental &amp; stress testing, etc.)</li>\n<li>Lead air vehicle integration and checkout efforts (EMI/interference testing, installed performance testing, data analysis)</li>\n<li>Validate system performance in close cooperation with design &amp; system engineers</li>\n<li>Work closely with AD&amp;S teams to successfully integrate &amp; deliver high-performance systems to our customers</li>\n<li>Communicate results in multiple forms internally and externally</li>\n</ul>\n<p>Requirements:</p>\n<ul>\n<li>Currently possesses and is able to maintain an active U.S. Secret security clearance</li>\n<li>Eligible to obtain and maintain an active U.S. Top Secret security clearance</li>\n<li>Communications, sensors, and effector domain expertise in RF based systems (e.g. 
Radar, ESM, EW, etc.)</li>\n<li>Proven, hands-on experience with RF test equipment and measurement techniques</li>\n<li>2+ years in RF system integration and validation (more experience preferred)</li>\n<li>5+ years in a relevant Science and/or Engineering field</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Hands-on experience with both analog and digital systems integration.</li>\n<li>Ability to develop SW to enable closed-loop or automated testing of systems under test</li>\n<li>Experience with simulation and modeling tools (e.g., ADS, Ansys HFSS, COMSOL, MATLAB, etc.)</li>\n<li>Advanced sensing domain experience (e.g., EO/IR, ESM, EA, EW, LIDAR)</li>\n<li>Experience with air vehicle design, integration, test, &amp; fielding</li>\n</ul>","url":"https://yubhub.co/jobs/job_5962f7f0-71a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://anduril.com","logo":"https://logos.yubhub.co/anduril.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/4998518007","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$146,000-$194,000 USD","x-skills-required":["RF system integration","RF test equipment","Measurement techniques","Communication systems","Sensor systems","Effector systems","Radar systems","ESM systems","EW systems"],"x-skills-preferred":["Analog systems integration","Digital systems integration","SW development","Simulation and modeling tools","Advanced sensing domain experience","Air vehicle design","Integration","Test","Fielding"],"datePosted":"2026-04-18T15:46:33.321Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United 
States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"RF system integration, RF test equipment, Measurement techniques, Communication systems, Sensor systems, Effector systems, Radar systems, ESM systems, EW systems, Analog systems integration, Digital systems integration, SW development, Simulation and modeling tools, Advanced sensing domain experience, Air vehicle design, Integration, Test, Fielding","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":146000,"maxValue":194000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_38b7c64e-222"},"title":"Chief Engineer, Space","description":"<p>We are seeking a Chief Engineer to join our rapidly growing space team in Costa Mesa, CA. As a technical lead, you will be responsible for driving technology developments through design, build, test and delivery for on-orbit operations across high dV propulsion systems, optical sensor suites, closed loop guidance and navigation and systems engineering in close concert with new and legacy ground software enterprises.</p>\n<p>The Chief Engineer will develop a technical strategy and roadmap in conjunction with the Head of Engineering for Space Missions and the Growth/Strategy leads of the Space team that effectively leverages current Anduril technologies and newly developed technologies to serve the customer mission areas. This role will address pre-award demonstrations, early contract phases, and full-contract execution.</p>\n<p>The Chief Engineer, working with a multi-disciplinary engineering team, will ensure the efficient development and delivery of hardware and software solutions for the customer. 
These efforts will include leading complex, integrated systems for on-orbit operations and demonstrations to include sub-contracted teammates and strategic partnerships.</p>\n<p>The Chief Engineer will need to facilitate effective technical development and integration that spans company boundaries in an agile development approach in the context of a continuous integration/continuous deployment paradigm. They will need to balance development of new capabilities with the incorporation of several pre-existing Anduril or industry capabilities, and will need to effectively communicate the development status to internal and external stakeholders.</p>\n<p>The Chief Engineer will be responsible for technical oversight of all aspects of program execution from proposals to SRR, PDR, CDR, MRR, TRR, assembly, qualification testing, and delivery of flight units for launch. The Chief Engineer, in conjunction with the Head of Engineering for Space Missions, will assist in building out the necessary infrastructure for Anduril to execute on all necessary aspects of a space-based interceptor, consisting of satellite and payload build and test, to include working with external vendors to establish test requirements and pass/fail criteria.</p>\n<p>The Chief Engineer will also work closely with the Anduril Space C2 team to enable and deliver Space Ground C2 products to both the customer and to further Anduril’s penetration into customer ecosystems.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Providing technical expertise across multiple spacecraft subsystems and ground systems</li>\n<li>Leveraging strategic partnerships and teammates for building out new products to satisfy customer requirements in new and challenging mission areas</li>\n<li>Providing space industry expertise to refine and advance the Anduril Space Based Interceptor strategy for customer engagements and demonstrations</li>\n<li>Contributing technical inputs to proposals to include engineering execution 
plans</li>\n<li>Providing oversight to a multi-disciplinary team of engineers through all phases of program execution up to and including delivery and operation of capabilities on-orbit</li>\n<li>Performing mission and system level trade studies and analysis to balance risk, cost, and performance</li>\n</ul>\n<p>Required qualifications include:</p>\n<ul>\n<li>Experience in architecting, developing, and delivering defense/IC satellite and space hardware and software systems</li>\n<li>Experience with high-dV spacecraft, dynamic space operations, and space-based sensing, and an understanding of mission operations for surveillance and reconnaissance at multiple orbits</li>\n<li>Understanding of the recent/current technical state of key USSF and IC space systems and mission areas</li>\n<li>Demonstrated ability to decompose complex systems with competing objectives (e.g., performance, schedule, openness, cybersecurity, MLS) into executable requirements and execution plans, then delivering on those to meet government expectations</li>\n<li>Experience with spacecraft standards for qualification testing and acceptance testing to include the use of GEVS, SMC-S-016 (MIL-STD-1540), etc.</li>\n<li>Experience with US government space mission areas and associated CONOPS for successfully executing missions to enable space for the warfighter</li>\n<li>Design and development of space vehicle and payload architectures and detailed mission solutions to complex space domain challenges</li>\n<li>Strong technical communication skills, with the ability to deliver critical technical briefings and conclusions to technical leadership, company leadership, and customers</li>\n<li>Strong technical fundamentals in space systems, space ground infrastructure, and product development</li>\n<li>Technical experience leading US Government contracts</li>\n<li>Eligible to obtain and maintain an active U.S. 
Secret security clearance</li>\n</ul>\n<p>Salary: $254,000 - $336,000 USD</p>","url":"https://yubhub.co/jobs/job_38b7c64e-222","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Anduril Industries","sameAs":"https://www.andurilindustries.com/","logo":"https://logos.yubhub.co/andurilindustries.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/andurilindustries/jobs/4921486007","x-work-arrangement":"onsite","x-experience-level":"executive","x-job-type":"full-time","x-salary-range":"$254,000 - $336,000 USD","x-skills-required":["Defense/IC satellite and space hardware and software systems","High-dV spacecraft and dynamic space operations","Space-based sensing","Mission operations for surveillance and reconnaissance at multiple orbits","USSF and IC space systems and mission areas","Decomposition of complex systems into executable requirements","Spacecraft qualification and acceptance testing standards (GEVS, SMC-S-016/MIL-STD-1540)","US government space mission areas and CONOPS","Space vehicle and payload architecture design","Technical communication","Space systems and ground infrastructure fundamentals","Technical leadership of US Government contracts","Eligibility for U.S. Secret security clearance"],"x-skills-preferred":[],"datePosted":"2026-04-18T15:46:32.723Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Costa Mesa, California, United 
States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Defense/IC satellite and space hardware and software systems, High-dV spacecraft and dynamic space operations, Space-based sensing, Mission operations for surveillance and reconnaissance at multiple orbits, USSF and IC space systems and mission areas, Decomposition of complex systems into executable requirements, Spacecraft qualification and acceptance testing standards (GEVS, SMC-S-016/MIL-STD-1540), US government space mission areas and CONOPS, Space vehicle and payload architecture design, Technical communication, Space systems and ground infrastructure fundamentals, Technical leadership of US Government contracts, Eligibility for U.S. Secret security clearance","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":254000,"maxValue":336000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_1044b51e-cc6"},"title":"Senior Manager, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. 
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>\n<li>Develop advanced perception algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks by integrating data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques.</li>\n<li>Develop state estimation capabilities by designing and refining algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs.</li>\n<li>Analyze and utilize sensor ICDs to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance by tuning and evaluating perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration by working closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings by leveraging synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission 
readiness.</li>\n<li>Collaborate with hardware and sensor teams to ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing by contributing novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience.</li>\n<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience.</li>\n<li>7+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D.</li>\n<li>2+ years of people leadership experience.</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>\n<li>Ability to obtain a SECRET clearance.</li>\n</ul>\n<p><strong>Preferences:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems.</li>\n<li>Experience with ML frameworks such as PyTorch or 
Tensorflow, particularly for vision-based object detection or classification tasks.</li>\n<li>Experience deploying perception software on SWaP-constrained platforms.</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments.</li>\n<li>Understanding of sensing challenges in denied or degraded conditions.</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms.</li>\n</ul>","url":"https://yubhub.co/jobs/job_1044b51e-cc6","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/cebc0dd3-ffbf-4013-a2ad-ae32732cabd3","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$229,233 - $343,849 a year","x-skills-required":["BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree","10+ years of related experience","7+ years of experience in Unmanned Systems programs in the DoD or applied R&D","2+ years of people leadership experience","Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models","Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches","Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications","Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs","Proficiency with version control, debugging, and test-driven development in cross-functional teams","Ability to obtain a SECRET clearance"],"x-skills-preferred":["Hands-on integration or algorithm development 
with airborne sensing systems","Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks","Experience deploying perception software on SWaP-constrained platforms","Familiarity with validating perception systems during flight test events or operational environments","Understanding of sensing challenges in denied or degraded conditions","Exposure to perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:04:16.670Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC / San Diego, California / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, 10+ years of related experience, 7+ years of experience in Unmanned Systems programs in the DoD or applied R&D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications, Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test 
events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":229233,"maxValue":343849,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_3f0b0cce-7be"},"title":"Manager, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.\nThe role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>We are seeking a skilled and motivated manager to lead technical teams and support direct projects integrating perception solutions for defense platforms.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land.\nOur Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Multidisciplinary Team Leadership – Lead teams across autonomy, integration, and testing by aligning technical efforts, resolving cross-functional challenges, and driving mission-focused execution.</li>\n<li>Develop advanced perception algorithms – Design and implement robust 
algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p>Required Qualifications:</p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, 
Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>\n<li>5+ years of experience in Unmanned Systems programs in the DoD or applied R&amp;D</li>\n<li>2+ years of people leadership experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models.</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches.</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications.</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs.</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams.</li>\n<li>Ability to obtain a SECRET clearance.</li>\n</ul>\n<p>Preferred Qualifications:</p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems.</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks.</li>\n<li>Experience deploying perception software on SWaP-constrained platforms.</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments.</li>\n<li>Understanding of sensing challenges in denied or degraded conditions.</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a 
href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_3f0b0cce-7be","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/1120529c-2f7d-4b27-a29b-50976c49c433","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$220,441 - $330,661 a year","x-skills-required":["BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience","Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience","5+ years of experience in Unmanned Systems programs in the DoD or applied R&D","2+ years of people leadership experience","Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models."],"x-skills-preferred":["Hands-on integration or algorithm development with airborne sensing systems","Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks","Experience deploying perception software on SWaP-constrained platforms","Familiarity with validating perception systems during flight test events or operational environments","Understanding of sensing challenges in denied or degraded conditions"],"datePosted":"2026-04-17T13:04:04.648Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Washington, DC"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 7 years 
of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience, 5+ years of experience in Unmanned Systems programs in the DoD or applied R&D, 2+ years of people leadership experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models., Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":220441,"maxValue":330661,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_841c78ea-841"},"title":"Senior Engineer, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs.\nThe role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.\nImplement sensor fusion 
frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.\nDevelop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.\nAnalyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.\nOptimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.\nSupport autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.\nValidate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.\nCollaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.\nDrive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_841c78ea-841","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield 
AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/d6f1d906-5c1e-4640-87f3-3e31e1b45fa6","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$160,000 - $240,000 a year","x-skills-required":["BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience","Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years and a Master’s degree; or 2 years with a PhD; or equivalent work experience","Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models","Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches","Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications","Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs","Proficiency with version control, debugging, and test-driven development in cross-functional teams","Ability to obtain a SECRET clearance"],"x-skills-preferred":["Hands-on integration or algorithm development with airborne sensing systems","Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks","Experience deploying perception software on SWaP-constrained platforms","Familiarity with validating perception systems during flight test events or operational environments","Understanding of sensing challenges in denied or degraded conditions","Exposure to perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:03:46.950Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / 
Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience, Typically requires a minimum of 5 years of related experience with a Bachelor’s degree; or 4 years and a Master’s degree; or 2 years with a PhD; or equivalent work experience, Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models, Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches, Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications, Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs, Proficiency with version control, debugging, and test-driven development in cross-functional teams, Ability to obtain a SECRET clearance, Hands-on integration or algorithm development with airborne sensing systems, Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks, Experience deploying perception software on SWaP-constrained platforms, Familiarity with validating perception systems during flight test events or operational environments, Understanding of sensing challenges in denied or degraded conditions, Exposure to perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":160000,"maxValue":240000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_5f911dd8-860"},"title":"Senior Staff Engineer, Software - Perception","description":"<p>This role is ideal for an individual who thrives on 
building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world 
environments.</li>\n<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and mission readiness.</li>\n<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Requirements:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 10 years of related experience with a Bachelor’s degree; or 9 years and a Master’s degree; or 7 years with a PhD; or equivalent work experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>\n<li>Ability to obtain a 
SECRET clearance</li>\n</ul>\n<p><strong>Preferences:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks</li>\n<li>Experience deploying perception software on SWaP-constrained platforms</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments</li>\n<li>Understanding of sensing challenges in denied or degraded conditions</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_5f911dd8-860","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/5cf8609e-ce9a-47e9-8956-00dae756e406","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$220,800 - $331,200 a year","x-skills-required":["algorithm development","sensor fusion","state estimation","Kalman Filters","multi-target tracking","deep learning-based detection models","probabilistic or rule-based approaches","SLAM","visual-inertial odometry","sensor-fused localization","version control","debugging","test-driven development"],"x-skills-preferred":["hands-on integration with airborne sensing systems","ML frameworks such as PyTorch or Tensorflow","perception software deployment on SWaP-constrained platforms","validating perception systems during flight test events or operational environments","sensing challenges in denied or degraded conditions","perception applications across air, maritime, and ground 
platforms"],"datePosted":"2026-04-17T13:03:35.432Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / Boston, MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"algorithm development, sensor fusion, state estimation, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, version control, debugging, test-driven development, hands-on integration with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, perception software deployment on SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":220800,"maxValue":331200,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_bed4759c-578"},"title":"Staff Engineer, Software - Perception","description":"<p>This position is ideal for an individual who thrives on building advanced perception systems that enable autonomous aircraft to operate effectively in complex and contested environments.</p>\n<p>A successful candidate will be skilled in developing real-time object detection, sensor fusion, and state estimation algorithms using data from diverse mission sensors such as EO/IR cameras, radars, and IMUs. 
The role requires strong algorithmic thinking, deep familiarity with airborne sensing systems, and the ability to deliver performant software in simulation and real-world conditions.</p>\n<p>Shield AI is committed to developing cutting-edge autonomy for unmanned aircraft operating across all Department of Defense (DoD) domains, including air, sea, and land. Our Perception Engineers are instrumental in creating the situational awareness that underpins autonomy, ensuring our systems understand and respond to the operational environment with speed, precision, and resilience.</p>\n<p><strong>Responsibilities:</strong></p>\n<ul>\n<li>Develop advanced perception algorithms – Design and implement robust algorithms for object detection, classification, and multi-target tracking across diverse sensor modalities.</li>\n<li>Implement sensor fusion frameworks – Integrate data from vision systems, radars, and other mission sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness.</li>\n<li>Develop state estimation capabilities – Design and refine algorithms for localization and pose estimation using IMU, GPS, vision, and other onboard sensing inputs to enable stable and accurate navigation.</li>\n<li>Analyze and utilize sensor ICDs – Interpret interface control documents (ICDs) and technical specifications for aircraft-mounted sensors to ensure correct data handling, interpretation, and synchronization.</li>\n<li>Optimize perception performance – Tune and evaluate perception pipelines for performance, robustness, and real-time efficiency in both simulation and real-world environments.</li>\n<li>Support autonomy integration – Work closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules.</li>\n<li>Validate in simulated and operational settings – Leverage synthetic data, simulation environments, and field testing to validate algorithm accuracy and 
mission readiness.</li>\n<li>Collaborate with hardware and sensor teams – Ensure seamless integration of perception algorithms with onboard compute platforms and diverse sensor payloads.</li>\n<li>Drive innovation in airborne sensing – Contribute novel ideas and state-of-the-art techniques to advance real-time perception capabilities for unmanned aircraft operating in complex, GPS-denied, or contested environments.</li>\n<li>Travel Requirement – Members of this team typically travel around 10-15% of the year (to different office locations, customer sites, and flight integration events).</li>\n</ul>\n<p><strong>Required Qualifications:</strong></p>\n<ul>\n<li>BS/MS in Computer Science, Electrical Engineering, Mechanical Engineering, Aerospace Engineering, and/or similar degree, or equivalent practical experience</li>\n<li>Typically requires a minimum of 7 years of related experience with a Bachelor’s degree; or 5 years and a Master’s degree; or 4 years with a PhD; or equivalent work experience</li>\n<li>Background in implementing algorithms such as Kalman Filters, multi-target tracking, or deep learning-based detection models</li>\n<li>Familiarity with fusing data from radar, EO/IR cameras, or other sensors using probabilistic or rule-based approaches</li>\n<li>Familiarity with SLAM, visual-inertial odometry, or sensor-fused localization approaches in real-time applications</li>\n<li>Ability to interpret and work with Interface Control Documents (ICDs) and hardware integration specs</li>\n<li>Proficiency with version control, debugging, and test-driven development in cross-functional teams</li>\n<li>Ability to obtain a SECRET clearance</li>\n</ul>\n<p><strong>Preferred Qualifications:</strong></p>\n<ul>\n<li>Hands-on integration or algorithm development with airborne sensing systems</li>\n<li>Experience with ML frameworks such as PyTorch or Tensorflow, particularly for vision-based object detection or classification tasks</li>\n<li>Experience deploying perception 
software on SWaP-constrained platforms</li>\n<li>Familiarity with validating perception systems during flight test events or operational environments</li>\n<li>Understanding of sensing challenges in denied or degraded conditions</li>\n<li>Exposure to perception applications across air, maritime, and ground platforms</li>\n</ul>\n<p>$182,720 - $274,080 a year</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_bed4759c-578","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/8739c509-b6ea-4640-bcc1-c8b5b1de31b2","x-work-arrangement":"onsite","x-experience-level":"staff","x-job-type":"full-time","x-salary-range":"$182,720 - $274,080 a year","x-skills-required":["real-time object detection","sensor fusion","state estimation algorithms","EO/IR cameras","radars","IMUs","Kalman Filters","multi-target tracking","deep learning-based detection models","probabilistic or rule-based approaches","SLAM","visual-inertial odometry","sensor-fused localization","Interface Control Documents","hardware integration specs","version control","debugging","test-driven development"],"x-skills-preferred":["hands-on integration or algorithm development with airborne sensing systems","ML frameworks such as PyTorch or Tensorflow","vision-based object detection or classification tasks","SWaP-constrained platforms","validating perception systems during flight test events or operational environments","sensing challenges in denied or degraded conditions","perception applications across air, maritime, and ground platforms"],"datePosted":"2026-04-17T13:02:45.901Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Diego, California / Washington, DC / Boston, 
MA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"real-time object detection, sensor fusion, state estimation algorithms, EO/IR cameras, radars, IMUs, Kalman Filters, multi-target tracking, deep learning-based detection models, probabilistic or rule-based approaches, SLAM, visual-inertial odometry, sensor-fused localization, Interface Control Documents, hardware integration specs, version control, debugging, test-driven development, hands-on integration or algorithm development with airborne sensing systems, ML frameworks such as PyTorch or Tensorflow, vision-based object detection or classification tasks, SWaP-constrained platforms, validating perception systems during flight test events or operational environments, sensing challenges in denied or degraded conditions, perception applications across air, maritime, and ground platforms","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":182720,"maxValue":274080,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_44b0e889-15a"},"title":"Director of Business Development, Middle East","description":"<p>We are hiring a Director of Business Development to lead our Middle East growth strategy, building out a team responsible for significant annual orders targets. 
The ideal candidate will have deep defense, national security, or aerospace experience in the region, established government relationships, and a track record of closing complex defense programs.</p>\n<p>Key responsibilities include:</p>\n<ul>\n<li>Owning and executing Middle East go-to-market strategy for Shield AI&#39;s product portfolio, including X-BAT, V-BAT, ViDAR, and Hivemind.</li>\n<li>Maintaining direct responsibility for annual order targets, multi-year pipeline development, and order growth.</li>\n<li>Leading end-to-end capture and sales efforts from early engagement through contract award and follow-on growth.</li>\n<li>Engaging directly with senior military, government, and defense industry stakeholders across the region.</li>\n<li>Differentiating Shield AI in competitive and regulated defense procurements.</li>\n<li>Providing actionable market and customer intelligence to influence product roadmap and broader international strategy.</li>\n</ul>\n<p>Requirements include:</p>\n<ul>\n<li>10+ years business development experience across the Middle East region within defense, aerospace, or national security.</li>\n<li>Specific prior defense or government-facing experience in the Middle East with a deep understanding of regional procurement processes, offset requirements, export sensitivities, local defense primes, and indigenization requirements.</li>\n<li>Fluency in English (written and spoken). 
Fluency in Arabic strongly preferred.</li>\n<li>Demonstrated success closing complex defense or government programs - ideally across both hardware (aircraft/UAS) and software (autonomy/AI).</li>\n<li>Technical experience with autonomy, UAVs, ISR, and advanced sensing technologies.</li>\n<li>Strong negotiation, relationship-building, and stakeholder management skills.</li>\n<li>Ability to manage multiple concurrent pursuits, across multiple markets, involving technical products, in ambiguous environments.</li>\n<li>Proven cross-functional leadership across operations, product, and BD teams.</li>\n<li>Exceptional written, verbal, and presentation skills.</li>\n<li>Willingness and ability to travel frequently in region and internationally (~25%).</li>\n<li>Willingness and ability to work across multiple time zones (e.g. US HQ, Asia-based colleagues, Middle East partners)</li>\n<li>Based in or willing to relocate to the Middle East (Abu Dhabi, United Arab Emirates).</li>\n<li>Mastery of critical business tools (Microsoft Suite – PPT, Excel, Word, LLMs, etc.)</li>\n<li>Experience forecasting, tracking, and managing deal cycles using Salesforce.</li>\n</ul>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_44b0e889-15a","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Shield AI","sameAs":"https://www.shield.ai","logo":"https://logos.yubhub.co/shield.ai.png"},"x-apply-url":"https://jobs.lever.co/shieldai/1797f337-f646-4fd1-8f31-a8163213eaec","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Business development","Defense","National security","Aerospace","Government relations","Complex program management","Autonomy","UAVs","ISR","Advanced sensing technologies","Negotiation","Relationship-building","Stakeholder management","Cross-functional 
leadership","Microsoft Suite","Salesforce"],"x-skills-preferred":["Arabic language fluency","US citizenship","Existing US or allied security clearance","Prior military service or operational leadership experience in the Middle East","UAV operations","Mission autonomy","AI-enabled C2 systems"],"datePosted":"2026-04-17T13:01:56.685Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Abu Dhabi"}},"employmentType":"FULL_TIME","occupationalCategory":"Sales","industry":"Technology","skills":"Business development, Defense, National security, Aerospace, Government relations, Complex program management, Autonomy, UAVs, ISR, Advanced sensing technologies, Negotiation, Relationship-building, Stakeholder management, Cross-functional leadership, Microsoft Suite, Salesforce, Arabic language fluency, US citizenship, Existing US or allied security clearance, Prior military service or operational leadership experience in the Middle East, UAV operations, Mission autonomy, AI-enabled C2 systems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_cab4499b-7c8"},"title":"Senior Software Engineer, Scientific Computing","description":"<p>At KoBold we believe that a modern scientific computing stack will enable systematic mineral exploration and materially improve our rate of mineral discovery. This role is a key ingredient to this strategy. As a member of our scientific computing team, you will apply software engineering and machine learning to remote-sensing, drillhole, imaging, geophysics and other mineral exploration data in order to build scalable ML systems to help make high-speed, high-quality decisions for our mineral exploration projects. 
Collaborating with our exceptional team of data scientists and geologists, you will tackle complex scientific problems head-on and collectively pave the way for discoveries of vital energy transition metals like lithium, copper, nickel, and cobalt. Together we can shape the future of mineral exploration and contribute to building a sustainable world.</p>\n<p>Responsibilities:</p>\n<ul>\n<li>Architect, implement, and maintain foundational scientific computing libraries that will be used in KoBold’s mineral exploration analyses.</li>\n<li>Build tooling to increase the velocity of our machine learning progress, including enabling rapid prototyping in Jupyter notebooks; building experimentation, evaluation, and simulation frameworks; turning successful R&amp;D into robust, scalable ML pipelines; and organizing models and their outputs for repeatability and discoverability.</li>\n<li>In collaboration with data scientists, build models to make statistically valid predictions about the locations of economic concentrations of ore metals within the Earth’s crust.</li>\n<li>Apply, and coach team members to use, engineering best practices such as writing robust, testable, and composable code</li>\n<li>Collaborate with data scientists, geoscientists and engineers to invent the modern scientific computing stack for mineral exploration</li>\n<li>Occasional travel to exploration sites around the world to observe the impact of scientific computing on KoBold’s exploration products and design new technologies to further discovery. 
Travel is approximately twice per year depending on project needs.</li>\n</ul>\n<p>Qualifications:</p>\n<ul>\n<li>At least 5 years of experience as a software engineer, data scientist or ML engineer, though most great candidates will have closer to 10.</li>\n<li>Track record of building production-quality data processing solutions or tooling that have delivered business value</li>\n<li>Proficiency with foundational concepts of ML, including statistical, traditional and deep-learning approaches</li>\n<li>Proficiency in Python, ideally including array-based packages such as xarray and numpy</li>\n<li>Deep experience with measured scientific data</li>\n<li>Experience in visualizing scientific data for domain experts</li>\n<li>Experience with MLOps and with building robust ML systems</li>\n<li>Drive to increase the velocity and effectiveness of our data scientists in both experimental and production workflows</li>\n<li>Capacity to dive deep on novel challenging problems in applying ML to mineral exploration, including understanding a complex domain of geology and mineral exploration practices as well as working with limited, disparate and noisy data sources</li>\n<li>Collaborative attitude to work with stakeholders with different backgrounds (data scientists, geoscientists, software engineers, operations)</li>\n</ul>\n<p>Work practices and motivation:</p>\n<ul>\n<li>Ability to take ownership of and responsibility for large projects.</li>\n<li>Intellectual curiosity and eagerness to learn about all aspects of mineral exploration, particularly in the geology domain. Open to working directly with geologists in the field. Enjoys constantly learning such that you are driving insights and innovations.</li>\n<li>Ability to explain technical problems to and collaborate on solutions with domain experts who aren’t software developers. 
A strong communicator who enjoys working with colleagues across the company.</li>\n<li>Excitement about joining a fast-growing early-stage company, comfort with a dynamic work environment, and eagerness to take on a range of responsibilities.</li>\n<li>Keen not just to build cool technology, but to figure out what technical product to build to best achieve the business objectives of the company.</li>\n<li>Ability to independently prioritize multiple tasks effectively.</li>\n</ul>","url":"https://yubhub.co/jobs/job_cab4499b-7c8","directApply":true,"hiringOrganization":{"@type":"Organization","name":"KoBold Metals","sameAs":"https://www.koboldmetals.com/","logo":"https://logos.yubhub.co/koboldmetals.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/koboldmetals/jobs/4624038005","x-work-arrangement":"remote","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$170,000 - $215,000","x-skills-required":["Python","Machine Learning","Scientific Computing","Data Science","Geophysics","Remote Sensing","Drillhole Imaging","Jupyter Notebooks","MLops","Robust ML Systems"],"x-skills-preferred":[],"datePosted":"2026-04-17T12:40:56.506Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Remote"}},"jobLocationType":"TELECOMMUTE","employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Machine Learning, Scientific Computing, Data Science, Geophysics, Remote Sensing, Drillhole Imaging, Jupyter Notebooks, MLops, Robust ML Systems","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":170000,"maxValue":215000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d98d3d4a-4c2"},"title":"EW 
Architect","description":"<p>We&#39;re looking for an EW Architect to help us define our products in the broader context of the Pentagon&#39;s warfighting and procurement enterprise. This individual will be responsible for interfacing with force capability planners, wargaming and simulation experts, and requirements working groups to shape the Pentagon&#39;s thinking and later roadmaps and budget programming requests.</p>\n<p>The ideal candidate will have 10 years of experience working with complex EW or RF sensing programs on the acquisition side, as well as experience working with force capability planning, modeling &amp; simulation, and Pentagon budgeting. They should also possess a US Secret Security Clearance, with Top Secret preferred.</p>\n<p>Additional responsibilities include working with program execution offices (PEOs) and Defense RFPs, understanding modeling &amp; simulation and wargaming simulations, and having a firm understanding of modern and legacy EW systems and how they are employed.</p>\n<p>We offer a competitive salary, stock options, and benefits, including health, vision, and dental. 
We also provide 401(k) enrollment at 90 days, generous PTO, and professional growth and development opportunities.</p>","url":"https://yubhub.co/jobs/job_d98d3d4a-4c2","directApply":true,"hiringOrganization":{"@type":"Organization","name":"CX2","sameAs":"https://cx2.com/","logo":"https://logos.yubhub.co/cx2.com.png"},"x-apply-url":"https://jobs.lever.co/cx2/e51dbdf7-4d9c-4683-bfbc-328ba4a7d133","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["EW","RF sensing","force capability planning","modeling & simulation","Pentagon budgeting","program execution offices (PEOs)","Defense RFPs","modern and legacy EW systems"],"x-skills-preferred":["autonomous systems","drone development","OODA loop understanding"],"datePosted":"2026-04-17T12:28:01.523Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"United States"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"EW, RF sensing, force capability planning, modeling & simulation, Pentagon budgeting, program execution offices (PEOs), Defense RFPs, modern and legacy EW systems, autonomous systems, drone development, OODA loop understanding"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d13ea291-b17"},"title":"Research Scientist, AnthroKrishi","description":"<p>As a Research Scientist on the AnthroKrishi team, you will develop next-generation AI to address global challenges in food security and climate change. 
You will lead research that pushes the boundaries of computer vision and machine learning, with a direct path to impacting global agricultural systems.</p>\n<p>Key responsibilities include pioneering novel computer vision models to create a unified understanding of agriculture from diverse satellite data sources, solving core AI problems by developing generalizable models that are robust across varied agricultural systems, leading research toward the grand challenge of field-level crop yield forecasting, designing and executing large-scale experiments, writing high-quality, reusable code, and contributing to a production-ready system.</p>\n<p>You will also mentor junior researchers, collaborate with cross-functional teams across Google, and publish your work at top-tier conferences.</p>\n<p>We are looking for a passionate and talented researcher with a strong foundation and a proven ability to conduct impactful research in AI. You should have a PhD or equivalent practical research experience in Computer Science, AI, or a related field with a focus on computer vision and/or machine learning, a strong publication record in top-tier AI conferences, hands-on experience building and training deep learning models in frameworks such as JAX, TensorFlow, or PyTorch, and demonstrated expertise in one or more of the following: generative models, segmentation algorithms, multi-modal fusion, spatio-temporal analysis.</p>","url":"https://yubhub.co/jobs/job_d13ea291-b17","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Google DeepMind","sameAs":"https://www.deepmind.com/","logo":"https://logos.yubhub.co/deepmind.com.png"},"x-apply-url":"https://job-boards.greenhouse.io/deepmind/jobs/7142337","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":null,"x-skills-required":["Computer 
Vision","Machine Learning","Deep Learning","JAX","TensorFlow","PyTorch"],"x-skills-preferred":["Generative Models","Segmentation Algorithms","Multi-Modal Fusion","Spatio-Temporal Analysis","Remote Sensing","Geospatial Data"],"datePosted":"2026-03-16T14:42:45.592Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Bangalore, India"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Computer Vision, Machine Learning, Deep Learning, JAX, TensorFlow, PyTorch, Generative Models, Segmentation Algorithms, Multi-Modal Fusion, Spatio-Temporal Analysis, Remote Sensing, Geospatial Data"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_91b0ded4-573"},"title":"Student Worker Program, Embedded Software Controls","description":"<p><strong>Job Description</strong></p>\n<p>We are the movers of the world and the makers of the future. At Ford, we’re all a part of something bigger than ourselves. Are you ready to change the way the world moves?</p>\n<p>Ford’s Electric Vehicles, Digital and Design (EVDD) team is charged with delivering the company’s vision of a fully electric transportation future. EVDD is customer-obsessed, entrepreneurial, and data-driven and is dedicated to delivering industry-leading customer experience for electric vehicle buyers and owners. You’ll join an agile team of doers pioneering our EV future by working collaboratively, staying focused on only what matters, and delivering excellence day in and day out. 
Join us to make positive change by helping build a better world where every person is free to move and pursue their dreams.</p>\n<p><strong>Responsibilities</strong></p>\n<p><strong>Develop and Deliver High-Quality C Code</strong></p>\n<p>Develop and deliver high-quality C code in a real-time embedded environment for body control features such as doors, windows, seats, latches, wipers, mirrors, interior and exterior lighting, and other small motors in the vehicle.</p>\n<p><strong>Specify, Design, and Implement Functionality</strong></p>\n<p>Specify, design, and implement functionality and behaviors of embedded subsystems, from early concept through integration and high-volume manufacturing.</p>\n<p><strong>Design Software Architecture</strong></p>\n<p>Design software architecture and drive software implementation on hardware, working closely with cross-functional teams to ensure robust system integration and validation.</p>\n<p><strong>Perform Hands-on Hardware Bring-up</strong></p>\n<p>Perform hands-on hardware bring-up, system-level debugging, and code optimization on embedded targets and vehicle-level test assets.</p>\n<p><strong>Write Component-level Tests</strong></p>\n<p>Write component-level tests and contribute to test infrastructure to ensure proper functionality, robustness, and reliability of body control features.</p>\n<p><strong>Make Performance and Optimization Trade-offs</strong></p>\n<p>Make performance and optimization trade-offs to meet product requirements, including timing, resource usage, and reliability constraints.</p>\n<p><strong>Collaborate with a Small, Fast-moving Software Team</strong></p>\n<p>Collaborate with a small, fast-moving, and passionate software team, leveraging modern software development tools and practices to build highly robust and reliable embedded systems.</p>\n<p><strong>Qualifications</strong></p>\n<p><strong>Currently Pursuing a Degree in Software Engineering</strong></p>\n<p>Currently pursuing a degree in Software 
Engineering, Mechatronics, Electrical Engineering, Computer Engineering, Systems Engineering, or a related field of study, with an expected graduation date between May 2026 and May 2027.</p>\n<p><strong>Proficiency in C</strong></p>\n<p>Proficiency in C, with the ability to write clean, efficient, and maintainable code for embedded systems.</p>\n<p><strong>Solid Understanding of Software Engineering Fundamentals</strong></p>\n<p>A solid understanding of software engineering fundamentals, including software architecture, modular design, and long-term maintainability.</p>\n<p><strong>Experience with Embedded Microprocessor Tools</strong></p>\n<p>Experience with embedded microprocessor tools (e.g., IDEs, debuggers, programmers, or similar toolchains used for embedded development).</p>\n<p><strong>Ability to Collaborate Effectively</strong></p>\n<p>Ability to collaborate effectively in a team environment and communicate complex technical concepts clearly to engineers from different disciplines.</p>\n<p><strong>Benefits</strong></p>\n<p><strong>Immediate Medical, Dental, Vision, and Prescription Drug Coverage</strong></p>\n<p>Immediate medical, dental, vision, and prescription drug coverage</p>\n<p><strong>Flexible Family Care Days</strong></p>\n<p>Flexible family care days, paid parental leave, new parent ramp-up programs, subsidized back-up child care, and more</p>\n<p><strong>Family Building Benefits</strong></p>\n<p>Family building benefits including adoption and surrogacy expense reimbursement, fertility treatments, and more</p>\n<p><strong>Vehicle Discount Program</strong></p>\n<p>Vehicle discount program for employees and family members and management leases</p>\n<p><strong>Tuition Assistance</strong></p>\n<p>Tuition assistance</p>\n<p><strong>Established and Active Employee Resource Groups</strong></p>\n<p>Established and active employee resource groups</p>\n<p><strong>Paid Time Off for Individual and Team Community Service</strong></p>\n<p>Paid time off for 
individual and team community service</p>\n<p><strong>A Generous Schedule of Paid Holidays</strong></p>\n<p>A generous schedule of paid holidays, including the week between Christmas and New Year’s Day</p>\n<p><strong>Paid Time Off and the Option to Purchase Additional Vacation Time</strong></p>\n<p>Paid time off and the option to purchase additional vacation time.</p>","url":"https://yubhub.co/jobs/job_91b0ded4-573","directApply":true,"hiringOrganization":{"@type":"Organization","name":"Ford Motor Company","sameAs":"https://efds.fa.em5.oraclecloud.com"},"x-apply-url":"https://efds.fa.em5.oraclecloud.com/hcmUI/CandidateExperience/en/sites/CX_1/job/56474","x-work-arrangement":"hybrid","x-experience-level":"entry","x-job-type":"full-time","x-salary-range":"This position is a salary grade 5.","x-skills-required":["C","Embedded Systems","Software Engineering","Embedded Microprocessor Tools","Collaboration"],"x-skills-preferred":["MISRA C","Controls Software","Algorithm Development","Real-time Embedded Systems","Sensing Systems"],"datePosted":"2026-03-09T11:01:43.690Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"Palo Alto, CA"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Automotive","skills":"C, Embedded Systems, Software Engineering, Embedded Microprocessor Tools, Collaboration, MISRA C, Controls Software, Algorithm Development, Real-time Embedded Systems, Sensing Systems"},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_d7d03868-78f"},"title":"Firmware Engineer, Robotics","description":"<p><strong>Job Posting</strong></p>\n<p><strong>Firmware Engineer, Robotics</strong></p>\n<p><strong>Location</strong></p>\n<p>San Francisco</p>\n<p><strong>Employment Type</strong></p>\n<p>Full 
time</p>\n<p><strong>Location Type</strong></p>\n<p>On-site</p>\n<p><strong>Department</strong></p>\n<p>Research</p>\n<p><strong>Compensation</strong></p>\n<ul>\n<li>$185K – $268K • Offers Equity</li>\n</ul>\n<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>\n<ul>\n<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>\n</ul>\n<ul>\n<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>\n</ul>\n<ul>\n<li>401(k) retirement plan with employer match</li>\n</ul>\n<ul>\n<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>\n</ul>\n<ul>\n<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>\n</ul>\n<ul>\n<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>\n</ul>\n<ul>\n<li>Mental health and wellness support</li>\n</ul>\n<ul>\n<li>Employer-paid basic life and disability coverage</li>\n</ul>\n<ul>\n<li>Annual learning and development stipend to fuel your professional growth</li>\n</ul>\n<ul>\n<li>Daily meals in our offices, and meal delivery credits as eligible</li>\n</ul>\n<ul>\n<li>Relocation support for eligible employees</li>\n</ul>\n<ul>\n<li>Additional taxable fringe benefits, such as charitable donation matching and wellness 
stipends, may also be provided.</li>\n</ul>\n<p>More details about our benefits are available to candidates during the hiring process.</p>\n<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components at any time based on individual performance, team or company results, or market conditions.</p>\n<p><strong>About the Team</strong></p>\n<p>Our Robotics team is focused on unlocking general-purpose robotics and advancing toward AGI-level intelligence in dynamic, real-world environments. Working across the full model and systems stack, we integrate cutting-edge hardware and software to explore a broad range of robotic form factors. We strive to seamlessly blend high-level AI capabilities with the physical constraints of real-world systems to improve people’s lives.</p>\n<p><strong>About the Role</strong></p>\n<p>As a Firmware Engineer on the Robotics team, you will help enable the next generation of embodied AI by developing low-level firmware that drives our robotic systems. You will join at an early phase of our firmware development, working alongside electrical, mechanical, and control systems engineers to bring up new boards, integrate novel sensors, and build foundational infrastructure for the distributed system that drives our robots.</p>\n<p>This role is hands-on and bare-metal focused. You will read datasheets and reference manuals, write startup code and peripheral drivers, and debug hardware–firmware interactions during board bring-up and deployment. Your work will span everything from simple single-purpose sensing devices to more complex, safety- and reliability-critical subsystems, with an emphasis on correctness, performance, and scalability.</p>\n<p>By working closely across disciplines, you will help ensure that firmware, hardware, and system-level assumptions align, and that new designs can be brought up, tested, and iterated on quickly. 
This role offers a unique opportunity to shape the early firmware architecture for advanced robotic systems operating in real-world environments.</p>\n<p>This role is based in San Francisco, CA, and requires in-person presence 4 days a week.</p>\n<p><strong>You might thrive in this role if you:</strong></p>\n<ul>\n<li>Have experience developing firmware for microcontrollers and enjoy working close to the hardware.</li>\n</ul>\n<ul>\n<li>Are comfortable writing bare-metal firmware, or are eager to deepen your understanding of startup code, peripheral drivers, low-level system initialization, and bootloaders.</li>\n</ul>\n<ul>\n<li>Regularly read datasheets, reference manuals, and schematics to understand how new hardware works.</li>\n</ul>\n<ul>\n<li>Have participated in board bring-up, lab debugging, or early hardware validation.</li>\n</ul>\n<ul>\n<li>Are curious about how systems fail and enjoy debugging hardware-firmware interactions using real measurement tools.</li>\n</ul>\n<ul>\n<li>Are comfortable developing in a test-driven environment as well as building testbenches or simple tooling to validate hardware and system behavior.</li>\n</ul>\n<ul>\n<li>Care about writing correct, robust firmware and improving your technical judgment through hands-on experience.</li>\n</ul>\n<p><strong>Additional, preferred qualifications:</strong></p>\n<ul>\n<li>A Bachelor’s or Master’s degree in Computer Science, Computer Engineering, Electrical Engineering, or a related field.</li>\n</ul>\n<ul>\n<li>Experience with common embedded communication protocols (e.g., SPI, I²C, UART, CAN, Ethernet, BiSS).</li>\n</ul>\n<ul>\n<li>Experience writing C++, or Rust for microcontrollers, especially in resource-constrained or bare-metal environments.</li>\n</ul>\n<ul>\n<li>Familiarity with hardware debugging tools such as JTAG/SWD, logic analyzers, oscilloscopes, or similar lab equipment.</li>\n</ul>\n<ul>\n<li>Experience with robotics, sensing systems, data acquisition, or other 
hardware-centric products.</li>\n</ul>\n<ul>\n<li>Clear written and verbal communication skills, especially when collaborating with hardware and systems engineers.</li>\n</ul>\n<p><strong>About OpenAI</strong></p>\n<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full spectrum of humanity.</p>","url":"https://yubhub.co/jobs/job_d7d03868-78f","directApply":true,"hiringOrganization":{"@type":"Organization","name":"OpenAI","sameAs":"https://jobs.ashbyhq.com","logo":"https://logos.yubhub.co/openai.com.png"},"x-apply-url":"https://jobs.ashbyhq.com/openai/3f99bfef-5b1a-48ea-aed0-2dbd57b12722","x-work-arrangement":"onsite","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$185K – $268K • Offers Equity","x-skills-required":["Firmware development","Microcontrollers","Embedded communication protocols","C++","Rust","Hardware debugging tools","Robotics","Sensing systems","Data acquisition"],"x-skills-preferred":["SPI","I²C","UART","CAN","Ethernet","BiSS","JTAG/SWD","Logic analyzers","Oscilloscopes"],"datePosted":"2026-03-06T18:41:50.411Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Firmware development, Microcontrollers, Embedded communication protocols, C++, Rust, Hardware debugging tools, Robotics, Sensing systems, Data acquisition, SPI, I²C, UART, CAN, 
Ethernet, BiSS, JTAG/SWD, Logic analyzers, Oscilloscopes","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":185000,"maxValue":268000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_9eb55da5-7fd"},"title":"Power Architect","description":"<p><strong>Power Architect</strong></p>\n<p><strong>Location</strong></p>\n<p>San Francisco</p>\n<p><strong>Employment Type</strong></p>\n<p>Full time</p>\n<p><strong>Department</strong></p>\n<p>Scaling</p>\n<p><strong>Compensation</strong></p>\n<ul>\n<li>$266K – $445K • Offers Equity</li>\n</ul>\n<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>\n<ul>\n<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>\n</ul>\n<ul>\n<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>\n</ul>\n<ul>\n<li>401(k) retirement plan with employer match</li>\n</ul>\n<ul>\n<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>\n</ul>\n<ul>\n<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>\n</ul>\n<ul>\n<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local 
law)</li>\n</ul>\n<ul>\n<li>Mental health and wellness support</li>\n</ul>\n<ul>\n<li>Employer-paid basic life and disability coverage</li>\n</ul>\n<ul>\n<li>Annual learning and development stipend to fuel your professional growth</li>\n</ul>\n<ul>\n<li>Daily meals in our offices, and meal delivery credits as eligible</li>\n</ul>\n<ul>\n<li>Relocation support for eligible employees</li>\n</ul>\n<ul>\n<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>\n</ul>\n<p>More details about our benefits are available to candidates during the hiring process.</p>\n<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components at any time based on individual performance, team or company results, or market conditions.</p>\n<p><strong>About the Team</strong></p>\n<p>OpenAI’s Hardware organization develops silicon and system-level solutions designed for the unique demands of advanced AI workloads. The team is responsible for building the next generation of AI-native silicon while working closely with software and research partners to co-design hardware tightly integrated with AI models. In addition to delivering production-grade silicon for OpenAI’s supercomputing infrastructure, the team also creates custom design tools and methodologies that accelerate innovation and enable hardware optimized specifically for AI.</p>\n<p><strong>About the Role</strong></p>\n<p>We are seeking a highly skilled cross-stack power architect with deep expertise in making ML systems energy efficient. 
This hands-on individual contributor will sit within our silicon implementation team and work closely with architecture, kernels, chip design, silicon implementation, platform design, and the broader industry ecosystem to architect, implement, and deploy performance-per-watt optimized next-generation AI accelerator chips and systems.</p>\n<p><strong>In this role, you will:</strong></p>\n<ul>\n<li>Oversee power architecture, implementation, and execution in silicon from concept to high-volume deployment, and propose high-ROI features to maximize performance under the power envelope</li>\n</ul>\n<ul>\n<li>Build chip and system-level power models grounded in empirical data and experience to guide organization-wide energy efficiency strategy. This requires a detailed understanding of ML workloads, ML chip and system architecture, silicon design, implementation, and characterization</li>\n</ul>\n<ul>\n<li>Collaborate with chip and platform architecture/design teams to explore and implement power management features, including the specification and implementation of digital/mixed-signal IP, sensing and telemetry, firmware/system software, and silicon characterization methodology (in partnership with engineering teams)</li>\n</ul>\n<ul>\n<li>Partner with silicon design and implementation teams to optimize performance under the power envelope. 
This includes (but is not limited to) clocking and power domain architecture, voltage/frequency selection, microarchitecture and physical-design driven power reduction, post-silicon voltage margin optimization, and workload-informed power optimization</li>\n</ul>\n<ul>\n<li>Work with ecosystem partners (EDA, ASIC, IP, component vendors) to drive innovations that can improve energy efficiency</li>\n</ul>\n<p><strong>Qualifications:</strong></p>\n<ul>\n<li>Relevant degree and strong industry experience focused on end-to-end energy-efficient ML silicon co-design</li>\n</ul>\n<ul>\n<li>Hands-on experience with power architecture, power estimation, power management, and power optimization is required.</li>\n</ul>\n<ul>\n<li>Fundamental understanding of ML chip and platform architecture, performance modeling, and workload power/performance characteristics is strongly preferred.</li>\n</ul>\n<ul>\n<li>Hands-on experience with power bring-up and power validation is strongly preferred.</li>\n</ul>\n<p><strong>About OpenAI</strong></p>\n<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. 
AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full spectrum of humanity.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_9eb55da5-7fd","directApply":true,"hiringOrganization":{"@type":"Organization","name":"OpenAI","sameAs":"https://jobs.ashbyhq.com","logo":"https://logos.yubhub.co/openai.com.png"},"x-apply-url":"https://jobs.ashbyhq.com/openai/12ae9cf8-54e8-40fb-aba4-1f737ce68052","x-work-arrangement":"onsite","x-experience-level":"senior","x-job-type":"full-time","x-salary-range":"$266K – $445K • Offers Equity","x-skills-required":["Power architecture","Power estimation","Power management","Power optimization","ML chip and system architecture","Silicon design","Implementation","Characterization"],"x-skills-preferred":["Digital/mixed-signal IP","Sensing and telemetry","Firmware/system software","Silicon characterization methodology","Clocking and power domain architecture","Voltage/frequency selection","Microarchitecture and physical-design driven power reduction","Post-silicon voltage margin optimization","Workload-informed power optimization"],"datePosted":"2026-03-06T18:38:42.250Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Power architecture, Power estimation, Power management, Power optimization, ML chip and system architecture, Silicon design, Implementation, Characterization, Digital/mixed-signal IP, Sensing and telemetry, Firmware/system software, Silicon characterization methodology, Clocking and power domain architecture, Voltage/frequency selection, Microarchitecture and physical-design driven power 
reduction, Post-silicon voltage margin optimization, Workload-informed power optimization","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":266000,"maxValue":445000,"unitText":"YEAR"}}},{"@context":"https://schema.org","@type":"JobPosting","identifier":{"@type":"PropertyValue","name":"YubHub","value":"job_440a65d7-eed"},"title":"Software Engineer - Sensing, Consumer Products","description":"<p><strong>Software Engineer - Sensing, Consumer Products</strong></p>\n<p><strong>Location</strong></p>\n<p>San Francisco</p>\n<p><strong>Employment Type</strong></p>\n<p>Full time</p>\n<p><strong>Department</strong></p>\n<p>Consumer Products</p>\n<p><strong>Compensation</strong></p>\n<ul>\n<li>$325K • Offers Equity</li>\n</ul>\n<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. 
In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>\n<p><strong>Benefits</strong></p>\n<ul>\n<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>\n</ul>\n<ul>\n<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>\n</ul>\n<ul>\n<li>401(k) retirement plan with employer match</li>\n</ul>\n<ul>\n<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>\n</ul>\n<ul>\n<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>\n</ul>\n<ul>\n<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>\n</ul>\n<ul>\n<li>Mental health and wellness support</li>\n</ul>\n<ul>\n<li>Employer-paid basic life and disability coverage</li>\n</ul>\n<ul>\n<li>Annual learning and development stipend to fuel your professional growth</li>\n</ul>\n<ul>\n<li>Daily meals in our offices, and meal delivery credits as eligible</li>\n</ul>\n<ul>\n<li>Relocation support for eligible employees</li>\n</ul>\n<ul>\n<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>\n</ul>\n<p><strong>About the Team</strong></p>\n<p>Consumer Products Research prototypes the future of computing: we explore new modalities, interaction patterns, and system behaviors, then do the engineering required to make those ideas real in rigorous prototypes. The Neosensing team sits at the intersection of sensing, edge algorithms, and systems engineering. 
We build the end-to-end software that turns new signals into dependable capabilities—collection tooling and protocols, algorithm integration and evaluation hooks, and on-device loops that stay stable under real-world variability. We care deeply about software quality and iteration speed: clean interfaces, debuggability, observability, and performance under tight device constraints.</p>\n<p><strong>About the Role</strong></p>\n<p>As a Software Engineer on Consumer Products Research, you’ll sit at the boundary between algorithm development and shippable systems. You’ll work closely with algorithm engineers to translate prototypes into clean interfaces, reliable pipelines, and efficient on-device implementations—with strong attention to performance, observability, and real-world failure modes.</p>\n<p>This is a software role first: we’re looking for someone who loves writing great code every day, takes pride in engineering craft, and is comfortable going deep enough into the algorithmic details to make the system work end-to-end.</p>\n<p><strong>This role is based in San Francisco, CA. 
We use a hybrid work model of four days in the office per week and offer relocation assistance to new employees.</strong></p>\n<p><strong>In this role, you will:</strong></p>\n<ul>\n<li>Build and ship production software for sensing algorithms, translating algorithm prototypes into reliable end-to-end systems.</li>\n</ul>\n<ul>\n<li>Implement and own key parts of the Python shipping pipeline (integration surfaces, evaluation hooks, and quality/performance guardrails).</li>\n</ul>\n<ul>\n<li>Develop embedded/on-device software in an RTOS environment (e.g., Zephyr) and deploy models to device runtimes and hardware accelerators.</li>\n</ul>\n<ul>\n<li>Optimize real-time on-device perception loops (e.g., detection/tracking-style pipelines) for stability, latency, power, and memory constraints.</li>\n</ul>\n<ul>\n<li>Create data collection + instrumentation tooling to bring up new sensing modalities and accelerate iteration from prototype → dataset → model → device.</li>\n</ul>\n<ul>\n<li>Partner cross-functionally (algorithms, human data, firmware/hardware) to debug, profile, and harden systems against real-world variability.</li>\n</ul>\n<p><strong>You might thrive in this role if you:</strong></p>\n<ul>\n<li>Love writing great software and want your work to sit close to novel sensing and edge algorithms.</li>\n</ul>\n<ul>\n<li>Understand algorithm behavior well enough to integrate, debug, and evaluate it—even if you’re not the primary model inventor.</li>\n</ul>\n<ul>\n<li>Have shipped production Python systems and care about clean interfaces, tests, and long-term maintainability.</li>\n</ul>\n<ul>\n<li>Enjoy embedded/on-device work and can debug across hardware, firmware, and higher-level application layers.</li>\n</ul>\n<ul>\n<li>Care about performance engineering and know how to profile and optimize under tight device constraints.</li>\n</ul>\n<ul>\n<li>Take ownership end-to-end and thrive in ambiguous, fast-moving, zero-to-one 
environments.</li>\n</ul>\n<p><strong>Bonus:</strong></p>\n<ul>\n<li>Zephyr (or similar RTOS) experience.</li>\n</ul>\n<ul>\n<li>On-device ML deployment (NPU/GPU/DSP) and accelerator-aware profiling/optimization.</li>\n</ul>\n<ul>\n<li>Background in multimodal sensing, sensor fusion, or on-device perception.</li>\n</ul>\n<ul>\n<li>Experience building data collection systems and human-in-the-loop workflows (protocols, QA, metadata)</li>\n</ul>\n<p><strong>About OpenAI</strong></p>\n<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences of our users and the broader community.</p>\n<p style=\"margin-top:24px;font-size:13px;color:#666;\">XML job scraping automation by <a href=\"https://yubhub.co\">YubHub</a></p>","url":"https://yubhub.co/jobs/job_440a65d7-eed","directApply":true,"hiringOrganization":{"@type":"Organization","name":"OpenAI","sameAs":"https://jobs.ashbyhq.com","logo":"https://logos.yubhub.co/openai.com.png"},"x-apply-url":"https://jobs.ashbyhq.com/openai/f6dfb6c0-44af-4512-af8c-967b8bb12867","x-work-arrangement":"hybrid","x-experience-level":"mid","x-job-type":"full-time","x-salary-range":"$325K • Offers Equity","x-skills-required":["Python","Zephyr","RTOS","Embedded/on-device software development","Data collection and instrumentation tooling","Algorithm integration and evaluation","Clean interfaces and long-term maintainability","Performance engineering and profiling/optimization"],"x-skills-preferred":["Zephyr (or similar RTOS) experience","On-device ML deployment (NPU/GPU/DSP) and accelerator-aware profiling/optimization","Background in 
multimodal sensing, sensor fusion, or on-device perception","Experience building data collection systems and human-in-the-loop workflows (protocols, QA, metadata)"],"datePosted":"2026-03-06T18:23:18.008Z","jobLocation":{"@type":"Place","address":{"@type":"PostalAddress","addressLocality":"San Francisco"}},"employmentType":"FULL_TIME","occupationalCategory":"Engineering","industry":"Technology","skills":"Python, Zephyr, RTOS, Embedded/on-device software development, Data collection and instrumentation tooling, Algorithm integration and evaluation, Clean interfaces and long-term maintainability, Performance engineering and profiling/optimization, Zephyr (or similar RTOS) experience, On-device ML deployment (NPU/GPU/DSP) and accelerator-aware profiling/optimization, Background in multimodal sensing, sensor fusion, or on-device perception, Experience building data collection systems and human-in-the-loop workflows (protocols, QA, metadata)","baseSalary":{"@type":"MonetaryAmount","currency":"USD","value":{"@type":"QuantitativeValue","minValue":325000,"maxValue":325000,"unitText":"YEAR"}}}]}