<?xml version="1.0" encoding="UTF-8"?>
<source>
  <jobs>
    <job>
      <externalid>be1832d1-b0d</externalid>
      <Title>Werkstudent (m/w/d) Master Data Governance</Title>
      <Description><![CDATA[<p>As a Werkstudent (working student) in our Master Data Governance team, you will support the operational management of master data for business partners in the MDG system. Your tasks will include participating in regular audits and documenting data quality checks, supporting the operational maintenance of the associated central mailboxes, and performing various master data changes. You will also contribute to the visualisation and optimisation of existing process landscapes and participate in corresponding development projects. Additionally, you will be involved in internal and external communication during various internal control activities in the area of master data management.</p>
<p>To be successful in this role, you should have a strong analytical mindset, excellent communication skills, and the ability to work effectively in a team. You should also be proficient in MS Office, particularly MS Excel, and have some experience with SAP.</p>
<p>As a student at Porsche, you will have the opportunity to gain valuable work experience, develop your skills, and build a network of contacts in the industry. Our company offers a dynamic and inclusive work environment, with a focus on innovation, teamwork, and personal growth.</p>
<p>If you are interested in this exciting opportunity, please submit your application, including your resume, cover letter, and relevant documents. We look forward to hearing from you!</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>part-time</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Master Data Governance, MS Office, MS Excel, SAP, Analytical skills, Communication skills</Skills>
      <Category>Engineering</Category>
      <Industry>Automotive</Industry>
      <Employername>Dr. Ing. h.c. F. Porsche AG</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.porsche.com.png</Employerlogo>
      <Employerdescription>Porsche is a renowned automobile manufacturer with a global presence and a loyal customer base.</Employerdescription>
      <Employerwebsite>https://jobs.porsche.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.porsche.com/index.php?ac=jobad&amp;id=20394</Applyto>
      <Location>Zuffenhausen</Location>
      <Country></Country>
      <Postedate>2026-04-22</Postedate>
    </job>
    <job>
      <externalid>db261609-388</externalid>
      <Title>Principal Data &amp; Ontology Architect - AI Enablement</Title>
      <Description><![CDATA[<p>We are looking for a Principal Data &amp; Ontology Architect to drive the implementation and adoption of data and ontology enablement practices and standards within Control Tower Operations, enabling scalable, governed, and business-aligned AI initiatives.</p>
<p>The successful candidate will serve as the primary bridge between Business Units, Global IT, and Control Tower Operations, ensuring shared understanding of data practices, workflows, and requirements. They will apply established standards for semantic modeling, domain alignment, concept reuse, and ontology lifecycle management.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Supporting the implementation and ongoing maintenance of ontology enablement practices and operating model strategy to support AI, analytics, and digital initiatives across multiple Business Units</li>
<li>Applying established standards for semantic modeling, domain alignment, concept reuse, and ontology lifecycle management</li>
<li>Serving as the enterprise subject-matter authority for ontology-related topics, providing recommendations and guidance to governance and leadership forums</li>
<li>Collaborating with Global IT and enterprise data architecture to ensure ontology practices align with enterprise data platforms and Control Tower operational processes</li>
<li>Partnering with Business Units to understand domain concepts, terminology, operational data, and AI use cases, translating them into ontology-aligned data structures</li>
<li>Guiding Business Units in contributing domain models, metadata, and data assets into the enterprise ontology using defined governance and intake processes</li>
<li>Enabling repeatable onboarding of Business Unit data into AI initiatives, reducing reliance on ad-hoc IT engagement and minimizing duplicated effort</li>
<li>Serving as a liaison between Business Units and Global IT for AI data and ontology-related matters</li>
<li>Engaging with Global IT teams to understand enterprise data platforms, workflows, standards, and operational constraints</li>
<li>Translating Global IT practices, requirements, and workflows into clear, actionable guidance for Business Unit data stewards</li>
<li>Educating, guiding, and supporting Business Unit data stewards on their roles in data governance, ontology contribution, and AI data enablement</li>
<li>Supporting the development and documentation of workflows, expectations, and operating models for how BU data stewards engage with the Control Tower and Global IT</li>
<li>Ensuring Business Unit Data Stewards understand how to prepare, govern, and submit data assets for ontology integration and AI use</li>
<li>Promoting consistent adoption of governance, quality, and semantic standards across Business Units</li>
<li>Supporting integration of data and ontology enablement into Control Tower workflows</li>
<li>Providing operational insight into data readiness, semantic risks, and governance gaps to inform Control Tower decision-making</li>
<li>Identifying systemic issues and contributing recommendations to drive continuous improvement of data enablement processes</li>
<li>Ensuring semantic integrity, data quality, lineage, and consistency are maintained as data assets flow into AI solutions</li>
<li>Identifying systemic issues and recommending continuous improvement opportunities to Control Tower Operations leadership</li>
<li>Influencing corrective actions, tooling investments, or governance updates to mitigate long-term risk</li>
</ul>
<p>This role requires a minimum of 10 years of relevant work experience in data architecture, data governance, ontology development, semantic modeling, or related disciplines, supporting cross-functional initiatives spanning multiple business units and IT organizations.</p>
<p>The ideal candidate will have in-depth expertise in ontology design, semantic modeling, and domain-driven data architecture, as well as experience contributing to the development and implementation of data and ontology strategies. They will also have demonstrated experience serving as a bridge between business stakeholders and IT organizations, with a strong ability to translate technical platforms, workflows, and constraints into business-understandable guidance.</p>
<p>A Bachelor&#39;s level degree or diploma in Computer Science, Data Science/Engineering, Applied Mathematics/Statistics, Electronics/Electrical, Information Technology/Information Sciences, or a related field of study is required. A Master&#39;s or Ph.D. degree is preferred.</p>
<p>The successful candidate will be comfortable operating in ambiguous, evolving environments with enterprise-level impact, and will have a systems-thinking mindset with understanding of AI, analytics, and enterprise data platforms.</p>
<p>Highly desirable skills include proficiency in OWL (Web Ontology Language) and RDF/RDFS graph-based data models, experience storing data in graph databases such as Neo4j or Amazon Neptune, and querying RDF-based ontologies using SPARQL.</p>
<p>This is an onsite job based at our ADC, Raymond, OH office. One telecommuting workday per week may be possible with prior departmental approval.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$120,400.00 - $150,500.00</Salaryrange>
      <Skills>ontology design, semantic modeling, domain-driven data architecture, data architecture, data governance, AI data enablement, data quality, data lineage, ontology development, OWL (Web Ontology Language), RDF/RDFS, graph databases, Neo4j, Amazon Neptune, SPARQL</Skills>
      <Category>Engineering</Category>
      <Industry>Automotive</Industry>
      <Employername>Honda</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.honda.com.png</Employerlogo>
      <Employerdescription>Honda is a multinational Japanese conglomerate that produces automobiles, motorcycles, and power equipment. It is one of the largest automobile manufacturers in the world.</Employerdescription>
      <Employerwebsite>https://careers.honda.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.honda.com/us/en/job/10812/Principal-Data-Ontology-Architect-AI-Enablement</Applyto>
      <Location>Raymond</Location>
      <Country></Country>
      <Postedate>2026-04-22</Postedate>
    </job>
    <job>
      <externalid>78270c8d-016</externalid>
      <Title>Operations Data Governance &amp; Controls Specialist</Title>
      <Description><![CDATA[<p>As an Operations Control Specialist – Data Governance &amp; Controls, you will design, implement, and support technical data governance solutions with a focus on the firm&#39;s Trader Master and related reference data domains.</p>
<p>This role requires a strong technical background in Data Management, Data Architecture, Data Lineage, Data Quality, Master Data Management (MDM), and automation within Financial Services and/or Technology.</p>
<p>You will contribute to and help lead the technical design of data governance controls, data models, and integration patterns, partnering closely with Technology and Operations teams.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Build/enhance data governance frameworks, controls, standards, and workflows (policies, definitions, entitlements).</li>
<li>Create data quality rules and monitoring; automate exception detection, alerting, remediation, SLAs, and RCA.</li>
<li>Develop Python/SQL/ETL-ELT automation for checks, controls, and reporting; deliver Tableau/Power BI dashboards and KPIs.</li>
<li>Contribute to conceptual/logical/physical data modeling for Trader Master and core domains.</li>
<li>Support MDM capabilities: golden record, matching/merging, survivorship, stewardship workflows; help shape MDM strategy.</li>
<li>Implement access/entitlement governance (RBAC, row/column security) across DB/warehouse/BI with audit compliance.</li>
<li>Maintain catalog, glossary, lineage, schema history, impact analysis; manage structured change workflows.</li>
<li>Define integration patterns (batch/API/streaming) and build reconciliations/validations across systems.</li>
<li>Manage historical/temporal data (validation, backfills, remediation) supporting regulatory/reporting/analytics.</li>
<li>Produce technical documentation (designs, runbooks, data dictionaries), share knowledge, and mentor juniors.</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>Bachelor’s degree in Computer Science, Engineering, Information Systems, Mathematics, Finance, or related field; advanced degree (MS, MBA, or equivalent) is a plus.</li>
<li>5–8 years of experience in financial services or fintech with hands-on work in data engineering, data management, or data architecture roles; exposure to trading strategies, fund structures, and financial products strongly preferred.</li>
</ul>
<p>Technical Expertise (Required):</p>
<ul>
<li>Strong Python and SQL; experience with data warehousing + ETL/ELT.</li>
<li>Familiarity with MDM/data governance tools (e.g., Collibra, Informatica, Alation) and Tableau/Power BI.</li>
<li>Proven ability to lead delivery, solve complex data issues, and communicate with technical/non-technical stakeholders.</li>
<li>Preferred certs: DAMA/CDMP, cloud (AWS/Azure/GCP), Scrum, BI/data engineering.</li>
</ul>
<p>Millennium pays a total compensation package which includes a base salary, discretionary performance bonus, and a comprehensive benefits package.</p>
<p>The estimated base salary range for this position is $70,000 to $160,000, which is specific to New York and may change in the future.</p>
<p>When finalizing an offer, we take into consideration an individual’s experience level and the qualifications they bring to the role to formulate a competitive total compensation package.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$70,000 to $160,000</Salaryrange>
      <Skills>Python, SQL, ETL/ELT, Data Warehousing, Tableau/Power BI, MDM/data governance tools, Collibra, Informatica, Alation, DAMA/CDMP, cloud (AWS/Azure/GCP), Scrum, BI/data engineering</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Millennium</Employername>
      <Employerlogo>https://logos.yubhub.co/mlp.eightfold.ai.png</Employerlogo>
      <Employerdescription>Millennium is a global alternative investment management firm; this role sits within its Ops &amp; MO Control group, which provides data governance and control services.</Employerdescription>
      <Employerwebsite>https://mlp.eightfold.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://mlp.eightfold.ai/careers/job/755954926796</Applyto>
      <Location>New York, New York, United States of America</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>528bf454-d13</externalid>
      <Title>Data Analytics Engineer</Title>
      <Description><![CDATA[<p>We are seeking a Senior Analytics Engineer to join our team. As a key member of our data organization, you will be responsible for transforming raw data into a strategic asset by designing high-performance data models that power our financial reporting, product forecasting, and GTM strategy.</p>
<p>Your 12-Month Journey</p>
<p>During the first 3 months, you will learn about our existing stack (GCP, BigQuery, Airbyte, dbt), core business data models, and understand the current pain points in our data flow. You will deliver and optimize your first high-priority models for product usage and financial reporting. You will partner with the Data Engineer to align on the new infrastructure roadmap.</p>
<p>Within 6 months, you will implement a robust semantic layer to standardize KPIs across the company and enable AI-readiness and advanced natural language querying.</p>
<p>After 1 year, you will fully own the company&#39;s data modeling architecture, ensuring it is prepared for AI and machine learning applications. You will act as a strategic advisor to department heads, using data to help shape the company&#39;s long-term growth and forecasting strategies.</p>
<p>What You&#39;ll Be Doing</p>
<p>Strategic Data Product Ownership: Manage the end-to-end lifecycle of our internal data products. You will partner with stakeholders to translate complex business questions into technical requirements, selecting the right tools to ensure our reporting is scalable, accessible, and high-impact.</p>
<p>Advanced Analytics Engineering: Design, build, and maintain our core data models using dbt Labs. You will own the logic for mission-critical datasets, including financial reporting, churn forecasting, and reverse-ETL flows that sync warehouse data back into our business tools (e.g., Planhat, HubSpot).</p>
<p>Data Governance &amp; Semantic Layering: Act as the guardian of &#39;The Truth.&#39; You will implement data governance standards and build our semantic layer to ensure metrics are consistent across the company.</p>
<p>Data Democratization &amp; Enablement: In collaboration with RevOps, you will design and deliver training programs and documentation. Your goal is to empower users across Finance, Product, and GTM to independently navigate data products and derive their own insights.</p>
<p>Collaboration: You will be the central hub of our data organization. You will work daily with the Data Engineer to align on the roadmap, while frequently consulting with Finance, GTM, and Product leaders to ensure our data products solve their most pressing problems.</p>
<p>What You Bring</p>
<p>Solid experience in Analytics Engineering, Data Analysis, or Data Engineering, with a track record of independently delivering data products that enable reporting, decision-making, and CDP use cases.</p>
<p>You are an expert in SQL and understand how to write performant, modular code. Familiarity with Python and Git for optimizing and versioning data transformations is a significant advantage.</p>
<p>Deep, hands-on experience with dbt and BigQuery is a must. You should also be comfortable navigating ELT tools like Airbyte or Fivetran.</p>
<p>Commercially savvy: you understand the business. You can spot opportunities where data can improve ARR, reduce churn, or optimize spend.</p>
<p>You thrive in fast-paced environments and are comfortable creating structure out of the uncertainty of a scaling company.</p>
<p>Strong project management and stakeholder management skills. You are a &#39;bilingual&#39; communicator who can discuss warehouse schemas with an engineer and ARR growth with a CFO.</p>
<p>Fluency in English, both written and spoken, at a minimum C1 level</p>
<p>What We Offer</p>
<p>Flexibility to work from home in the Netherlands and from our beautiful canal-side office in Amsterdam</p>
<p>A chance to be part of and shape one of the most ambitious scale-ups in Europe</p>
<p>Work in a diverse and multicultural team</p>
<p>€1,500 annual training budget plus internal training</p>
<p>Pension plan, travel reimbursement, and wellness perks</p>
<p>28 paid holiday days + 2 additional days to relax in 2026</p>
<p>Work from anywhere for 4 weeks/year</p>
<p>An inclusive and international work environment with a whole lot of fun thrown in!</p>
<p>Apple MacBook and tools</p>
<p>€200 Home Office budget</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>EUR 70000–90000 / year</Salaryrange>
      <Skills>SQL, dbt, BigQuery, Airbyte, Python, Git, ELT tools, Data governance, Semantic layering, Data democratization, Enablement</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Tellent</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.tellent.com.png</Employerlogo>
      <Employerdescription>Tellent is a Talent Management Suite designed to empower HR &amp; People teams across the entire employee journey, with 250+ team members globally, 7,000+ customers in 100+ countries.</Employerdescription>
      <Employerwebsite>https://careers.tellent.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.tellent.com/o/data-analytics-engineer</Applyto>
      <Location>Amsterdam</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>86babbe8-c9d</externalid>
      <Title>Data Governance Lead, Bayer Digital Farming Solutions</Title>
      <Description><![CDATA[<p>At Bayer, we&#39;re seeking a Data Governance Lead to join our Digital Farming Solutions team. As a Data Governance Lead, you will be responsible for coordinating and maturing the data governance framework across Digital Farming Solutions and Bayer Crop Science.</p>
<p>The successful candidate will have a strong background in privacy, data protection, security, risk management, compliance, or related fields, with a minimum of 12 years of experience. You will be responsible for leading the DFS Data Governance Program in alignment with Bayer and BCS global policies, overseeing the integrity, classification, usability, and security of governed data assets, and holding decision rights for FieldView data governance.</p>
<p>In addition to your technical expertise, you will possess strong communication skills across organizational levels, foundational project/program management experience, and the ability to influence senior stakeholders. You will also be responsible for defining and tracking KPIs and governance metrics to evaluate and improve program maturity.</p>
<p>If you&#39;re a motivated and experienced professional looking to join a dynamic team, please submit your application.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$150k-215k</Salaryrange>
      <Skills>data governance, privacy, data protection, security, risk management, compliance, project management, communication, influence, KPIs, governance metrics</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Bayer</Employername>
      <Employerlogo>https://logos.yubhub.co/talent.bayer.com.png</Employerlogo>
      <Employerdescription>Bayer is a multinational pharmaceutical and life sciences company that develops and manufactures a wide range of products, including crop protection, seeds, and pharmaceuticals.</Employerdescription>
      <Employerwebsite>https://talent.bayer.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://talent.bayer.com/careers/job/562949976568695</Applyto>
      <Location>Creve Coeur</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3fa0b80f-842</externalid>
      <Title>Staff Software Engineer, Public Sector</Title>
      <Description><![CDATA[<p>Job Title: Staff Software Engineer, Public Sector</p>
<p>We are seeking a highly skilled Staff Software Engineer to join our Public Sector team. As a Staff Software Engineer, you will be responsible for designing and implementing software solutions for the public sector. You will work closely with cross-functional teams to develop and deploy software applications that meet the needs of government agencies.</p>
<p>Responsibilities:</p>
<ul>
<li>Design and implement software solutions for the public sector</li>
<li>Work closely with cross-functional teams to develop and deploy software applications</li>
<li>Collaborate with stakeholders to understand their needs and develop software solutions that meet those needs</li>
<li>Develop and maintain software documentation</li>
<li>Participate in code reviews and ensure that code meets quality standards</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Bachelor&#39;s degree in Computer Science or related field</li>
<li>5+ years of experience in software development</li>
<li>Proficiency in programming languages such as Java, Python, or C++</li>
<li>Experience with Agile development methodologies</li>
<li>Strong understanding of software design patterns and principles</li>
<li>Excellent communication and collaboration skills</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Master&#39;s degree in Computer Science or related field</li>
<li>10+ years of experience in software development</li>
<li>Experience with cloud-based technologies such as AWS or Azure</li>
<li>Experience with DevOps practices</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunities for professional growth and development</li>
<li>Collaborative and dynamic work environment</li>
</ul>
<p>Salary Range: $252,000-$362,000 USD</p>
<p>Required Skills:</p>
<ul>
<li>Full Stack Development</li>
<li>Cloud-Native Technologies</li>
<li>Data Engineering</li>
<li>AI Application Integration</li>
<li>Problem Solving</li>
<li>Collaboration and Communication</li>
<li>Adaptability and Learning Agility</li>
</ul>
<p>Preferred Skills:</p>
<ul>
<li>Experience with modern web development frameworks</li>
<li>Familiarity with cloud platforms</li>
<li>Understanding of containerization and container orchestration</li>
<li>Knowledge of ETL processes</li>
<li>Understanding of data modeling, data warehousing, and data governance principles</li>
<li>Familiarity with integrating Large Language Models</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$252,000-$362,000 USD</Salaryrange>
      <Skills>Full Stack Development, Cloud-Native Technologies, Data Engineering, AI Application Integration, Problem Solving, Collaboration and Communication, Adaptability and Learning Agility, Experience with modern web development frameworks, Familiarity with cloud platforms, Understanding of containerization and container orchestration, Knowledge of ETL processes, Understanding of data modeling, data warehousing, and data governance principles, Familiarity with integrating Large Language Models</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale develops reliable AI systems for the world&apos;s most important decisions.</Employerdescription>
      <Employerwebsite>https://www.scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4674913005</Applyto>
      <Location>San Francisco, CA; St. Louis, MO; New York, NY; Washington, DC</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>7f1a5b85-116</externalid>
      <Title>Mission Software Engineer, Public Sector</Title>
      <Description><![CDATA[<p>We are seeking a highly skilled and motivated Mission Software Engineer to join our dynamic Federal Engineering team. As a part of this team, you will play a critical role in supporting Scale&#39;s government customers by scoping and developing onsite solutions.</p>
<p>Our scalable, high-performance platform is the foundation for these customer solutions, and your expertise will be instrumental in designing and implementing systems that can handle interactions with existing customer systems to help our products integrate into existing customer workflows.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Work directly with customers to understand their problems and translate those into features in Scale&#39;s platform.</li>
<li>Be open to &gt;50% travel or relocation to a key customer geographic location.</li>
<li>Collaborate with cross-functional teams to define and execute the vision for backend solutions, ensuring they meet the unique needs of government agencies operating in secure environments.</li>
<li>Implement end-to-end data integrations, syncing customer&#39;s data to Scale&#39;s platform and back.</li>
<li>Deploy and maintain Scale software at customer sites.</li>
<li>Develop customer-requested features and work closely with customers to ensure those features delight them.</li>
<li>Build robust and reliable backend systems that can serve as standalone products, empowering customers to accelerate their own AI ambitions.</li>
<li>Participate actively in customer engagements, working closely with stakeholders to understand requirements and deliver innovative solutions.</li>
</ul>
<p>Ideal Candidate:</p>
<ul>
<li>A track record of success as a hybrid customer-facing and forward-deployed software engineer, with the ability to quickly adapt to different roles.</li>
<li>Prior experience developing with Python and JavaScript, or other modern software languages. Familiarity with Node and React is a plus.</li>
<li>Cloud-Native Technologies: Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and experience in developing and deploying applications in a cloud-native environment. Understanding of containerization (e.g., Docker) and container orchestration (e.g., Kubernetes) is a plus.</li>
<li>Linux experience: Understanding of shell scripting, operating systems, etc.</li>
<li>Networking experience: Understanding of networking technologies and configuration (ports, protocols, etc.)</li>
<li>Data Engineering: Knowledge of ETL (Extract, Transform, Load) processes and experience in building data pipelines to integrate and process diverse data sources. Understanding of data modeling, data warehousing, and data governance principles.</li>
<li>Problem Solving: Strong analytical and problem-solving skills to understand complex challenges and devise effective solutions. Ability to think critically, identify root causes, and propose innovative approaches to overcome technical obstacles.</li>
</ul>
<p>Compensation packages at Scale for eligible roles include base salary, equity, and benefits. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries for the position, determined by work location and additional factors, including job-related skills, experience, interview performance, and relevant education or training. Scale employees in eligible roles are also granted equity based compensation, subject to Board of Director approval.</p>
<p>Benefits:</p>
<ul>
<li>Comprehensive health, dental, and vision coverage</li>
<li>Retirement benefits</li>
<li>A learning and development stipend</li>
<li>Generous PTO</li>
<li>Commuter stipend</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$138,000-$292,560 USD</Salaryrange>
      <Skills>Python, JavaScript, Node, React, Cloud-Native Technologies, Linux, Networking, Data Engineering, ETL, Data Modeling, Data Warehousing, Data Governance</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale develops reliable AI systems for the world&apos;s most important decisions, providing high-quality data and full-stack technologies to power leading models.</Employerdescription>
      <Employerwebsite>https://scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4481921005</Applyto>
      <Location>Boston, MA; Honolulu, HI; San Diego, CA; San Francisco, CA; St. Louis, MO; New York, NY; Washington, DC</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>bfddfcc3-e38</externalid>
      <Title>Senior Software Engineer, Public Sector</Title>
      <Description><![CDATA[<p>As a Senior Software Engineer, you will lead the development of a vertical feature or a horizontal capability, from defining requirements with stakeholders through implementation and stakeholder acceptance.</p>
<p>You will:</p>
<ul>
<li>Lead the design and implementation of scalable backend systems and distributed architectures for Federal customers</li>
<li>Manage the full lifecycle of feature development from requirement definition to deployment on classified networks</li>
<li>Direct the orchestration of asynchronous agent fleets to meet mission requirements</li>
<li>Lead customer engagements to translate mission needs into technical requirements</li>
<li>Own the communication with stakeholders to ensure implementation meets defined acceptance criteria</li>
<li>Conduct technical reviews and identify risks within machine learning infrastructure and model serving</li>
<li>Drive the platform roadmap by providing technical specifications for Federal product offerings</li>
</ul>
<p>Ideally you will have:</p>
<ul>
<li>Full Stack Development: Proficiency in front-end, back-end development, and infrastructure, including experience with modern web development frameworks, programming languages, and databases</li>
<li>Cloud-Native Technologies: Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and experience in developing and deploying applications in a cloud-native environment. Understanding of containerization (e.g., Docker) and container orchestration (e.g., Kubernetes) is a plus</li>
<li>Data Engineering: Knowledge of ETL (Extract, Transform, Load) processes and experience in building data pipelines to integrate and process diverse data sources. Understanding of data modeling, data warehousing, and data governance principles</li>
<li>AI Application Integration: Familiarity with integrating Large Language Models (LLMs) and building agentic workflows. Understanding of prompt engineering, retrieval-augmented generation (RAG), and agent orchestration is beneficial</li>
<li>Problem Solving: Strong analytical and problem-solving skills to understand complex challenges and devise effective solutions. Ability to think critically, identify root causes, and propose innovative approaches to overcome technical obstacles</li>
<li>Collaboration and Communication: Excellent interpersonal and communication skills to effectively collaborate with cross-functional teams, stakeholders, and customers. Ability to clearly articulate technical concepts to non-technical audiences and foster a collaborative work environment</li>
<li>Adaptability and Learning Agility: Willingness to embrace new technologies, learn new skills, and adapt to defining and evolving project requirements. Ability to quickly grasp and apply new concepts and stay up-to-date with emerging trends in software engineering</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>$216,000-$311,000 USD (San Francisco, New York, Seattle); $194,400-$279,000 USD (Hawaii, Washington DC, Texas, Colorado); $162,400-$233,000 USD (St. Louis)</Salaryrange>
      <Skills>Full Stack Development, Cloud-Native Technologies, Data Engineering, AI Application Integration, Problem Solving, Collaboration and Communication, Adaptability and Learning Agility, Docker, Kubernetes, AWS, Azure, GCP, ETL, data modeling, data warehousing, data governance, Large Language Models, prompt engineering, retrieval-augmented generation, agent orchestration</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Scale</Employername>
      <Employerlogo>https://logos.yubhub.co/scale.com.png</Employerlogo>
      <Employerdescription>Scale develops reliable AI systems for the world&apos;s most important decisions.</Employerdescription>
      <Employerwebsite>https://www.scale.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/scaleai/jobs/4674911005</Applyto>
      <Location>San Francisco, CA; St. Louis, MO; New York, NY; Washington, DC</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>53024247-9d6</externalid>
      <Title>Senior Solutions Architect - Lakewatch</Title>
      <Description><![CDATA[<p>We are seeking a Senior Solutions Architect to join our Lakewatch team in London. As a Senior Solutions Architect, you will provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment.</p>
<p>Collaborate with GTM leadership and account teams to design and execute high-impact engagement strategies across your territory, driving Lakewatch adoption from initial data offload through full SIEM augmentation or replacement.</p>
<p>As a trusted advisor, serve as an expert Solutions Architect building technical credibility with CISOs, security architects, SOC leadership, and security analysts to drive product adoption and vision.</p>
<p>Enable clients at scale through workshops, POC execution, and developing customer-facing collateral that increases technical knowledge and demonstrates the value of an open agentic SIEM architecture.</p>
<p>Influence product roadmap by translating field-derived, data-driven insights into strategic recommendations for Product and Engineering teams.</p>
<p>Handle the most complex technical challenges in this product line by acting as the tier-3 escalation point for the field, ensuring customer success in mission-critical security environments.</p>
<p>Establish and refine the sales qualification and POC intake process, ensuring well-scoped engagements that maximize customer success and minimize friction for R&amp;D.</p>
<p>The ideal candidate will have 5+ years of experience in a customer-facing, pre-sales or consulting role influencing technical executives, driving high-level security strategy and product adoption.</p>
<p>Experience with design and implementation of data and AI applications in cybersecurity, including anomaly detection, behavioral analytics, and agentic AI workflows for triage and investigation.</p>
<p>Proficient in programming, debugging, and problem-solving using SQL and Python and with AI tools.</p>
<p>Experience collaborating with Global System Integrators (GSIs) and third-party consulting organizations to drive customer outcomes in cybersecurity.</p>
<p>Hands-on experience building solutions within major public cloud environments (AWS, Azure, or GCP), with an understanding of cloud-native security logging and monitoring.</p>
<p>Deep experience in security operations, with broad familiarity across one or more of the following: data engineering, data warehousing, AI/ML for security, data governance, and streaming.</p>
<p>Undergraduate degree (or higher) in a technical field such as Computer Science, Cybersecurity, Applied Mathematics, Engineering or similar.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>cybersecurity engineering, security operations, security architecture, design and implementation of data and AI applications, anomaly detection, behavioral analytics, agentic AI workflows, SQL, Python, AI tools, cloud-native security logging and monitoring, data engineering, data warehousing, AI/ML for security, data governance, streaming</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that unifies and democratizes data, analytics, and AI for over 10,000 organizations worldwide.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8493140002</Applyto>
      <Location>London, United Kingdom</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>53ee0ef3-c62</externalid>
      <Title>Staff Data Engineer, Analytics Data Engineering</Title>
      <Description><![CDATA[<p>We are looking for a Staff Data Engineer to join our Analytics Data Engineering (ADE) team within Data Science &amp; AI Platform. As a Staff Data Engineer, you will be responsible for solving cross-cutting data challenges that span multiple lines of business while driving standardization in how we build, deploy, and govern analytics pipelines across Dropbox.</p>
<p>This is not a maintenance role. We are modernizing our analytics platform, upgrading orchestration infrastructure, building shared and reusable data models with conformed dimensions, establishing a certified metrics framework, and laying the foundation for AI-native data development. You will partner closely with Data Science, Data Infrastructure, Product Engineering, and Business Intelligence teams to make this happen.</p>
<p>You will play a crucial role in establishing analytics engineering standards, designing scalable data models, and driving cross-functional alignment on data governance. You will get substantial exposure to senior leadership, shape the technical direction of analytics infrastructure at Dropbox, and directly influence how data powers product and business decisions.</p>
<p>Responsibilities:</p>
<ul>
<li>Lead the design and implementation of shared, reusable data models, defining shared fact tables, conformed dimensions, and a semantic/metrics layer that serves as the single source of truth across analytics functions</li>
<li>Drive standardization of data engineering practices across ADE and functional analytics teams, including pipeline patterns, CI/CD workflows, naming conventions, and data modeling standards</li>
<li>Partner with Data Infrastructure to modernize orchestration, improve pipeline decomposition, and establish secure dev/test environments with production data access</li>
<li>Architect and implement a shift-left data governance strategy, working with upstream data producers to establish data contracts, SLOs, and code-enforced quality gates that catch issues before production</li>
<li>Collaborate with Data Science leads and Product Management to translate metric definitions into reliable, certified data pipelines that power executive dashboards, WBR reporting, and growth measurement</li>
<li>Reduce operational burden by improving pipeline granularity, observability, and failure recovery, establishing runbooks and alerting standards that make on-call sustainable</li>
<li>Evaluate and integrate AI-native tooling into the data development lifecycle, enabling conversational data exploration with guardrails and AI-assisted pipeline development</li>
</ul>
<p>Requirements:</p>
<ul>
<li>BS degree in Computer Science or related technical field, or equivalent technical experience</li>
<li>12+ years of experience in data engineering or analytics engineering with increasing scope and technical leadership</li>
<li>12+ years of SQL experience, including complex analytical queries, window functions, and performance optimization at scale (Spark SQL)</li>
<li>8+ years of Python development experience, including building and maintaining production data pipelines</li>
<li>Deep expertise in dimensional data modeling, schema design, and scalable data architecture, with hands-on experience building shared data models across multiple business domains</li>
<li>Strong experience with orchestration tools (Airflow strongly preferred) and dbt, including pipeline design, scheduling strategies, and failure recovery patterns</li>
</ul>
<p>Preferred Qualifications:</p>
<ul>
<li>Experience with Databricks (Unity Catalog, Delta Lake) and modern lakehouse architectures</li>
<li>Experience leading orchestration or platform modernization efforts at scale</li>
<li>Familiarity with data governance and observability tools such as Atlan, Monte Carlo, Great Expectations, or similar</li>
<li>Experience building or contributing to a metrics/semantic layer (dbt MetricFlow, Databricks Metric Views, or equivalent)</li>
<li>Track record of establishing data engineering standards and best practices in a federated analytics organization</li>
</ul>
<p>Compensation:</p>
<p>US Zone 2: $198,900-$269,100 USD</p>
<p>US Zone 3: $176,800-$239,200 USD</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$198,900-$269,100 USD</Salaryrange>
      <Skills>SQL, Python, Dimensional data modeling, Schema design, Scalable data architecture, Orchestration tools, dbt, Databricks, Modern lakehouse architectures, Data governance and observability tools, Metrics/semantic layer</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Dropbox</Employername>
      <Employerlogo>https://logos.yubhub.co/dropbox.com.png</Employerlogo>
      <Employerdescription>Dropbox is a technology company that provides cloud storage and file sharing services.</Employerdescription>
      <Employerwebsite>https://www.dropbox.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/dropbox/jobs/7595183</Applyto>
      <Location>Remote - US: Select locations</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8317ba42-502</externalid>
      <Title>Senior Technical Solutions Engineer (Platform)</Title>
      <Description><![CDATA[<p>We are seeking a highly skilled Frontline Senior Technical Solutions Engineer with 7+ years of experience to join our Platform Support team.</p>
<p>This role is pivotal in delivering exceptional support for our Databricks Data Intelligence platform, addressing complex technical challenges, and ensuring the seamless operation of our data solutions.</p>
<p>As a frontline engineer, you will be the primary point of contact for critical issues, working closely with both internal teams and customers to resolve high-impact problems and drive platform improvements.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Frontline Support: Serve as the primary technical point of contact for escalated issues related to the Databricks Data Intelligence platform. Provide expert-level troubleshooting, diagnostics, and resolution for complex problems affecting system performance and reliability.</li>
<li>Customer Interaction: Engage with customers directly to understand their technical issues and requirements. Provide timely, clear, and actionable solutions to ensure high levels of customer satisfaction.</li>
<li>Incident Management: Lead the resolution of high-priority incidents, coordinating with various teams to address and mitigate issues swiftly. Conduct thorough root cause analyses and develop preventive measures to avoid recurrence.</li>
<li>Collaboration: Work closely with engineering, product management, and DevOps teams to share insights, identify recurring issues, and drive improvements to the Databricks Data Intelligence platform.</li>
<li>Documentation and Knowledge Sharing: Create and maintain detailed documentation on support procedures, known issues, and solutions. Contribute to internal knowledge bases and create training materials to assist other support engineers.</li>
<li>Performance Monitoring: Monitor and analyze platform performance metrics to identify potential issues before they impact customers. Implement optimizations and enhancements to improve platform stability and efficiency.</li>
<li>Platform Upgrades: Manage and oversee the deployment of Databricks Data Intelligence platform upgrades and patches, ensuring minimal disruption to services and maintaining system integrity.</li>
<li>Innovation and Improvement: Stay abreast of industry trends and advancements in Databricks technology. Propose and drive initiatives to enhance platform capabilities and support processes.</li>
<li>Customer Feedback: Collect and analyze customer feedback to drive continuous improvement in support processes and platform features.</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>Experience: 7+ years of hands-on experience in a technical support or engineering role related to the Databricks Data Intelligence platform, cloud data platforms, or big data technologies.</li>
<li>Technical Skills: A deep understanding of Databricks architecture and Apache Spark, along with experience in cloud platforms like AWS, Azure, or GCP, is essential. Strong capabilities in designing and managing data pipelines and distributed computing are required. Proficiency in Unix/Linux administration, familiarity with DevOps practices, and skills in log analysis and monitoring tools are also crucial for effective troubleshooting and system optimization.</li>
<li>Problem-Solving: Demonstrated ability to diagnose and resolve complex technical issues with a strong analytical and methodical approach.</li>
<li>Communication: Exceptional verbal and written communication skills, with the ability to effectively convey technical information to both technical and non-technical stakeholders.</li>
<li>Customer Focus: Proven experience in managing high-impact customer interactions and ensuring a positive customer experience.</li>
<li>Collaboration: Ability to work effectively in a team environment, collaborating with engineering, product, and customer-facing teams.</li>
<li>Education: Bachelor’s degree in Computer Science, Engineering, or a related field. Advanced degree or relevant certifications are highly desirable.</li>
</ul>
<p>Preferred Skills:</p>
<ul>
<li>Experience with additional big data tools and technologies such as Hadoop, Kafka, or NoSQL databases.</li>
<li>Familiarity with automation tools and CI/CD pipelines.</li>
<li>Understanding of data governance and compliance requirements.</li>
</ul>
<p>Why Join Us?</p>
<ul>
<li>Innovative Environment: Work with cutting-edge technology in a fast-paced, innovative company.</li>
<li>Career Growth: Opportunities for professional development and career advancement.</li>
<li>Team Culture: Collaborate with a talented and motivated team dedicated to excellence and continuous improvement.</li>
</ul>
<p>Please note: this role involves working in the EMEA time zone.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Databricks architecture, Apache Spark, AWS, Azure, GCP, Unix/Linux administration, DevOps practices, log analysis and monitoring tools, Hadoop, Kafka, NoSQL databases, automation tools, CI/CD pipelines, data governance and compliance requirements</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8041698002</Applyto>
      <Location>Bengaluru, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>a57339aa-939</externalid>
      <Title>Staff Data Engineer, tvScientific</Title>
      <Description><![CDATA[<p>We&#39;re seeking a Staff Data Engineer to lead the design, implementation, and evolution of our identity services and data governance platform. This role is critical to ensuring trusted, privacy-safe, and well-governed data across the organization.</p>
<p>You will work at the intersection of data engineering, identity resolution, privacy, and platform reliability. This is an individual contributor role, where you will work to define and implement a strategic vision for data engineering within the organization.</p>
<p>Responsibilities:</p>
<ul>
<li>Design and maintain a scalable identity resolution platform</li>
<li>Build pipelines and services to ingest, normalize, link, and version identity data across multiple sources</li>
<li>Ensure deterministic and probabilistic matching logic that is transparent, auditable, and measurable</li>
<li>Partner with product and analytics teams to expose identity data through reliable, well-documented APIs and datasets</li>
<li>Build and operate batch and streaming pipelines using modern data stack tools</li>
<li>Create clear documentation, standards, and runbooks for identity and governance systems</li>
<li>Own data governance foundations including data lineage, quality checks, schema enforcement, and access controls</li>
<li>Implement privacy-by-design principles (PII handling, consent enforcement, retention policies)</li>
<li>Collaborate with legal, privacy, and security teams to operationalize regulatory requirements (e.g., GDPR, CCPA)</li>
<li>Establish monitoring and alerting for data quality, freshness, and integrity</li>
</ul>
<p>What we&#39;re looking for:</p>
<ul>
<li>Production data engineering experience</li>
<li>Bachelor’s degree in computer science, related field or equivalent experience</li>
<li>Proficiency in Spark and Scala, with proven experience building data infrastructure in Spark using Scala</li>
<li>Experience in delivering significant technical initiatives and building reliable, large scale services</li>
<li>Experience in delivering APIs backed by relationship-heavy datasets</li>
<li>Experience implementing data governance practices, including data quality, metadata management, and access controls</li>
<li>Strong understanding of privacy-by-design principles and handling of sensitive or regulated data</li>
<li>Familiarity with data lakes, cloud warehouses, and storage formats</li>
<li>Strong proficiency in AWS services</li>
<li>Excellent written and verbal communication skills</li>
<li>Successful design and implementation of scalable and efficient data infrastructure</li>
<li>High attention to detail in implementation of automated data quality checks</li>
<li>Effective collaboration with cross-functional teams</li>
<li>Demonstrated ability to use AI to improve speed and quality in your day-to-day workflow for relevant outputs</li>
<li>Strong track record of critical evaluation and verification of AI-assisted work (e.g., testing, source-checking, data validation, peer review)</li>
<li>High integrity and ownership: you protect sensitive data, avoid over-reliance on AI, and remain accountable for final decisions and deliverables</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$177,185-$364,795 USD</Salaryrange>
      <Skills>Spark, Scala, Data Engineering, Identity Resolution, Privacy, Platform Reliability, Data Governance, Data Lineage, Quality Checks, Schema Enforcement, Access Controls, AWS Services</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>tvScientific</Employername>
      <Employerlogo>https://logos.yubhub.co/tvscientific.com.png</Employerlogo>
      <Employerdescription>tvScientific is a technology company that provides a CTV advertising platform for performance marketers.</Employerdescription>
      <Employerwebsite>https://www.tvscientific.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/pinterest/jobs/7642253</Applyto>
      <Location>San Francisco, CA, US; Remote, US</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>47807ca3-e36</externalid>
      <Title>Strategic AI/BI Account Executive</Title>
      <Description><![CDATA[<p>We are seeking a Strategic AI/BI Account Executive to help enterprise customers transform how business users interact with data. This high-impact role sits within the AI Go-To-Market team and partners closely with Enterprise Account Executives to drive adoption of Databricks AI/BI and Genie in APJ.</p>
<p>You will help organisations move beyond static dashboards to governed, conversational, AI-powered analytics at the centre of the convergence of business intelligence, data platforms, and generative AI. Enterprise analytics is rapidly evolving from dashboards and static reporting to conversational, AI-driven decision platforms. Databricks AI/BI and Genie empower business users to securely interact with governed data using natural language, transforming the data platform into a true decision platform.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Partner with Enterprise AEs to identify, qualify, and close AI/BI opportunities</li>
<li>Engage C-level, analytics, and line-of-business leaders to modernise analytics strategies</li>
<li>Displace or expand legacy BI platforms with AI-powered, governed analytics solutions</li>
<li>Lead conversations around semantic governance, self-service analytics, and natural language data access</li>
<li>Drive proof-of-value engagements and scale enterprise-wide adoption</li>
<li>Align AI/BI initiatives to measurable business outcomes (productivity, speed to insight, revenue impact)</li>
<li>Enable field teams and serve as a subject matter expert on modern analytics architectures</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>Enterprise sales experience in BI, analytics, data platforms, or AI/ML</li>
<li>Strong understanding of modern analytics architectures and data governance</li>
<li>Ability to sell to both technical and business stakeholders</li>
<li>Executive presence and experience navigating complex buying cycles</li>
<li>Passion for AI and the impact of GenAI on enterprise analytics</li>
<li>Experience operating in a specialist or overlay sales model</li>
<li>Ability to translate technical capabilities into clear business value</li>
<li>7+ years of Enterprise Sales experience, exceeding quotas in larger accounts</li>
</ul>
<p>Preferred qualifications include:</p>
<ul>
<li>Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot</li>
<li>Familiarity with semantic layers, metrics stores, or governed data models</li>
<li>Understanding of lakehouse architectures and cloud data platforms</li>
<li>Exposure to GenAI, natural language interfaces, or conversational applications</li>
<li>Consulting or solution design experience in customer-facing roles</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise sales experience in BI, analytics, data platforms, or AI/ML, Strong understanding of modern analytics architectures and data governance, Ability to sell to both technical and business stakeholders, Executive presence and experience navigating complex buying cycles, Passion for AI and the impact of GenAI on enterprise analytics, Experience with modern BI platforms such as Tableau, Power BI, Looker, or ThoughtSpot, Familiarity with semantic layers, metrics stores, or governed data models, Understanding of lakehouse architectures and cloud data platforms, Exposure to GenAI, natural language interfaces, or conversational applications, Consulting or solution design experience in customer-facing roles</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company with over 10,000 organisations worldwide relying on its data intelligence platform.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8441884002</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>276f3a05-2e9</externalid>
      <Title>Field CTO - Americas Industries</Title>
      <Description><![CDATA[<p>We are seeking a Field Chief Technology Officer (Field CTO) for the Americas Industries Business Unit to be a senior, customer-facing technology and business transformation thought leader for our most strategic, often global, accounts in regulated industries.</p>
<p>This individual contributor role sits at the intersection of data and AI strategy, industry transformation, and executive relationship-building, working closely with C-level leaders to drive multi-year change on the data platform while representing real-world needs back into Databricks.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Building and maintaining trusted-advisor relationships with C-level executives in large US-based and global accounts, especially in highly regulated industries.</li>
<li>Cultivating a strong social and professional network across customer executives, boards, key industry bodies, and partners.</li>
<li>Shaping executive thinking on modern data and AI architectures, with emphasis on Lakehouse and data platform modernization as the primary lever for long-term Gen AI impact.</li>
<li>Leading C-level briefings, strategy sessions, and multi-day workshops that connect business outcomes, regulatory constraints, and operating model change to concrete Databricks-based roadmaps.</li>
<li>Serving as a deep technical counterpart in the field, maintaining L200–L300 proficiency across Databricks products and being able to credibly engage architects, data engineers, and data scientists on solution design and trade-offs.</li>
<li>Generalizing patterns from the field into reusable reference architectures, industry blueprints, and best practices for regulated industries, and sharing them through blogs, webinars, whitepapers, and conference keynotes.</li>
<li>Orchestrating the broader ecosystem (cloud providers, GSIs, consultancies, ISVs) around customer objectives, ensuring Databricks is at the center of multi-year transformation programs rather than isolated projects.</li>
<li>Partnering with Account Executives, Solutions Architects, Industry Leads, and Product Specialists to drive complex, multi-year sales cycles, securing platform decisions and expansions while influencing ACV and consumption growth.</li>
<li>Providing structured, prioritized feedback from strategic customers into Product, Engineering, and Field leadership to influence product roadmap, especially around data, governance, security, and regulated-industry requirements.</li>
<li>Mentoring senior Field Engineering and industry-focused talent, contributing to a pipeline of principal- and CTO-level leaders and codifying ways of working for complex, regulated accounts.</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>15+ years of experience spanning enterprise technology and consulting, including leading or advising on multi-year data platform and analytics transformations in large, complex organizations.</li>
<li>Significant time spent inside a large enterprise software or cloud company in roles that required navigating matrixed organizations and driving change at scale, combined with direct industry exposure rather than a career spent solely in horizontal software.</li>
<li>Experience in or with regulated industries, with familiarity with regulatory and compliance considerations affecting data and AI platforms.</li>
<li>A background that blends hands-on technology and architecture work on data platforms and analytics, organizational and operating model change, executive consulting or advisory, and proven ability to operate as a highly credible peer to C-level executives.</li>
<li>Strong, proactive networker who is naturally curious about which associations, councils, and forums matter for a given customer set, and who uses those networks to create new executive entry points and opportunities.</li>
<li>Demonstrated longevity and impact in prior roles, with evidence of building and sustaining long-term customer relationships and programs rather than frequent short stints.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$249,800-$343,400 USD</Salaryrange>
      <Skills>data and AI strategy, industry transformation, executive relationship-building, Lakehouse and data platform modernization, Gen AI impact, L200–L300 proficiency across Databricks products, solution design and trade-offs, reference architectures, industry blueprints, best practices for regulated industries, cloud providers, GSIs, consultancies, ISVs, complex, multi-year sales cycles, platform decisions and expansions, ACV and consumption growth, product roadmap, data governance, security, regulated-industry requirements</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified and democratized data, analytics, and AI platform. It has over 10,000 organizations worldwide as clients.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8306218002</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>e0907526-c49</externalid>
      <Title>Senior Privacy Architect Manager</Title>
      <Description><![CDATA[<p>We are looking for a Senior Manager, Privacy Architect to join our Privacy &amp; Data Security Team. As our growth accelerates through AI-powered personalization and innovative social features, privacy takes on new importance, fueling our ability to deliver magical, personalized experiences while ensuring our users feel safe, respected, and in control.</p>
<p>The ideal candidate will have deep knowledge of current privacy and technology trends, with a strong passion for data governance/management and AI/ML. They will have demonstrated experience in ensuring privacy-by-design principles are applied throughout the design, construction, and operation of digital products and services at scale.</p>
<p>Responsibilities include leading high-impact initiatives and defining technical requirements for compliance and the responsible use of technology at Airbnb. The candidate will work closely with the Chief Privacy Officer, Legal, and other Privacy &amp; Data Security Team members, as well as engineering and data science teams.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Collaborating with technical teams in the identification and effective management of company-wide risks to privacy and the responsible use of technology</li>
<li>Leading definition and implementation of company-wide standards, practices, and patterns to protect and manage personal data in accordance with privacy and AI regulations</li>
<li>Working with Legal, Data Science, Data Governance, and InfoSec to introduce Privacy by Design principles in company products and infrastructure</li>
<li>Creating privacy training for technical roles, including data engineers, developers, and data scientists</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>15+ years of total experience, with 5+ years of experience in technical program/project management or privacy engineering focused on building technology products and/or systems</li>
<li>Deep understanding of large-scale, “Big Data” data stores and technologies</li>
<li>Strong familiarity with the AI/ML development lifecycle: from data collection and curation, through model architecture selection, training, testing, A/B testing, and deployment</li>
<li>Solid understanding of Large Language Models (LLMs), Generative AI and AI Agents, including compliance and responsible use challenges arising from their deployment in B2C services</li>
<li>Strong familiarity with Privacy Enhancing Technologies (PETs), such as various types of encryption, de-identification methods (e.g., k-anonymity, differential privacy), and AI/ML interpretability techniques (e.g., SHAP, LIME)</li>
</ul>
<p>Preferred qualifications include:</p>
<ul>
<li>Professional certifications such as Certified Information Privacy Professional (CIPP), Certified Information Privacy Manager (CIPM), or AI Governance Professional (AIGP) or equivalent</li>
<li>BA/BS and/or advanced degree in engineering, computer science, mathematics, statistics, physics, or a related field</li>
<li>Experience with programming languages and tools commonly used in AI, such as R, Python and Github</li>
</ul>
<p>This position is US - Remote Eligible. The role may include occasional work at an Airbnb office or attendance at offsites, as agreed to with your manager.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Privacy, Data Governance, AI/ML, Large Scale Data Stores, Big Data, Large Language Models, Generative AI, AI Agents, Privacy Enhancing Technologies, Encryption, De-identification, AI/ML Interpretability, Certified Information Privacy Professional (CIPP), Certified Information Privacy Manager (CIPM), AI Governance Professional (AIGP), R, Python, Github</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Airbnb</Employername>
      <Employerlogo>https://logos.yubhub.co/airbnb.com.png</Employerlogo>
      <Employerdescription>Airbnb is a global online marketplace for short-term vacation rentals. Founded in 2008, it has grown to become one of the largest online marketplaces for accommodations.</Employerdescription>
      <Employerwebsite>https://www.airbnb.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/airbnb/jobs/7782533</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>760c3e88-e35</externalid>
      <Title>Senior Product Manager, Data</Title>
      <Description><![CDATA[
<p>We are seeking a Senior Product Manager to support the development of CoreWeave&#39;s Enterprise Data Platform within the CIO organization. This role will contribute to building a scalable, high-performance data lake and data architecture, integrating data from key sources across Operations, Engineering, Sales, Finance, and other IT partners.</p>
<p>As a Senior Product Manager for Data Infrastructure and Analytics, you will help drive data ingestion, transformation, governance, and analytics enablement. You will collaborate with engineering, analytics, finance, and business teams to help deliver data lake and pipeline orchestration solutions, ensuring accessible data for business insights.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Own and evangelize Data Platform and Business Analytics roadmap and strategy across CoreWeave</li>
<li>Assist with the execution of CoreWeave&#39;s enterprise data architecture, helping enable the data lake and domain-driven data layer</li>
<li>Support the development and enhancement of data ingestion, transformation, and orchestration pipelines for scalability, efficiency, and reliability</li>
<li>Work with the Engineering and Data teams to maintain and enhance data pipelines for both structured and unstructured data, enabling efficient data movement across the organization</li>
<li>Collaborate with Finance, GTM, Infrastructure, Data Center, and Supply Chain teams to help unify and model data from core systems (ERP, CRM, Asset Mgmt, Supply Chain systems, etc.)</li>
<li>Contribute to data governance and quality initiatives, focusing on data consistency, lineage tracking, and compliance with security standards</li>
<li>Support the BI and analytics layer by partnering with stakeholders to enable data products, dashboards, and reporting capabilities</li>
<li>Help prioritize data-driven initiatives, ensuring alignment with business goals and operational needs in coordination with leadership</li>
</ul>
<p>Requirements:</p>
<ul>
<li>5+ years of experience in data product management, data architecture, or enterprise data engineering roles</li>
<li>Familiarity with data lakes, data warehouses, ETL/ELT and streaming pipelines, and data governance frameworks</li>
<li>Hands-on experience with modern data stack technologies (such as Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka)</li>
<li>Understanding of data modeling, domain-driven design, and creating scalable data platforms</li>
<li>Experience supporting the end-to-end data product lifecycle, including requirements gathering and implementation</li>
<li>Strong collaboration skills with engineering, analytics, and business teams to help deliver data initiatives</li>
<li>Awareness of data security, compliance, and governance best practices</li>
<li>Understanding of BI and analytics platforms (such as Tableau, Looker, Power BI) and supporting self-service analytics</li>
</ul>
<p>Why CoreWeave?</p>
<p>At CoreWeave, we work hard, have fun, and move fast! We&#39;re in an exciting stage of hyper-growth that you will not want to miss out on. We&#39;re not afraid of a little chaos, and we&#39;re constantly learning. Our team cares deeply about how we build our product and how we work together, which is represented through our core values:</p>
<ul>
<li>Be Curious at Your Core</li>
<li>Act Like an Owner</li>
<li>Empower Employees</li>
<li>Deliver Best-in-Class Client Experiences</li>
<li>Achieve More Together</li>
</ul>
<p>We support and encourage an entrepreneurial outlook and independent thinking. We foster an environment that encourages collaboration and provides the opportunity to develop innovative solutions to complex problems. As we get set for takeoff, the growth opportunities within the organization are constantly expanding. You will be surrounded by some of the best talent in the industry, who will want to learn from you, too. Come join us!</p>
<p>Salary Range: $143,000 to $210,000</p>
<p>Benefits:</p>
<ul>
<li>Medical, dental, and vision insurance - 100% paid for by CoreWeave</li>
<li>Company-paid Life Insurance</li>
<li>Voluntary supplemental life insurance</li>
<li>Short and long-term disability insurance</li>
<li>Flexible Spending Account</li>
<li>Health Savings Account</li>
<li>Tuition Reimbursement</li>
<li>Ability to Participate in Employee Stock Purchase Program (ESPP)</li>
<li>Mental Wellness Benefits through Spring Health</li>
<li>Family-Forming support provided by Carrot</li>
<li>Paid Parental Leave</li>
<li>Flexible, full-service childcare support with Kinside</li>
<li>401(k) with a generous employer match</li>
<li>Flexible PTO</li>
<li>Catered lunch each day in our office and data center locations</li>
<li>A casual work environment</li>
<li>A work culture focused on innovative disruption</li>
</ul>
<p>Workplace:</p>
<p>While we prioritize a hybrid work environment, remote work may be considered for candidates located more than 30 miles from an office, based on role requirements for specialized skill sets. New hires will be invited to attend onboarding at one of our hubs within their first month. Teams also gather quarterly to support collaboration.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$143,000 to $210,000</Salaryrange>
      <Skills>data product management, data architecture, enterprise data engineering, data lakes, data warehouses, ETL/ELT and streaming pipelines, data governance frameworks, modern data stack technologies, Snowflake, BigQuery, Databricks, Apache Spark, Airflow, DBT, Kafka, data modeling, domain-driven design, scalable data platforms, BI and analytics platforms, Tableau, Looker, Power BI</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>CoreWeave</Employername>
      <Employerlogo>https://logos.yubhub.co/coreweave.com.png</Employerlogo>
      <Employerdescription>CoreWeave is a cloud-based platform that enables innovators to build and scale AI with confidence.</Employerdescription>
      <Employerwebsite>https://www.coreweave.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/coreweave/jobs/4649824006</Applyto>
      <Location>Livingston, NJ / New York, NY / Sunnyvale, CA / Bellevue, WA / San Francisco, CA</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>41a793fc-9ff</externalid>
      <Title>Staff Data Analyst (Bengaluru)</Title>
      <Description><![CDATA[<p>We are looking for an experienced data analyst to join Okta&#39;s enterprise data team. The successful candidate will have a strong background in financial and business performance analytics, and a proven track record of proactively identifying and solving complex business problems through data.</p>
<p>In this role, you will be focusing on Finance data and reporting, partnering with Finance, Accounting, Sales Operations, and Executive Leadership to implement enhancements and build end-to-end financial insights across the organization.</p>
<p>Responsibilities:</p>
<ul>
<li>Proactively partner with Finance and Accounting leadership to set the analytics roadmap and identify high-impact opportunities for data to drive business value</li>
<li>Serve as a trusted advisor to senior Finance and business stakeholders, influencing their strategy and decision-making through data-driven narratives</li>
<li>Translate ambiguous business questions into clear, structured analytical requirements and measurable project plans</li>
<li>Partner with Finance and Operations teams to provide best practices in financial metric definition, dashboard design, and modeling</li>
<li>Conduct deep-dive, root-cause analyses on performance variances, translating complex data into clear, strategic recommendations</li>
<li>Design and build scalable data models to support enterprise-wide financial reporting</li>
<li>Own the entire lifecycle of financial data products, from initial concept to driving adoption and measuring business impact</li>
<li>Enable self-service data consumption for business users</li>
<li>Develop and champion new analytical methods and tools to continuously improve financial reporting and decision-making processes</li>
<li>Work with Data Engineering to define, implement, and build new data sources and transformations</li>
</ul>
<p>Requirements:</p>
<ul>
<li>8+ years&#39; experience as a Data Analyst</li>
<li>6+ years&#39; hands-on SQL experience in a work environment</li>
<li>Expertise in developing and maintaining complex financial models, including scenario planning, predictive forecasting, and analysis</li>
<li>Experience building large-scale data models (e.g., using dbt or Airflow), including proven experience modeling intricate financial metrics</li>
<li>Experience with data management, documenting processes and data flows, and ensuring data quality</li>
<li>Familiarity with data quality frameworks and monitoring tools</li>
<li>Experience building AI-ready data and semantic layers</li>
<li>Experience building reports and visualizations to represent data intuitively in Tableau or similar data visualization tools</li>
<li>Exceptional communication, presentation, and storytelling skills, with the ability to convey complex analytical findings to executive audiences</li>
<li>Demonstrated ability to operate independently and execute projects with minimal supervision</li>
<li>Experience with ETL processes, software development, and lifecycle awareness</li>
<li>Familiarity with data governance and report/data catalog applications (Collibra, Alation, Data.world)</li>
</ul>
<p>The Okta Experience:</p>
<ul>
<li>Supporting Your Well-Being</li>
<li>Driving Social Impact</li>
<li>Developing Talent and Fostering Connection + Community</li>
</ul>
<p>Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, data management, financial modeling, data quality frameworks, data visualization, ETL processes, software development, data governance</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>Okta</Employername>
      <Employerlogo>https://logos.yubhub.co/okta.com.png</Employerlogo>
      <Employerdescription>Okta is a technology company that provides identity and access management solutions.</Employerdescription>
      <Employerwebsite>https://www.okta.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/okta/jobs/7645984</Applyto>
      <Location>Bengaluru, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3e74b079-420</externalid>
      <Title>Senior Workday Functional Analyst</Title>
      <Description><![CDATA[<p>We are looking for a Senior Workday Functional Analyst to join our Business Solutions Team. As a Senior Workday Functional Analyst, you will partner closely with the People Technology leadership and play a key role in our Workday transformation efforts.</p>
<p>In this critical role, you will collaborate with stakeholders across People, Payroll, IT, Finance, and international offices to optimize and scale our HR systems and processes. This role is ideal for someone who is passionate about solving complex business challenges, driving innovation through automation, and delivering a best-in-class user experience.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Lead full lifecycle Phase X implementations of Workday HCM and Workday Talent or Workday Benefits</li>
<li>Conduct discovery sessions to gather, analyze, and document business requirements</li>
<li>Translate requirements into scalable Workday configurations</li>
<li>Provide strategic recommendations aligned with HR best practices</li>
<li>Serve as a Workday subject matter expert (SME), with strong knowledge across multiple modules, including HCM and Workday Talent or Workday Benefits</li>
</ul>
<p>Required skills and qualifications include:</p>
<ul>
<li>5+ years of Workday Functional consulting experience</li>
<li>One or more full lifecycle implementations of Workday HCM and Benefits/Talent</li>
<li>Strong expertise in either Workday Talent (Performance, Succession, Career, Talent Review), or Workday Benefits (Plans, Eligibility, Open Enrollment, Life Events)</li>
<li>Deep knowledge of Workday business process configuration</li>
<li>Strong knowledge of Microsoft Excel, including advanced formulas, PivotTables, data analysis, charts, and reporting</li>
<li>Experience with security configuration and reporting</li>
<li>Strong client-facing consulting and workshop facilitation skills</li>
<li>Hands-on experience developing and enhancing custom reports and dashboards in Workday</li>
<li>Strong proficiency in EIB (Enterprise Interface Builder), and Report as a Service (RaaS) integrations</li>
<li>Strong proficiency with Workday Calculated Fields</li>
<li>Strong analytical and problem-solving skills, with the ability to propose and implement scalable solutions</li>
<li>Excellent interpersonal and communication skills, with the ability to translate complex technical concepts to business audiences</li>
<li>Comfortable working in a fast-paced, diverse, and collaborative environment</li>
<li>Eagerness to learn and expand Workday knowledge across additional modules or areas of functionality</li>
</ul>
<p>Preferred skills include:</p>
<ul>
<li>Workday Pro or Workday Certification in HCM and/or Talent/Benefits</li>
<li>Experience with global HCM implementations</li>
<li>Background working in mid-to-large enterprise environments or consulting firms</li>
<li>Experience with SOX compliance, data governance, and audits</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Workday Functional consulting experience, Workday HCM and Benefits/Talent, Workday business process configuration, Microsoft Excel, Security configuration and reporting, Custom reports and dashboards in Workday, EIB (Enterprise Interface Builder), Report as a Service (RaaS) integrations, Workday Calculated Fields, Analytical and problem-solving skills, Workday Pro or Workday Certification in HCM and/or Talent/Benefits, Global HCM implementations, Mid-to-large enterprise environments or consulting firms, SOX compliance, data governance, and audits</Skills>
      <Category>IT</Category>
      <Industry>Technology</Industry>
      <Employername>Okta</Employername>
      <Employerlogo>https://logos.yubhub.co/okta.com.png</Employerlogo>
      <Employerdescription>Okta is a software company that provides identity and access management solutions.</Employerdescription>
      <Employerwebsite>https://www.okta.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/okta/jobs/7653064</Applyto>
      <Location>Bengaluru, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>80b94e35-0f3</externalid>
      <Title>Staff Technical Solutions Engineer (Platform)</Title>
      <Description><![CDATA[<p>We are seeking a highly skilled Frontline Staff Technical Solutions Engineer with 12+ years of experience to join our Platform Support team. This role is pivotal in delivering exceptional support for our Databricks Data Intelligence platform, addressing complex technical challenges, and ensuring the seamless operation of our data solutions.</p>
<p>As a frontline engineer, you will be the primary point of contact for critical issues, working closely with both internal teams and customers to resolve high-impact problems and drive platform improvements.</p>
<p>Key Responsibilities:</p>
<ul>
<li>Frontline Support: Serve as the primary technical point of contact for escalated issues related to the Databricks Data Intelligence platform. Provide expert-level troubleshooting, diagnostics, and resolution for complex problems affecting system performance and reliability.</li>
<li>Customer Interaction: Engage with customers directly to understand their technical issues and requirements. Provide timely, clear, and actionable solutions to ensure high levels of customer satisfaction.</li>
<li>Incident Management: Lead the resolution of high-priority incidents, coordinating with various teams to address and mitigate issues swiftly. Conduct thorough root cause analyses and develop preventive measures to avoid recurrence.</li>
<li>Collaboration: Work closely with engineering, product management, and DevOps teams to share insights, identify recurring issues, and drive improvements to the Databricks Data Intelligence platform.</li>
<li>Documentation and Knowledge Sharing: Create and maintain detailed documentation on support procedures, known issues, and solutions. Contribute to internal knowledge bases and create training materials to assist other support engineers.</li>
<li>Performance Monitoring: Monitor and analyze platform performance metrics to identify potential issues before they impact customers. Implement optimizations and enhancements to improve platform stability and efficiency.</li>
<li>Platform Upgrades: Manage and oversee the deployment of Databricks Data Intelligence platform upgrades and patches, ensuring minimal disruption to services and maintaining system integrity.</li>
<li>Innovation and Improvement: Stay abreast of industry trends and advancements in Databricks technology. Propose and drive initiatives to enhance platform capabilities and support processes.</li>
<li>Customer Feedback: Collect and analyze customer feedback to drive continuous improvement in support processes and platform features.</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>Experience: Minimum of 12 years of hands-on experience in a technical support or engineering role related to Databricks Data Intelligence platform, cloud data platforms, or big data technologies.</li>
<li>Technical Skills: A deep understanding of Databricks architecture and Apache Spark, along with experience in cloud platforms like AWS, Azure, or GCP, is essential. Strong capabilities in designing and managing data pipelines and distributed computing are required. Proficiency in Unix/Linux administration, familiarity with DevOps practices, and skills in log analysis and monitoring tools are also crucial for effective troubleshooting and system optimization.</li>
<li>Problem-Solving: Demonstrated ability to diagnose and resolve complex technical issues with a strong analytical and methodical approach.</li>
<li>Communication: Exceptional verbal and written communication skills, with the ability to effectively convey technical information to both technical and non-technical stakeholders.</li>
<li>Customer Focus: Proven experience in managing high-impact customer interactions and ensuring a positive customer experience.</li>
<li>Collaboration: Ability to work effectively in a team environment, collaborating with engineering, product, and customer-facing teams.</li>
<li>Education: Bachelor’s degree in Computer Science, Engineering, or a related field. Advanced degree or relevant certifications are highly desirable.</li>
</ul>
<p>Preferred Skills:</p>
<ul>
<li>Experience with additional big data tools and technologies such as Hadoop, Kafka, or NoSQL databases.</li>
<li>Familiarity with automation tools and CI/CD pipelines.</li>
<li>Understanding of data governance and compliance requirements.</li>
</ul>
<p>Why Join Us?</p>
<ul>
<li>Innovative Environment: Work with cutting-edge technology in a fast-paced, innovative company.</li>
<li>Career Growth: Opportunities for professional development and career advancement.</li>
<li>Team Culture: Collaborate with a talented and motivated team dedicated to excellence and continuous improvement.</li>
</ul>
<p>About Databricks</p>
<p>Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.</p>
<p>Benefits</p>
<p>At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region click here.</p>
<p>Our Commitment to Diversity and Inclusion</p>
<p>At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Databricks architecture, Apache Spark, AWS, Azure, GCP, Unix/Linux administration, DevOps practices, log analysis and monitoring tools, Hadoop, Kafka, NoSQL databases, automation tools, CI/CD pipelines, data governance and compliance requirements</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/7845334002</Applyto>
      <Location>Bengaluru, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>3dddf8d5-2d4</externalid>
      <Title>Senior Workday Functional Analyst</Title>
      <Description><![CDATA[<p>We are looking for a Senior Workday Functional Analyst to join our Business Solutions Team. In this critical role, you will partner closely with the People Technology leadership and play a key role in our Workday transformation efforts.</p>
<p>As part of a cross-functional team, you will collaborate with stakeholders across People, Payroll, IT, Finance, and international offices to optimize and scale our HR systems and processes.</p>
<p>This role is ideal for someone who is passionate about solving complex business challenges, driving innovation through automation, and delivering a best-in-class user experience.</p>
<p><strong>Job Duties and Responsibilities</strong></p>
<ul>
<li>Lead full lifecycle Phase X implementations of Workday HCM and Workday Talent or Workday Benefits</li>
<li>Conduct discovery sessions to gather, analyze, and document business requirements</li>
<li>Translate requirements into scalable Workday configurations</li>
<li>Provide strategic recommendations aligned with HR best practices</li>
<li>Serve as a Workday subject matter expert (SME), with strong knowledge across multiple modules: HCM and Workday Talent or Workday Benefits</li>
</ul>
<p><strong>Required Skills and Qualifications</strong></p>
<ul>
<li>5+ years of Workday Functional consulting experience</li>
<li>One or more full lifecycle implementations of Workday HCM and Benefits/Talent</li>
<li>Experience supporting at least one annual event, such as a Merit Cycle, Performance Review, or Open Enrollment</li>
<li>Strong expertise in either:
<ul>
<li>Workday Talent (Performance, Succession, Career, Talent Review), or</li>
<li>Workday Benefits (Plans, Eligibility, Open Enrollment, Life Events)</li>
</ul>
</li>
<li>Deep knowledge of Workday business process configuration</li>
<li>Strong knowledge of Microsoft Excel, including advanced formulas, PivotTables, data analysis, charts, and reporting</li>
<li>Experience with security configuration and reporting</li>
<li>Strong client-facing consulting and workshop facilitation skills</li>
<li>Hands-on experience developing and enhancing custom reports and dashboards in Workday</li>
<li>Strong proficiency in EIB (Enterprise Interface Builder), and Report as a Service (RaaS) integrations</li>
<li>Strong proficiency with Workday Calculated Fields</li>
<li>Strong analytical and problem-solving skills, with the ability to propose and implement scalable solutions.</li>
<li>Excellent interpersonal and communication skills, with the ability to translate complex technical concepts to business audiences.</li>
<li>Comfortable working in a fast-paced, diverse, and collaborative environment.</li>
<li>Eagerness to learn and expand Workday knowledge across additional modules or areas of functionality.</li>
</ul>
<p><strong>Preferred Skills</strong></p>
<ul>
<li>Workday Pro or Workday Certification in HCM and/or Talent/Benefits</li>
<li>Experience with global HCM implementations</li>
<li>Background working in mid-to-large enterprise environments or consulting firms.</li>
<li>Experience with SOX compliance, data governance, and audits.</li>
</ul>
<p><strong>Key Competencies</strong></p>
<ul>
<li>Strategic thinking and solution design</li>
<li>Executive communication skills</li>
<li>Stakeholder management</li>
<li>Problem-solving and analytical ability</li>
</ul>
<p><strong>The Okta Experience</strong></p>
<p>Okta is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran.</p>
<p>If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding please use this Form to request an accommodation.</p>
<p>Notice for New York City Applicants &amp; Employees: Okta may use Automated Employment Decision Tools (AEDT), as defined by New York City Local Law 144, that use artificial intelligence, machine learning, or other automated processes to assist in our recruitment and hiring process.</p>
<p>Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Personnel and Job Candidate Privacy Notice at https://www.okta.com/legal/personnel-policy/</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Workday Functional consulting experience, Workday HCM and Benefits/Talent, Microsoft Excel, Security configuration and reporting, Custom reports and dashboards in Workday, EIB (Enterprise Interface Builder), Report as a Service (RaaS) integrations, Workday Calculated Fields, Analytical and problem-solving skills, Workday Pro or Workday Certification in HCM and/or Talent/Benefits, Global HCM implementations, Mid-to-large enterprise environments or consulting firms, SOX compliance, data governance, and audits</Skills>
      <Category>IT</Category>
      <Industry>Technology</Industry>
      <Employername>Okta</Employername>
      <Employerlogo>https://logos.yubhub.co/okta.com.png</Employerlogo>
      <Employerdescription>Okta is a software company that provides identity and access management solutions.</Employerdescription>
      <Employerwebsite>https://www.okta.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/okta/jobs/6596372</Applyto>
      <Location>Bengaluru, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>c6d7f1a0-882</externalid>
      <Title>Resident Solutions Architect - Mumbai</Title>
      <Description><![CDATA[<p>We are seeking an experienced Resident Solution Architect (RSA) to join our Professional Services team and work directly with strategic customers on their data and AI transformation initiatives using the Databricks platform.</p>
<p>As an RSA, you will serve as a trusted technical advisor and hands-on expert, guiding customers to solve complex big data challenges using the Databricks platform.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Collaborating with customers to understand their data and AI transformation goals and developing tailored solutions using the Databricks platform</li>
<li>Designing and implementing scalable and secure data architectures using Apache Spark, Delta Lake, and other Databricks technologies</li>
<li>Providing expert-level technical guidance and support to customers during the implementation process</li>
<li>Identifying and addressing potential roadblocks and providing creative solutions to overcome them</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>10+ years of experience with Big Data Technologies such as Apache Spark, Kafka, and Data Lakes in a customer-facing post-sales, technical architecture, or consulting role</li>
<li>4+ years of experience as a Solution Architect creating designs, solving Big Data challenges for customers</li>
<li>Expertise in Apache Spark, distributed computing, and Databricks platform capabilities</li>
<li>Comfortable writing code in Python, PySpark, and Scala</li>
<li>Exceptional SQL, Spark SQL, Spark-streaming skills</li>
<li>Advanced knowledge of Spark optimizations, Delta, Databricks Lakehouse Platforms</li>
<li>Expertise in Azure</li>
<li>Expertise in NoSQL databases (MongoDB, Redis, HBase)</li>
<li>Expertise in data governance and security (Unity Catalog, RBAC)</li>
<li>Ability to work with Partner Organization and deliver complex programs</li>
<li>Ability to lead large technical delivery teams</li>
<li>Understanding of the larger competitive landscape, such as EMR, Snowflake, and SageMaker</li>
<li>Experience with migration from on-prem/cloud platforms to Databricks is a plus</li>
<li>Excellent communication and client-facing consulting skills, with the ability to simplify complex technical concepts</li>
<li>Willingness to travel for onsite customer engagements within India</li>
<li>Documentation and white-boarding skills</li>
</ul>
<p>Good-to-have Skills:</p>
<ul>
<li>Experience with ML libraries/frameworks: Scikit-learn, TensorFlow, PyTorch</li>
<li>Familiarity with MLOps tools and processes, including MLflow for tracking and deployment</li>
<li>Experience delivering LLM and GenAI solutions at scale (RAG architectures, prompt engineering)</li>
<li>Extensive experience on Hadoop, Trino, Ranger and other open-source technology stack</li>
<li>Expertise on cloud platforms like AWS and GCP</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Apache Spark, Kafka, Data Lakes, Python, PySpark, Scala, SQL, Spark SQL, Spark-streaming, Azure, NoSQL databases, data governance, security, Unity Catalog, RBAC, ML libraries/frameworks, MLOps tools and processes, LLM and GenAI solutions, Hadoop, Trino, Ranger, AWS, GCP</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8107166002</Applyto>
      <Location>Mumbai, India</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>8aca0e87-abb</externalid>
      <Title>Strategic AI/BI Account Executive</Title>
      <Description><![CDATA[<p>We are seeking a Strategic AI/BI Account Executive to help enterprise customers transform how business users interact with data. This high-impact role sits within the AI Go-To-Market team and partners closely with Enterprise Account Executives to drive adoption of Databricks AI/BI and Genie.</p>
<p>You will help organizations move beyond static dashboards to governed, conversational, AI-powered analytics at the center of the convergence of business intelligence, data platforms, and generative AI.</p>
<p>Enterprise analytics is rapidly evolving from dashboards and static reporting to conversational, AI-driven decision platforms. Databricks AI/BI and Genie empower business users to securely interact with governed data using natural language, transforming the data platform into a true decision platform.</p>
<p>If you want to be at the forefront of AI-powered analytics transformation at one of the fastest-growing data and AI companies in the world, this is your opportunity.</p>
<p>The impact you will have:</p>
<ul>
<li>Partner with Enterprise AEs to identify, qualify, and close AI/BI opportunities</li>
<li>Engage C-level, analytics, and line-of-business leaders to modernize analytics strategies</li>
<li>Displace or expand legacy BI platforms with AI-powered, governed analytics solutions</li>
<li>Lead conversations around semantic governance, self-service analytics, and natural language data access</li>
<li>Drive proof-of-value engagements and scale enterprise-wide adoption</li>
<li>Align AI/BI initiatives to measurable business outcomes (productivity, speed to insight, revenue impact)</li>
<li>Enable field teams and serve as a subject matter expert on modern analytics architectures</li>
</ul>
<p>What we look for:</p>
<ul>
<li>Enterprise sales experience in BI, analytics, data platforms, or AI/ML</li>
<li>Strong understanding of modern analytics architectures and data governance</li>
<li>Ability to sell to both technical and business stakeholders</li>
<li>Executive presence and experience navigating complex buying cycles</li>
<li>Passion for AI and the impact of GenAI on enterprise analytics</li>
<li>Experience operating in a specialist or overlay sales model</li>
<li>Ability to translate technical capabilities into clear business value</li>
<li>7+ years of Enterprise Sales experience, exceeding quotas in larger accounts</li>
<li>Bachelor's degree or equivalent experience</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise sales experience in BI, analytics, data platforms, or AI/ML, Strong understanding of modern analytics architectures and data governance, Ability to sell to both technical and business stakeholders, Executive presence and experience navigating complex buying cycles, Passion for AI and the impact of GenAI on enterprise analytics</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI. More than 10,000 organizations worldwide rely on it.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8441888002</Applyto>
      <Location>London, United Kingdom</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>47483e13-115</externalid>
      <Title>Staff Product Manager - Technical</Title>
      <Description><![CDATA[<p>As a Technical Product Manager, you will work closely with product managers, engineering teams, and technical field organizations to ensure the features we design and ship deliver outstanding user experiences.</p>
<p>You will help shape our transactional database capabilities to meet the performance, reliability, and scalability requirements of modern applications and AI agents, or you will help ensure data assets are governed effectively, enabling controlled access, compliance, and visibility across the organization.</p>
<p>This role requires you to deeply understand both functional and non-functional requirements, such as performance, scalability, security, and compliance, and how customers meet these requirements today. You will evaluate how these workloads are implemented on the Databricks Data Intelligence Platform and identify opportunities to improve the product experience.</p>
<p>You will act as a bridge between technical field teams and product and engineering. Insights from customer PoCs, benchmarks, and real-world implementations will directly inform product decisions. You will also help ensure that product improvements are clearly communicated back to the field.</p>
<p>The impact you will have:</p>
<ul>
<li>Identify and drive impactful product improvements in your domain of expertise</li>
<li>Define and run performance benchmarks (OLTP focus) or governance best practices and reference architectures (governance focus)</li>
<li>Shape and prioritize a meaningful product roadmap</li>
<li>Support go-to-market efforts and guide product adoption</li>
<li>For governance focus: define processes and mechanisms for how AI agents securely and compliantly access the Databricks Data Intelligence Platform</li>
</ul>
<p>What we look for:</p>
<ul>
<li>5+ years of experience with a strong, hands-on technical background</li>
<li>Strong empathy for customers across the full spectrum of Data Platform users</li>
<li>Deep domain expertise in one of the following:
<ul>
<li>Transactional databases (OLTP), cloud-native databases, or distributed systems</li>
<li>Data governance, data catalogs, lineage, and access management</li>
</ul>
</li>
<li>Experience evaluating and comparing technologies across dimensions such as performance, reliability, governance, and compliance</li>
<li>Strong Python and SQL skills</li>
<li>Experience using AI-assisted development tools</li>
<li>Experience with systems design and architecture</li>
<li>Proven ability to work effectively across product, engineering, and technical field teams</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Transactional databases, Cloud-native databases, Distributed systems, Data governance, Data catalogs, Lineage, Access management, Python, SQL, AI-assisted development tools, Systems design and architecture</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Databricks</Employername>
      <Employerlogo>https://logos.yubhub.co/databricks.com.png</Employerlogo>
      <Employerdescription>Databricks is a data and AI company that provides a unified platform for data, analytics, and AI. It was founded by the original creators of Lakehouse, Apache Spark, Delta Lake, and MLflow.</Employerdescription>
      <Employerwebsite>https://databricks.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/databricks/jobs/8394060002</Applyto>
      <Location>Amsterdam, Netherlands</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>0efa9467-f65</externalid>
      <Title>Senior Solutions Architect</Title>
      <Description><![CDATA[<p>We&#39;re seeking a Senior Solutions Architect to join our Post Sales team and partner with enterprise customers to design and implement complex, AI-enabled Airtable solutions.</p>
<p>In this high-impact role, you&#39;ll lead the architecture and delivery of enterprise implementations, translating business workflows into scalable, AI-powered systems that accelerate time-to-value and drive long-term adoption. You&#39;ll work closely with Engagement Managers, Account teams, and Product to shape the future of how organisations leverage Airtable AI to transform their operations.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Architect and deliver paid SOW enterprise Airtable implementations for strategic customers</li>
<li>Lead scoping and design of Airtable AI solutions during professional services engagements</li>
<li>Develop repeatable AI solution patterns, including AI workflows, automations, and structured data pipelines</li>
<li>Establish best practices for data modelling, governance, and AI-enabled workflow design</li>
<li>Partner with Engagement Managers to scope services engagements and define implementation plans</li>
<li>Produce solution architecture diagrams, data models, and workflow documentation for enterprise customers</li>
<li>Drive adoption of Airtable AI capabilities across multiple enterprise implementations</li>
<li>Provide technical guidance and troubleshoot architectural challenges during implementations</li>
<li>Collaborate with Product and internal teams to relay AI feature feedback and influence the roadmap</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5+ years of solution architecture, solution engineering, or technical consulting experience for enterprise SaaS platforms</li>
<li>Strong understanding of data modelling, database design, and governance</li>
<li>Experience designing workflow automation and operational systems</li>
<li>Familiarity with AI-enabled SaaS capabilities (LLMs, AI automation, AI-assisted workflows, or AI copilots)</li>
<li>Ability to translate complex business processes into scalable technical architectures</li>
<li>Strong stakeholder communication skills across technical and executive audiences</li>
<li>Experience scoping and delivering enterprise implementations</li>
<li>Proficiency in process mapping and architecture documentation tools (e.g., Lucidchart, Visio)</li>
<li>Experience designing AI-driven workflow solutions (LLMs, prompt design, AI agents, AI automation)</li>
<li>Experience implementing platforms like Airtable, Notion, ServiceNow, Salesforce, Workato, or similar workflow/data systems</li>
<li>Familiarity with APIs, integrations, and automation frameworks</li>
<li>Experience with enterprise data governance and security models</li>
<li>Experience in Professional Services or consulting organisations</li>
<li>Technical familiarity with scripting or low-code/no-code platforms</li>
<li>Experience developing repeatable architecture patterns or implementation frameworks</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Solution Architecture, Data Modelling, Database Design, Governance, AI-Enabled SaaS Capabilities, LLMs, AI Automation, AI-Assisted Workflows, AI Copilots, Process Mapping, Architecture Documentation Tools, APIs, Integrations, Automation Frameworks, Enterprise Data Governance, Security Models, Professional Services, Scripting, Low-Code/No-Code Platforms</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Airtable</Employername>
      <Employerlogo>https://logos.yubhub.co/airtable.com.png</Employerlogo>
      <Employerdescription>Airtable is a no-code app platform that empowers people to accelerate their most critical business processes. More than 500,000 organisations, including 80% of the Fortune 100, rely on it to transform how work gets done.</Employerdescription>
      <Employerwebsite>https://airtable.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/airtable/jobs/8487502002</Applyto>
      <Location>San Francisco, CA; New York, NY; Remote - US</Location>
      <Country></Country>
      <Postedate>2026-04-18</Postedate>
    </job>
    <job>
      <externalid>e202ae88-c05</externalid>
      <Title>Supply Chain Planner II</Title>
      <Description><![CDATA[<p>We are looking for a Supply Chain Planner II to manage and optimize the flow of materials, components, and information for our autonomous surface vessels. This individual will play a critical role in forecasting demand, managing inventory levels, and supporting production, NPI, and/or non-steady state operations.</p>
<p>The successful candidate will partner closely with engineering, procurement, program management, and suppliers to ensure material readiness while balancing cost, risk, and schedule in dynamic environments. As a key member of our team, you will be responsible for maintaining our supply chain&#39;s integrity, meeting deadlines, and ensuring compliance with DoD regulations and standards.</p>
<p>Responsibilities:</p>
<ul>
<li>Own and improve end-to-end tactical supply plans to support NPI builds, production builds, and/or off-nominal builds.</li>
<li>Translate engineering demand, build schedules, and program milestones into executable material requirements within ERP/MRP systems, ensuring system-driven visibility and control.</li>
<li>Optimize factory planning parameters (lead times, lot sizes, safety stock, planning fences, reorder strategies) to improve material flow, reduce shortages, and support rapid design iteration.</li>
<li>Drive BOM maturity and change management execution by assessing material impact of ECOs, managing cut-ins/cut-outs, and minimizing excess and obsolete exposure.</li>
<li>Identify and mitigate material readiness risks related to design instability, supplier capacity constraints, long-lead components, and quality performance through structured risk registers and recovery plans.</li>
<li>Ensure accurate master data governance (BOMs, routings, planning attributes, supplier parameters) to enable reliable MRP outputs and production readiness signals.</li>
<li>Partner cross-functionally with Engineering, Manufacturing, Quality, and Procurement to align supply execution with build plans, capacity constraints, and program priorities.</li>
<li>Establish and monitor material readiness metrics (shortage tracking, kit completeness, on-time-to-build, supplier OTD, inventory health) to drive daily execution and escalation.</li>
<li>Lead shortage resolution and recovery management, including root cause analysis, systemic corrective actions, and supplier coordination to protect build schedules.</li>
<li>Develop and implement supply planning processes, workflows, and system utilization to improve repeatability, scalability, and factory throughput.</li>
<li>Analyze supply chain performance data to identify systemic bottlenecks and implement process improvements that balance speed, cost, inventory exposure, and risk.</li>
<li>Ensure compliance with DoD program requirements, audit readiness, and traceability standards through disciplined documentation and system controls</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>Bachelor&#39;s degree in Supply Chain Management, Business, Engineering, or a related field (preferred).</li>
<li>5+ years of experience in supply chain planning or management, ideally in a manufacturing environment and 1+ years of experience with NPI/Change Management operations (experience with defense or advanced technology products is a plus).</li>
<li>Strong knowledge of supply chain management software (e.g., SAP, Oracle, or similar) and Microsoft Office Suite (Excel, Word, etc.).</li>
<li>Experience with supply planning, with emphasis on new product introduction</li>
<li>Proven ability to work in a fast-paced and agile environment and manage multiple projects simultaneously.</li>
<li>Strong analytical and problem-solving skills, with the ability to identify and resolve supply chain issues effectively.</li>
<li>Knowledge of DoD regulations, standards, and compliance requirements for defense-related manufacturing (preferred).</li>
<li>Excellent communication skills, both written and verbal, with the ability to collaborate effectively with internal teams and external suppliers.</li>
<li>Strong attention to detail, organizational skills, and the ability to meet deadlines.</li>
<li>Ability to work independently and as part of a team in a dynamic startup environment.</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Medical Insurance: Comprehensive health insurance plans covering a range of services</li>
<li>Saronic pays 100% of the premium for employees and 80% for dependents</li>
<li>Dental and Vision Insurance: Coverage for routine dental check-ups, orthodontics, and vision care</li>
<li>Saronic pays 100% of the premium under the basic plan for employees and 80% for dependents</li>
<li>Time Off: Generous PTO and Holidays</li>
<li>Parental Leave: Paid maternity and paternity leave to support new parents</li>
<li>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses</li>
<li>Retirement Plan: 401(k) plan with company match</li>
<li>Stock Options: Equity options to give employees a stake in the company’s success</li>
<li>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage</li>
<li>Pet Insurance: Discounted pet insurance options including 24/7 Telehealth helpline</li>
<li>Additional Perks: Free lunch benefit and unlimited free drinks and snacks in the office</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>supply chain management, ERP/MRP systems, factory planning, BOM maturity, change management, risk management, master data governance, cross-functional collaboration, analytical skills, problem-solving skills, communication skills, attention to detail, organizational skills</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Saronic Technologies</Employername>
      <Employerlogo>https://logos.yubhub.co/saronictechnologies.com.png</Employerlogo>
      <Employerdescription>Saronic Technologies develops state-of-the-art solutions for autonomous and intelligent platforms in the maritime industry.</Employerdescription>
      <Employerwebsite>https://www.saronictechnologies.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/saronic/12d14f00-f185-46cf-9ede-feb6fa34eeee</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>2b727ac5-e2a</externalid>
      <Title>IT Engineer</Title>
      <Description><![CDATA[<p>We are seeking a versatile and growth-oriented IT Engineer to join our growing Enterprise Technology team. This role will administer and optimize our enterprise SaaS ecosystem while ensuring secure, scalable, and well-governed system operations across the organization.</p>
<p>The position blends hands-on systems administration, workflow configuration, lightweight tool development, and cross-functional partnership to support evolving business needs.</p>
<p>This role is not limited to a fixed set of platforms. We are looking for someone who is intellectually curious, adaptable, and eager to learn new technologies as our enterprise landscape evolves.</p>
<p>Responsibilities:</p>
<ul>
<li>Administer and own enterprise SaaS applications including DocuSign, Confluence, Ashby (Lever), Ramp, and related platforms.</li>
<li>Manage user lifecycle operations and role-based access controls across enterprise systems such as Salesforce, NetSuite, and Workday.</li>
<li>Configure workflows, permissions, automations, and integrations to support evolving business needs.</li>
<li>Manage and triage the ServiceNow queue, fulfilling requests and routing development tickets appropriately.</li>
<li>Build and deploy lightweight internal tools and workflow solutions, leveraging coding agents such as Claude and Codex where appropriate.</li>
<li>Partner cross-functionally to translate operational requirements into scalable system solutions.</li>
<li>Maintain SharePoint and Confluence structures, governance standards, and information architecture across Enterprise Technology.</li>
<li>Ensure security, data integrity, documentation standards, change management discipline, and compliance readiness across platforms.</li>
<li>Continuously evaluate and learn new enterprise platforms, tools, and technologies as business needs expand.</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>5+ years of experience in IT engineering, enterprise applications administration, or systems engineering.</li>
<li>Hands-on experience administering SaaS platforms such as DocuSign, Confluence, Ashby (Lever), Ramp, Salesforce, NetSuite, Workday, or similar systems.</li>
<li>Strong ability and demonstrated willingness to quickly learn and administer new enterprise platforms not previously supported.</li>
<li>Experience managing ServiceNow queues, ticket triage, and cross-functional coordination.</li>
<li>Strong understanding of user lifecycle management and role-based access control (RBAC).</li>
<li>Experience building lightweight internal tools, automations, or workflow-based applications.</li>
<li>Experience working with APIs, system integrations, SSO (Okta/Azure AD), and identity management platforms.</li>
<li>Familiarity with secure integration design and data governance principles.</li>
<li>Experience operating in compliance-focused or audit-sensitive environments preferred.</li>
<li>Bachelor’s degree in Computer Science, Information Systems, or related discipline, or equivalent work experience.</li>
</ul>
<p>Keys to Success in This Role:</p>
<ul>
<li>Strong collaborator and cross-functional partner.</li>
<li>Comfortable operating in structured, compliance-oriented environments.</li>
<li>Demonstrated intellectual curiosity and eagerness to take ownership of new systems and technologies.</li>
<li>Ability to quickly understand new technologies and evolving enterprise platforms.</li>
<li>Strong analytical skills with the ability to address complex business and technical issues.</li>
<li>Ownership mindset with disciplined execution and attention to detail.</li>
<li>Ability to work effectively in high-growth, rapidly scaling enterprise environments.</li>
</ul>
<p>Physical Demands:</p>
<ul>
<li>Prolonged periods of sitting at a desk and working on a computer.</li>
<li>Occasional standing and walking within the office.</li>
<li>Manual dexterity to operate a computer keyboard, mouse, and other office equipment.</li>
<li>Visual acuity to read screens, documents, and reports.</li>
<li>Occasional reaching, bending, or stooping to access file drawers, cabinets, or office supplies.</li>
<li>Lifting and carrying items up to 20 pounds occasionally (e.g., office supplies, packages).</li>
</ul>
<p>Additional Information:</p>
<p>Benefits:</p>
<ul>
<li>Medical Insurance: Comprehensive health insurance plans covering a range of services (Saronic pays 100% of the premium for employees and 80% for dependents)</li>
<li>Dental and Vision Insurance: Coverage for routine dental check-ups, orthodontics, and vision care (Saronic pays 100% of the premium under the basic plan for employees and 80% for dependents)</li>
<li>Time Off: Generous PTO and Holidays</li>
<li>Parental Leave: Paid maternity and paternity leave to support new parents</li>
<li>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses</li>
<li>Retirement Plan: 401(k) plan with company match</li>
<li>Stock Options: Equity options to give employees a stake in the company’s success</li>
<li>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage</li>
<li>Pet Insurance: Discounted pet insurance options including 24/7 Telehealth helpline</li>
<li>Additional Perks: Free lunch benefit and unlimited free drinks and snacks in the office</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SaaS platform administration, User lifecycle management, Role-based access control, Workflow configuration, APIs, System integrations, SSO, Identity management, Secure integration design, Data governance</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Saronic Technologies</Employername>
      <Employerlogo>https://logos.yubhub.co/saronictechnologies.com.png</Employerlogo>
      <Employerdescription>Saronic Technologies develops state-of-the-art solutions that enhance maritime operations through autonomous and intelligent platforms.</Employerdescription>
      <Employerwebsite>https://www.saronictechnologies.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/saronic/1c8313cc-e98d-4061-b9e0-7a7ab50e160d</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>8c1f2ae9-db9</externalid>
      <Title>Configuration Manager - Product</Title>
      <Description><![CDATA[<p>We are hiring a Configuration Manager focused on core product definition, audit compliance, and part traceability across programs. This individual is responsible for maintaining the structured relationships between parts, subassemblies, and top-level platforms, ensuring version-controlled consistency across our engineering, manufacturing, and sustainment environments.</p>
<p>This role acts as the central authority for how items (designed or procured) are reused, revised, and controlled across multiple programs. The Configuration Manager – Product will work closely with engineering, supply chain, manufacturing, and program leadership to ensure that product configurations are accurate, auditable, and aligned to contract and regulatory expectations.</p>
<p>Responsibilities:</p>
<ul>
<li>Maintain structured part and assembly relationships across all platforms and variants</li>
<li>Own the item master definition for Saronic’s parts, subassemblies, and COTS items across programs</li>
<li>Ensure all product data is version-controlled, baselined, and auditable at the component, system, and platform level</li>
<li>Govern how items and designs are reused or revised across multiple configurations, with clear traceability</li>
<li>Partner with manufacturing engineering to validate MBOM alignment to the product configuration baseline</li>
<li>Review and oversee configuration changes, ensuring they are properly documented and traceable in accordance with EIA-649, ASME Y14.5, ISO 9001, and internal practices</li>
<li>Coordinate configuration planning and release schedules across programs to avoid version conflicts and unapproved divergence</li>
<li>Support audit readiness by maintaining a complete digital record of product configurations and associated approvals</li>
<li>Collaborate with the PLM Systems Architect and CAD Administrator to ensure systems and data structures support configuration governance</li>
</ul>
<p>Qualifications:</p>
<ul>
<li>5+ years of experience in configuration management, product data governance, or related roles</li>
<li>Deep familiarity with product structure: items, subassemblies, configurations, alternates, and interchangeability rules</li>
<li>Working knowledge of engineering change control (ECOs), drawing versioning, and part release processes</li>
<li>Familiarity with configuration and documentation standards such as EIA-649, ASME Y14.5, ISO 9001, or MIL-STD equivalents</li>
<li>Experience working with PLM/PDM systems (Teamcenter preferred) to manage product definitions and revision histories</li>
<li>Comfortable navigating engineering BOMs, manufacturing BOMs, and how they diverge across programs</li>
<li>Ability to collaborate and enforce data discipline across engineering, operations, supply chain, and program management</li>
<li>Strong documentation, audit preparation, and version traceability skills</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Medical Insurance: Comprehensive health insurance plans covering a range of services</li>
<li>Dental and Vision Insurance: Coverage for routine dental check-ups, orthodontics, and vision care</li>
<li>Saronic pays 100% of the premium for employees and 80% for dependents</li>
<li>Time Off: Generous PTO and Holidays</li>
<li>Parental Leave: Paid maternity and paternity leave to support new parents</li>
<li>Competitive Salary: Industry-standard salaries with opportunities for performance-based bonuses</li>
<li>Retirement Plan: 401(k) plan</li>
<li>Stock Options: Equity options to give employees a stake in the company’s success</li>
<li>Life and Disability Insurance: Basic life insurance and short- and long-term disability coverage</li>
<li>Additional Perks: Free lunch benefit and unlimited free drinks and snacks in the office</li>
</ul>
<p>Physical Demands:</p>
<ul>
<li>Prolonged periods of sitting at a desk and working on a computer</li>
<li>Occasional standing and walking within the office</li>
<li>Manual dexterity to operate a computer keyboard, mouse, and other office equipment</li>
<li>Visual acuity to read screens, documents, and reports</li>
<li>Occasional reaching, bending, or stooping to access file drawers, cabinets, or office supplies</li>
<li>Lifting and carrying items up to 20 pounds occasionally (e.g., office supplies, packages)</li>
</ul>
<p>Additional Information:
This role requires access to export-controlled information or items that require “U.S. Person” status. As defined by U.S. law, individuals who are any one of the following are considered to be a “U.S. Person”: (1) U.S. citizens, (2) legal permanent residents (a.k.a. green card holders), and (3) certain protected classes of asylees and refugees, as defined in 8 U.S.C. 1324b(a)(3)</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>configuration management, product data governance, PLM/PDM systems, engineering change control, drawing versioning, part release processes, EIA-649, ASME Y14.5, ISO 9001, MIL-STD equivalents</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Saronic Technologies</Employername>
      <Employerlogo>https://logos.yubhub.co/saronictechnologies.com.png</Employerlogo>
      <Employerdescription>Saronic Technologies develops advanced autonomous surface vessels for the Department of Defense.</Employerdescription>
      <Employerwebsite>https://www.saronictechnologies.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/saronic/1214fed2-43f3-437b-9c4b-49c01acc42e4</Applyto>
      <Location>San Diego</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>4d67ce9b-156</externalid>
      <Title>Senior Product Manager, Data Enablement</Title>
      <Description><![CDATA[<p>Omada Health is on a mission to inspire and engage people in lifelong health, one step at a time.</p>
<p>As the Senior Product Manager for Data Enablement, you will be the enterprise-level owner of analytics capabilities, shared definitions, and enablement tooling that ensures Omada can answer critical business questions faster and more efficiently. This role will work across Product, Clinical, Commercial, Operations, and Finance to create the structures for how analytics are governed, accessed, and used.</p>
<p>The primary mandate of this role is to own the product strategy and roadmap for Omada’s enterprise analytics, including governance, shared definitions, and enablement tooling. You are accountable for ensuring that teams across Product, Clinical, Commercial, Operations, and Finance can ask and answer critical business questions quickly, consistently, and with high trust in the underlying data. This means defining how key concepts are governed, how analytics tools are shaped and prioritized, and how cross-functional stakeholders engage with the data platform.</p>
<p>You will act as the central product owner for Omada’s analytics ecosystem, setting clear standards and driving adoption across the company. In practice, this includes:</p>
<ul>
<li>Owning the roadmap for core data products and analytics tooling that power dashboards, reporting, and self-service analysis</li>
<li>Driving the culture of data across our engineering, product, and other business teams at Omada</li>
<li>Partnering with data, engineering, analytics, and business leaders to ensure analytics investments are aligned with company strategy and deliver measurable impact</li>
<li>Collaborating with other product managers to incorporate data science concepts into their features &amp; experiences</li>
</ul>
<p><strong>About you:</strong></p>
<ul>
<li>10+ years of relevant product management or data product experience (8+ years with Master&#39;s degree, 5+ years with PhD)</li>
<li>7+ years of experience with data technology and data management, including familiarity with:
<ul>
<li>Modern data warehousing technologies (Redshift, Snowflake, BigQuery, or similar)</li>
<li>Cloud technologies, preferably AWS</li>
<li>Business intelligence and analytics tools (Tableau, Amplitude, Looker, or similar)</li>
<li>Data governance frameworks and data quality management</li>
<li>SQL and the ability to write queries and QA datasets</li>
</ul>
</li>
<li>Subject matter expertise in enterprise analytics governance, data product management, or analytics enablement</li>
<li>Experience establishing governance processes &amp; associated tool implementation for data definitions, metrics, or analytics across multiple business functions</li>
<li>Proven ability to partner with Data Teams for technical evaluation of self-service tools and/or the statistical model training lifecycle</li>
<li>Proven ability to influence senior stakeholders and reconcile opposing viewpoints to drive consensus</li>
<li>Track record of working on problems that are not clearly defined, using advanced knowledge and conceptual thinking to develop solutions</li>
<li>Experience with healthcare data standards and regulatory requirements is strongly preferred</li>
<li>Bachelor&#39;s degree or equivalent relevant experience (advanced degree preferred)</li>
</ul>
<p><strong>Essential Competencies:</strong></p>
<ul>
<li>Strategic &amp; Analytical:
<ul>
<li>Able to think beyond constraints to imagine new ways to use data to impact members and customers</li>
<li>Excellent organizational and analytical skills with strong technical understanding</li>
<li>Comfortable analyzing data and leading research/discovery efforts to understand problem spaces, identify opportunities, and propose solutions</li>
<li>Proven ability to set clear, achievable objectives and work plans for complex, cross-functional initiatives</li>
</ul>
</li>
<li>Communication &amp; Influence:
<ul>
<li>Exceptional written and verbal communication skills</li>
<li>Ability to create formal networks with key decision makers and influence stakeholders across multiple functions</li>
<li>Skilled at adapting communication style for different audiences, from technical teams to senior executives</li>
<li>Experience negotiating matters of significance with senior management and/or major customers</li>
</ul>
</li>
<li>Collaboration &amp; Problem Solving:
<ul>
<li>Comfortable with ambiguity and adept at solving problems with limited resources and information</li>
<li>Able to drive alignment among diverse stakeholders with competing priorities</li>
<li>Great listener who asks insightful questions and doesn&#39;t accept &quot;It&#39;s how we&#39;ve always done it&quot; as an answer</li>
<li>Proven ability to reconcile various and opposing stakeholder views to drive results</li>
</ul>
</li>
<li>Execution &amp; Delivery:
<ul>
<li>History of consistently delivering results against target metrics and commitments</li>
<li>Experience managing product backlogs, writing clear product specifications, and driving execution</li>
<li>Ability to exercise autonomous decision making with limited input on methods and techniques to obtain results</li>
</ul>
</li>
</ul>
<p><strong>Your impact:</strong></p>
<p>In this role, you will directly contribute to transforming how Omada uses data to serve members, customers, and internal stakeholders. Your work will:</p>
<ul>
<li>Eliminate fragmentation: Establish single sources of truth for critical business and clinical metrics, reducing confusion and accelerating decision-making</li>
<li>Increase trust: Build confidence in analytics across the organization through governed definitions and transparent processes</li>
<li>Accelerate insights: Enable analysts and business users to answer questions faster through improved tooling, AI-assisted analytics, and reusable data products</li>
<li>Scale effectively: Create governance frameworks and enablement systems that scale as Omada&#39;s product portfolio and customer base expand</li>
<li>Drive outcomes: Ensure that data and analytics capabilities directly support better health outcomes for members and business results for customers</li>
</ul>
<p><strong>Benefits:</strong></p>
<ul>
<li>Competitive salary with generous annual cash bonus</li>
<li>Equity grants</li>
<li>Remote-first work from home culture</li>
<li>Flexible Time Off to help you rest, recharge, and connect with loved ones</li>
<li>Generous parental leave</li>
<li>Health, dental, and vision insurance (and above-market employer contributions)</li>
<li>401k retirement savings plan</li>
<li>Lifestyle Spending Account (LSA)</li>
<li>Mental Health Support Solutions</li>
<li>...and more!</li>
</ul>
<p>We take a village to change healthcare. As we build together toward our mission, we strive to embody the following values in our day-to-day work. We hope these hold meaning for you as well as you consider Omada!</p>
<ul>
<li>Cultivate Trust. We listen closely and we operate with kindness. We provide respectful and candid feedback to each other.</li>
<li>Seek Context. We ask to understand and we build connections. We do our research up front to move faster down the road.</li>
<li>Act Boldly. We innovate daily to solve problems, improve processes, and find new opportunities for our members and customers.</li>
<li>Deliver Results. We reward impact above output.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data technology, data management, modern data warehousing technologies, cloud technologies, business intelligence and analytics tools, data governance frameworks, data quality management, SQL, subject matter expertise in enterprise analytics governance, data product management, analytics enablement</Skills>
      <Category>Engineering</Category>
      <Industry>Healthcare</Industry>
      <Employername>Omada Health</Employername>
      <Employerlogo>https://logos.yubhub.co/omadahealth.com.png</Employerlogo>
      <Employerdescription>Omada Health is a healthcare technology company that provides virtual-first care for pre-diabetes, diabetes, hypertension, and musculoskeletal conditions.</Employerdescription>
      <Employerwebsite>https://www.omadahealth.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/omadahealth/jobs/7759052</Applyto>
      <Location>Remote, USA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>7b750523-8ff</externalid>
      <Title>Staff Software Engineer, Data Engineering</Title>
      <Description><![CDATA[<p>We are seeking a Staff Software Engineer to lead the technical strategy and implementation of our enterprise data architecture, governance foundations, and analytics enablement tooling.</p>
<p>In this role, you will be the primary engineering counterpart to the Senior Product Manager for Data Enablement &amp; Governance, jointly shaping the roadmap for enterprise analytics, shared definitions, and the tools that help Omada answer questions faster and more reliably.</p>
<p>You will design and evolve core data products, define patterns and standards used across the company, and drive the technical execution of initiatives that ensure our metrics, reports, and data products are scalable, governed, and trustworthy.</p>
<p>This is a high-impact, cross-functional Staff role working across Data Engineering, Data Science, Analytics, Product, IT, and business leaders.</p>
<p><strong>Key Responsibilities:</strong></p>
<p><strong>Enterprise Data Architecture</strong></p>
<ul>
<li>Own the vision and technical roadmap for Omada&#39;s enterprise data architecture, spanning ingestion, storage, modeling, and serving layers for analytics and applied statistics use cases.</li>
<li>Design, implement, and evolve scalable, secure, and cost-efficient data solutions (data lakes, warehouses, marts, semantic layers) that support governed, cross-functional analytics and self-service.</li>
<li>Define and socialize architectural patterns, data contracts, and integration standards used by data and product teams across the organization.</li>
<li>Anticipate future needs (e.g., new product lines, new modalities, AI/ML workloads) and drive proactive architectural changes rather than reacting to incidents or point-in-time requests.</li>
</ul>
<p><strong>Data Modeling, Quality, and Governance Foundations</strong></p>
<ul>
<li>Lead the design of logical and physical data models to support enterprise metrics, dashboards, and ad hoc analytics, with a focus on reusability and clear ownership.</li>
<li>Implement robust data quality, validation, and monitoring frameworks that underpin trusted “single source of truth” definitions for core concepts (e.g., active member, MAU, GLP-1 member).</li>
<li>Partner with the Senior Product Manager, Data Enablement &amp; Governance to translate governance decisions (definitions, ownership, change-management processes) into concrete technical implementations in the data platform.</li>
<li>Set standards and review mechanisms to ensure new pipelines, marts, and reports align with enterprise definitions and governance policies.</li>
<li>Continuously improve performance, scalability, and cost-efficiency of data workflows and storage; lead deep dives and remediation for complex production issues.</li>
</ul>
<p><strong>Enterprise Data Products Lifecycle</strong></p>
<ul>
<li>In close partnership with the Senior PM, define and deliver core, reusable data products (e.g., engagement, clinical, financial, client, care delivery datasets) that power dashboards, reporting, and self-service analytics.</li>
<li>Co-architect and implement technical foundations for AI-assisted analytics tools, governed semantic layers, and reporting applications that make analysts and business users more efficient.</li>
<li>Partner with Product and Engineering teams owning tools like Amplitude, Tableau, and internal reporting tools to ensure consistent instrumentation, mapping to enterprise definitions, and scalable access patterns.</li>
<li>Translate business and product requirements into resilient schemas, data services, and interfaces that are usable, maintainable, and auditable.</li>
<li>Ensure production data delivery meets defined SLAs and supports downstream BI, reporting apps, and applied statistics workloads.</li>
<li>Play a key role in cross-functional forums (e.g., Data Governance Committee, analytics communities) as the technical voice for feasibility, risk, and long-term platform health.</li>
</ul>
<p><strong>Technical Leadership, Mentorship, and Culture</strong></p>
<ul>
<li>Lead large, multi-team technical initiatives, from design to implementation and rollout, setting a high bar for design docs, reviews, and execution quality.</li>
<li>Mentor senior and mid-level engineers, elevating the team’s skills in data modeling, pipeline design, governance, and platform thinking.</li>
<li>Help shape playbooks for how product squads and spokes engage with central data teams on new metrics, data products, and applied stats projects.</li>
<li>Partner closely with Analytics, Data Science, Product, and business leaders to ensure data architecture and governance decisions are aligned with company OKRs and measurable business value.</li>
<li>Proactively identify complexity, duplication, and fragility in existing systems; drive simplification and standardization with sustainable solutions.</li>
<li>Model Omada’s values in day-to-day work, fostering a culture of trust, context-seeking, bold thinking, and high-impact delivery.</li>
</ul>
<p><strong>About You:</strong></p>
<ul>
<li>8+ years of experience building, maintaining, and orchestrating scalable data platforms and high-quality production pipelines, including significant experience in analytics or warehousing environments.</li>
<li>Demonstrated Staff-level impact: leading cross-team technical initiatives, making architectural decisions that shaped a multi-year roadmap, and influencing stakeholders beyond your immediate team.</li>
<li>Deep experience with cloud data ecosystems (e.g., AWS) and modern data warehouses (e.g., Redshift, Snowflake, BigQuery), including MPP query optimization.</li>
<li>Strong background in data modeling for OLTP and OLAP, and designing reusable data products for BI, reporting, and advanced analytics.</li>
<li>Hands-on experience implementing data quality, observability, and governance frameworks, ideally in a regulated or PHI/PII-sensitive environment.</li>
<li>Experience partnering with Product Management and Analytics to define and deliver platform capabilities, not just point solutions.</li>
</ul>
<p><strong>Technical Skills:</strong></p>
<ul>
<li>Strong proficiency in SQL (analytical and performance-tuned) and experience with relational and MPP databases.</li>
<li>Proficiency in at least one modern programming language used in data engineering (e.g., Python, Java, Scala) and comfort applying software engineering best practices (testing, CI/CD, code review).</li>
<li>Experience with workflow orchestration and data integration tools (e.g., Airflow) and event-driven or streaming patterns where appropriate.</li>
<li>Familiarity with BI and analytics tools (e.g., Tableau, Amplitude, or similar) and how they integrate with governed data layers.</li>
<li>Experience with data governance concepts (ownership, lineage, definitions, access controls) and their technical implementation in a modern data stack.</li>
<li>Familiarity with AI tools for development.</li>
</ul>
<p><strong>Communication &amp; Working Style:</strong></p>
<ul>
<li>Excellent communication and collaboration skills, with the ability to convey complex technical concepts to non-technical stakeholders.</li>
<li>Highly self-directed and comfortable operating in ambiguous, cross-functional problem spaces, creating clarity and direction where none exists.</li>
<li>Strong sense of ownership and bias for impact; you care about outcomes for members, customers, and internal users, not just elegant systems.</li>
</ul>
<p><strong>Benefits:</strong></p>
<ul>
<li>Competitive salary with generous annual cash bonus</li>
<li>Equity grants</li>
<li>Remote first work from home culture</li>
<li>Flexible Time Off to help you recharge</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, Cloud data ecosystems, Modern data warehouses, MPP query optimization, Data modeling, Data quality, Data governance, Workflow orchestration, Data integration, Event-driven or streaming patterns, BI and analytics tools, AI tools for development</Skills>
      <Category>Engineering</Category>
      <Industry>Healthcare</Industry>
      <Employername>Omada Health</Employername>
      <Employerlogo>https://logos.yubhub.co/omadahealth.com.png</Employerlogo>
      <Employerdescription>Omada Health is a healthcare technology company that provides digital therapeutics for chronic disease management.</Employerdescription>
      <Employerwebsite>https://www.omadahealth.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/omadahealth/jobs/7753330</Applyto>
      <Location>Remote, USA</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>e4305742-104</externalid>
      <Title>Director of Data</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Director of Data who can help us use the approaches that led to the creation of relational databases to drive Mercury&#39;s success.</p>
<p>In this role, you&#39;ll lead a full-stack data organization of 30+ data scientists and data engineers embedded across Mercury&#39;s business and product teams. You&#39;ll define how data is built, governed, and leveraged, shaping a culture where anyone can access trusted data, ask hard questions, and self-serve at scale. You&#39;ll architect the systems that help Mercury move faster, invest smarter, and make better decisions at every level.</p>
<p>Responsibilities:</p>
<ul>
<li>Build, lead, and develop a high-performing team of 30+ data scientists and data engineers embedded across GTM, Core Product, and Risk, ensuring Mercury&#39;s most important decisions are grounded in rigorous, trusted data.</li>
<li>Define and execute a forward-looking data strategy that embeds rigorous, data-driven decision-making into how Mercury sets company priorities, builds products, and manages risk.</li>
<li>Lead Mercury&#39;s evolution toward self-serve data at scale by shipping high-leverage data platforms that empower teams to move quickly from question to insight to action.</li>
<li>Elevate data reliability, governance, and security as core capabilities by shaping the standards and systems required to operate as a scaled financial institution.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>10+ years of experience, including 4+ years managing a team of data science or data engineering managers.</li>
<li>Track record of partnering with and influencing executives, equipping them with the tools, reporting, and advanced analytics needed to make decisions, proactively surfacing opportunities and risks they may not see.</li>
<li>Strong opinions on how data should inform decision-making, with the ability to translate that perspective into scalable practices and standards.</li>
<li>Deep expertise in data science, machine learning, statistics, financial calculations, and experimentation for commercial applications.</li>
<li>Experience developing high-performing data teams, setting a high talent bar while fostering an inspiring environment of ownership, growth, and impact.</li>
<li>Experience developing self-service analytics capabilities at scale, expanding access to reliable data and powering AI-enabled insights for business and product teams.</li>
<li>Ability to partner effectively with People, Legal, Compliance, and Privacy teams to ensure that we are enforcing best practices and compliant standards for data privacy and security.</li>
</ul>
<p>Total rewards package at Mercury includes base salary, equity (stock options/RSUs), and benefits.</p>
<p>Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry. New hire offers are made based on a candidate&#39;s experience, expertise, geographic location, and internal pay equity relative to peers.</p>
<p>Our target new hire base salary ranges for this role are:</p>
<ul>
<li>US employees: $320,000-$420,000</li>
<li>Canadian employees (any location): CAD $302,400-$396,900</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$320,000-$420,000 (US employees), CAD $302,400-$396,900 (Canadian employees)</Salaryrange>
      <Skills>data science, machine learning, statistics, financial calculations, experimentation, data governance, data security, data engineering, team management</Skills>
      <Category>Finance</Category>
      <Industry>Fintech</Industry>
      <Employername>Mercury</Employername>
      <Employerlogo>https://logos.yubhub.co/mercury.com.png</Employerlogo>
      <Employerdescription>Mercury is a financial technology company.</Employerdescription>
      <Employerwebsite>https://www.mercury.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mercury/jobs/5812226004</Applyto>
      <Location>San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>782f7e7f-e0e</externalid>
      <Title>Revenue Technology - Data Strategy &amp; Operations Lead</Title>
      <Description><![CDATA[<p>We&#39;re looking for a Data Strategy &amp; Operations leader to own the data foundations that power revenue execution. This role ensures that revenue data is reliable, interpretable, scalable, and usable as the business evolves and that teams can act on what they see with confidence.</p>
<p>In this role, you will report to the Head of Platforms &amp; Infrastructure and play a central role in shaping how Mercury models, governs, and operationalizes GTM data. You&#39;ll partner closely with Data Engineering, Data Science, Solution Architecture, and Platform Engineering, among other teams.</p>
<p>Some key responsibilities include:</p>
<ul>
<li>Owning the definition, structure, and reliability of data originating from revenue platforms (e.g., Salesforce, GTM tools, automation systems)</li>
<li>Serving as the primary decision owner for GTM-sourced tables and views used for revenue execution, forecasting inputs, lifecycle tracking, and signal-based workflows</li>
<li>Designing and evolving core GTM data models across Salesforce, ETL, and analytics layers</li>
<li>Partnering with Data Engineering to align GTM schemas with enterprise data models and define clear data contracts between source systems and downstream consumers</li>
<li>Partnering with Data Science / Analytics to ensure revenue data is interpretable, statistically sound, and reflects how the business actually operates</li>
<li>Owning clarity around data ownership boundaries, shared dependencies, and escalation paths when upstream or downstream changes impact revenue integrity</li>
<li>Defining and upholding data quality, freshness, consistency, and documentation standards for revenue platforms</li>
<li>Monitoring and improving pipeline reliability, performance, and scalability, proactively identifying fragile or redundant transformations</li>
<li>Identifying opportunities to automate manual or error-prone data workflows and reduce operational overhead</li>
<li>Acting as a data thought partner to Platforms &amp; Infrastructure, Revenue Operations, Analytics, and Security, advising on feasibility, tradeoffs, and sequencing for data-heavy initiatives</li>
</ul>
<p>You should have:</p>
<ul>
<li>7+ years of experience in data engineering or data systems roles within SaaS or technology companies</li>
<li>Deep experience designing and operating production data pipelines</li>
<li>Highly proficient in SQL and experienced in data modeling</li>
<li>Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery, Redshift)</li>
<li>Experience with ETL / ELT tooling (e.g., dbt, Airflow, Census, or similar)</li>
<li>Understanding of Salesforce data models and common GTM system architectures</li>
<li>Ability to translate business concepts into durable, well-structured data models</li>
<li>Clear communication skills with both technical and non-technical partners</li>
</ul>
<p>Preferred qualifications include:</p>
<ul>
<li>Experience supporting revenue, sales, or customer lifecycle data</li>
<li>Familiarity with event-based data platforms (e.g., Data Cloud or equivalents)</li>
<li>Experience working alongside platform engineering and security teams</li>
<li>Exposure to data governance, access controls, and compliance considerations</li>
<li>Experience mentoring or guiding other data practitioners</li>
</ul>
<p>The total rewards package at Mercury includes base salary, equity, and benefits. Our salary and equity ranges are highly competitive within the SaaS and fintech industry and are updated regularly using the most reliable compensation survey data for our industry.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange>$142,600 - $198,000</Salaryrange>
      <Skills>SQL, data modeling, modern data stacks, ETL/ELT tooling, Salesforce data models, GTM system architectures, event-based data platforms, data governance, access controls, compliance considerations, mentoring/guiding other data practitioners</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Mercury</Employername>
      <Employerlogo>https://logos.yubhub.co/mercury.com.png</Employerlogo>
      <Employerdescription>Mercury is a fintech company that provides banking services through Choice Financial Group and Column N.A.</Employerdescription>
      <Employerwebsite>https://www.mercury.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/mercury/jobs/5806201004</Applyto>
      <Location>San Francisco, CA, New York, NY, Portland, OR, or Remote within Canada or United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>6fb4a600-300</externalid>
      <Title>Senior People Systems and Data Analyst</Title>
      <Description><![CDATA[<p>Freenome is seeking a highly analytical and systems-savvy Senior People Systems and Data Analyst to elevate our People data capabilities and drive insight-led decision-making.</p>
<p>This role will own the development of dashboards, analytics, and reporting that provide visibility into workforce trends and inform hiring, retention, engagement, and organisational planning strategies.</p>
<p>While the role maintains ownership of People systems and data integrity, its primary focus is generating insights that enable better business decisions.</p>
<p>Responsibilities:</p>
<ul>
<li>Develop, analyse, and interpret workforce data to identify trends, risks, and opportunities across recruiting, retention, performance, and engagement.</li>
<li>Conduct deep-dive analyses (attrition trends, hiring funnel performance, compensation insights, engagement drivers).</li>
<li>Build statistical and trend analyses across the employee lifecycle to proactively surface insights that influence People and business decisions.</li>
<li>Deliver actionable quarterly reporting that translates complex datasets into clear, executive-ready insights and recommendations.</li>
<li>Design, build, and maintain scalable and automated dashboards.</li>
<li>Establish consistent KPI definitions and reporting standards.</li>
<li>Improve data visualisation practices to ensure clarity, usability, and impact.</li>
<li>Replace manual reporting processes with automated, real-time reporting solutions.</li>
<li>Ensure high levels of data accuracy, integrity, and trust.</li>
<li>Serve as primary People data and HRIS reporting expert.</li>
<li>Partner with IT and vendors to implement and manage integrations, evaluate new features, and identify opportunities to automate data workflows.</li>
<li>Manage data governance standards to ensure compliance and data privacy best practices.</li>
<li>Identify process improvements that enhance efficiency, data structure and accessibility.</li>
<li>Enable tier-zero employee and manager self-service for policies, processes, and data through systems and chatbots, reducing reliance on People team members and ticket systems.</li>
<li>Champion system education to drive employee and manager self-service adoption and data and systems utilisation.</li>
<li>Drive AI initiatives such as deploying LLMs for job descriptions and ML to forecast attrition, ensuring tools are trained and continuously refined to be fit-for-purpose and remain within ethical and legal guardrails.</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Bachelor&#39;s degree in Statistics, Data Science, Business Analytics, HR, Economics, or related field.</li>
<li>5–7 years of experience in People Analytics, Workforce Analytics, HRIS Analytics, or related analytical roles.</li>
<li>Prior experience modifying, improving, and/or implementing People systems (Greenhouse and Paylocity a bonus).</li>
<li>Advanced proficiency in data visualisation tools (Tableau, Power BI, Looker, or similar).</li>
<li>Strong experience building executive dashboards from the ground up.</li>
<li>Knowledge of data, statistical, and predictive analysis.</li>
<li>Advanced Excel/Google Sheets skills.</li>
<li>Experience working with large, complex datasets.</li>
<li>Demonstrated ability to translate data into clear business insights and recommendations.</li>
<li>Strong understanding of core People processes (recruiting, compensation, performance, engagement, workforce planning).</li>
<li>Excellent verbal and written communication skills.</li>
<li>Strong analytical and problem-solving skills.</li>
<li>Passion for driving continuous improvement: simplifying workflows, implementing automation, and driving self-service.</li>
<li>Demonstrated interest in leveraging AI to enhance systems and analytics.</li>
</ul>
<p>Nice to haves:</p>
<ul>
<li>Experience in a high-growth biotech or startup environment, with exposure to IPO preparation, public company reporting requirements, or SOX-compliant processes.</li>
<li>Familiarity with Greenhouse, Paylocity, or other HRIS/ATS platforms.</li>
<li>Exposure to predictive analytics or basic modelling techniques.</li>
<li>SHRM-CP or SHRM-SCP certification.</li>
</ul>
<p>Benefits and additional information:</p>
<p>The US target range of our base salary for new hires is $120,275 - $170,100. You will also be eligible to receive equity, cash bonuses, and a full range of medical, financial, and other benefits depending on the position offered.</p>
<p>Please note that individual total compensation for this position will be determined at the Company’s sole discretion and may vary based on several factors, including but not limited to, location, skill level, years and depth of relevant experience, and education.</p>
<p>We invite you to check out our career page @ freenome.com/job-openings/ for additional company information.</p>
<p>Freenome is proud to be an equal-opportunity employer, and we value diversity. Freenome does not discriminate on the basis of race, colour, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, veteran status, or any other status protected under federal, state, or local law.</p>
<p>Applicants have rights under Federal Employment Laws.</p>
<p>Family &amp; Medical Leave Act (FMLA)</p>
<p>Equal Employment Opportunity (EEO)</p>
<p>Employee Polygraph Protection Act (EPPA)</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$120,275 - $170,100</Salaryrange>
      <Skills>data visualisation, statistics, data science, business analytics, HR, economics, People analytics, workforce analytics, HRIS analytics, Greenhouse, Paylocity, Tableau, Power BI, Looker, Excel, Google Sheets, predictive analysis, data governance, AI, LLMs, ML</Skills>
      <Category>HR</Category>
      <Industry>biotech</Industry>
      <Employername>Freenome</Employername>
      <Employerlogo>https://logos.yubhub.co/freenome.com.png</Employerlogo>
      <Employerdescription>Freenome is a biotech company that develops software for cancer detection.</Employerdescription>
      <Employerwebsite>https://freenome.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/freenome/jobs/8460884002</Applyto>
      <Location>Brisbane, California</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>1c431665-20b</externalid>
      <Title>Data Governance and Management Lead</Title>
      <Description><![CDATA[<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto. We are seeking a Data Governance &amp; Management Lead within the Global Analytics team to help develop and implement data controls, data quality standards, and governance practices across the platform.</p>
<p>This role supports data integrity, metadata, and access controls to help ensure data is accurate, consistent, and fit for purpose. This is a hands-on role that requires strong technical fluency, structured problem-solving, and the ability to translate governance requirements into practical implementations within data systems.</p>
<p><strong>Technical Skills:</strong></p>
<ul>
<li>Working knowledge of data governance, data management, and data quality frameworks</li>
<li>Experience supporting the implementation of data controls within data pipelines and reporting systems</li>
<li>Advanced proficiency in SQL, Python, or other data query and analysis tools</li>
<li>Proficiency with business intelligence and data visualization tools such as Looker, Power BI, or Tableau</li>
<li>Experience with database design, including understanding complex data schemas and data extraction</li>
<li>Familiarity with data lineage, metadata management, and data modeling concepts</li>
<li>Ability to define and implement data quality rules and validation checks</li>
<li>Understanding of data access principles, including role-based access and data classification</li>
<li>Ability to document data processes and controls clearly and in a structured way</li>
</ul>
<p><strong>Complexity and Impact of Work:</strong></p>
<ul>
<li>Oversee the data governance program, identify improvement areas, and implement best practices to enhance data quality, integrity, and security</li>
<li>Develop and implement data quality standards and monitoring processes, including establishing data quality metrics and thresholds</li>
<li>Assist in managing the data issue lifecycle, including tracking and supporting remediation efforts</li>
<li>Manage the data governance platform (Atlan) and serve as the primary subject matter expert</li>
<li>Assist in data classification efforts, including identifying and categorizing sensitive data and critical data elements</li>
<li>Manage external data requests, including regulatory inquiries, ensuring compliance with banking regulations</li>
<li>Monitor and report on key data governance metrics and KPIs, providing insights and recommendations to senior management</li>
<li>Lead data governance meetings and workshops, facilitating discussions and decision-making to drive the data governance program forward</li>
</ul>
<p><strong>Organizational Knowledge:</strong></p>
<ul>
<li>Have a deep understanding of Anchorage Digital’s strategy and business lines.</li>
<li>Understand how data supports decision-making and operational processes across the organization</li>
<li>Possess strategic thinking and vision, with the ability to develop and implement a comprehensive data governance strategy aligned with organizational goals and objectives</li>
</ul>
<p><strong>Communication and Influence:</strong></p>
<ul>
<li>Able to communicate complex issues clearly and credibly to a wide range of audiences.</li>
<li>Document data processes, controls, and findings clearly for internal stakeholders</li>
<li>Build effective relationships and rapport with stakeholders, including cross-functional and external partners</li>
<li>Communicate, organize, and execute cross-team goals and projects, leveraging relationships and resources to solve problems</li>
<li>Collaborate with Data Platform, InfoSec, Product, and Engineering partners</li>
</ul>
<p><strong>You may be a fit for this role if you have:</strong></p>
<ul>
<li>Bachelor’s degree required. Advanced degrees or certifications in data analytics or governance preferred</li>
<li>4–7 years of experience in data governance, data management, data quality, or data analytics</li>
<li>Hands-on experience implementing or supporting data quality and governance practices</li>
<li>Experience managing data classification, access controls, and external data requests</li>
<li>Experience working with data pipelines, reporting systems, or analytical datasets</li>
<li>Experience writing, editing, or reviewing technical documentation for regulatory or banking contexts</li>
<li>Strong attention to detail, with a focus on accuracy, completeness, and consistency in data governance processes and controls</li>
<li>Ability to work independently on defined tasks and contribute to team objectives</li>
<li>Strong problem-solving skills and comfort working in structured, detail-oriented environments</li>
</ul>
<p><strong>Although not a requirement, bonus points if:</strong></p>
<ul>
<li>You&#39;ve kept up to date with the proliferation of blockchain and crypto innovations.</li>
<li>You were emotionally moved by the soundtrack to Hamilton, which chronicles the founding of a new financial system. :)</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data governance, data management, data quality frameworks, SQL, Python, Looker, Power BI, Tableau, database design, data lineage, metadata management, data modeling, data access principles, role-based access, data classification</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Anchorage Digital</Employername>
      <Employerlogo>https://logos.yubhub.co/anchorage.com.png</Employerlogo>
      <Employerdescription>Anchorage Digital is a crypto platform that enables institutions to participate in digital assets through custody, staking, trading, governance, settlement, and the industry&apos;s leading security infrastructure.</Employerdescription>
      <Employerwebsite>https://anchorage.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/anchorage/5bfbd64c-933e-418c-9c07-5aea50212c0d</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>3d849fbc-058</externalid>
      <Title>Member of Product, Data Platform</Title>
      <Description><![CDATA[<p>At Anchorage Digital, we are building the world’s most advanced digital asset platform for institutions to participate in crypto.</p>
<p>The Data Platform team is the backbone of Anchorage Digital&#39;s information infrastructure. As data becomes the lifeblood of every product, compliance workflow, and client-facing report we produce, this team is responsible for building and operating a unified, scalable, and reliable data platform that serves the entire organization.</p>
<p>As a Data Platform Product Manager, you will own the strategy and execution for centralizing and formalizing the company&#39;s data infrastructure, spanning internal operational data, transaction and blockchain data, customer data, and external data sources.</p>
<p>Your mission is to transform a fragmented data landscape into a single source of truth that powers mission-critical reporting, business insights, and downstream product experiences across every team at Anchorage.</p>
<p>This is a force-multiplier role. Your work will elevate the quality, speed, and reliability of every product and team at the company.</p>
<p>You will define the standards, build the platform, and create the foundation that enables Anchorage to scale with confidence.</p>
<p>If you thrive at the intersection of complex data systems, cross-functional influence, and platform thinking, this is your opportunity to have outsized impact at a category-defining company in digital assets.</p>
<p>Below, we define our Factors of Growth &amp; Impact to help Anchorage Villagers measure their impact and articulate feedback, coaching, and the rich learning that happens while exploring, developing, and mastering capabilities within and beyond the Member of Product, Data Platform role:</p>
<p><strong>Technical Skills:</strong></p>
<ul>
<li>Own the detailed prioritization of the data platform roadmap, balancing foundational infrastructure work, new capabilities, and technical debt.</li>
<li>Demonstrate deep strategic thinking in shaping the platform roadmap, considering the unique data challenges of digital assets, blockchain protocols, and regulated financial services.</li>
<li>Deliver complex, cross-functional projects with multiple dependencies across engineering, analytics, compliance, and operations teams.</li>
<li>Work closely with engineering and data science counterparts to drive product development processes, sprint planning, and architectural decisions.</li>
<li>Ability to understand and reason about system architecture, including data warehousing, ETL/ELT pipelines, streaming vs. batch processing, and modern data stack components, and to communicate clear requirements to engineering.</li>
<li>Drive comprehensive go-to-market strategy for internal platform adoption, including defining success metrics, tracking KPIs around data quality and platform usage, and iterating based on data-driven insights.</li>
</ul>
<p><strong>Complexity and Impact of Work:</strong></p>
<ul>
<li>Lead and influence cross-functional teams while maintaining strong stakeholder relationships across the entire organization, from engineering to finance to compliance.</li>
<li>Exercise independent decision-making and take full ownership of data platform strategy and execution.</li>
<li>Contribute strategic insights that significantly impact company direction, operational efficiency, and product quality.</li>
<li>Demonstrate platform leadership that elevates the performance and effectiveness of every team that depends on data.</li>
</ul>
<p><strong>Organizational Knowledge:</strong></p>
<ul>
<li>Develop deep understanding of Anchorage&#39;s business model, product suite, regulatory environment, and organizational structure.</li>
<li>Build and maintain strong relationships with stakeholders across all departments to ensure the data platform serves the company&#39;s most critical needs.</li>
<li>Navigate and improve organizational data practices to enhance efficiency, compliance, and decision-making.</li>
<li>Drive company objectives through strategic data platform decisions and initiatives.</li>
</ul>
<p><strong>Communication and Influence:</strong></p>
<ul>
<li>Effectively influence and motivate teams across the organization to adopt platform standards and invest in data quality, even when those teams do not report to you.</li>
<li>Enable cross-functional collaboration through clear, consistent communication about platform capabilities, timelines, and data governance expectations.</li>
<li>Act as a thoughtful knowledge partner to senior leadership, translating complex data infrastructure topics into clear business impact.</li>
<li>Proactively communicate platform goals, status updates, and data health metrics throughout the organization.</li>
</ul>
<p><strong>You may be a fit for this role if you:</strong></p>
<ul>
<li>Have 5+ years of product management experience, with significant time spent on data platforms, data infrastructure, or data-intensive enterprise products.</li>
<li>Have proven experience building or scaling enterprise data platforms, including data warehousing, data lakes, ETL/ELT pipelines, or modern data stack tooling (e.g., Snowflake, Databricks, dbt, Airflow, Spark).</li>
<li>Have a strong understanding of data modeling, data governance, and data quality frameworks.</li>
<li>Have experience working with diverse data types, including transactional data, customer data, and financial data, and ideally blockchain or on-chain data.</li>
<li>Have a track record of driving cross-functional alignment and adoption for internal platform products, influencing without direct authority.</li>
<li>Have exceptional written and verbal communication skills, with the ability to convey complex data architecture concepts to both technical and non-technical audiences.</li>
<li>Bring empathy and adaptability that complement others&#39; working styles and embody our culture of curiosity, creativity, and shared understanding.</li>
<li>Self-describe as some combination of the following: creative, humble, ambitious, detail-oriented, hardworking, trustworthy, eager to learn, methodical, action-oriented, and tenacious.</li>
</ul>
<p><strong>Although not a requirement, bonus points if you have:</strong></p>
<ul>
<li>Hands-on experience with blockchain data indexing, on-chain analytics, or crypto-native data infrastructure.</li>
<li>Experience building data platforms that serve both internal analytics consumers and external client-facing products (reports, statements, dashboards).</li>
<li>Experience supporting clients with data-related issues or concerns.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data platforms, data infrastructure, data-intensive enterprise products, data warehousing, data lakes, ETL/ELT pipelines, modern data stack tooling, Snowflake, Databricks, dbt, Airflow, Spark, data modeling, data governance, data quality frameworks, blockchain or on-chain data</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anchorage Digital</Employername>
      <Employerlogo>https://logos.yubhub.co/anchorage.com.png</Employerlogo>
      <Employerdescription>Anchorage Digital is a crypto platform that enables institutions to participate in digital assets through custody, staking, trading, governance, settlement, and the industry&apos;s leading security infrastructure.</Employerdescription>
      <Employerwebsite>https://anchorage.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/anchorage/0e730f61-a2e4-4152-8277-3f6383cc69a6</Applyto>
      <Location>United States</Location>
      <Country></Country>
      <Postedate>2026-04-17</Postedate>
    </job>
    <job>
      <externalid>107b978a-791</externalid>
      <Title>Data - Regulatory Reporting, Luxembourg</Title>
      <Description><![CDATA[<p>We are seeking a Data and Regulatory Reporting professional to build and operate our regulatory reporting production line, including robust data sourcing, reconciliation, data quality controls, and submission governance.</p>
<p>The role partners closely with our Data Officer based in San Francisco and the regulatory reporting team based in Ireland. The tight collaboration will ensure timely, accurate, and auditable reporting to the CSSF and efficient responses to supervisory requests.</p>
<p><strong>Regulatory Reporting Production and Controls</strong></p>
<ul>
<li>Own end-to-end production of periodic regulatory and prudential reporting for Bridge Building S.A. (CSSF periodic reports, BCL reports, CRS, CARF and CESOP).</li>
<li>Implement and operate first-line controls to ensure data completeness, data accuracy, and data timeliness, including reconciliations to the general ledger, sub-ledgers and Bridge Building S.A.&#39;s platform.</li>
</ul>
<p><strong>Data Governance and Data Quality</strong></p>
<ul>
<li>Define reporting data models, lineage, and data dictionaries; maintain consistent definitions for key regulatory metrics.</li>
<li>Implement data quality rules, exception management, and remediation workflows in collaboration with data owners.</li>
</ul>
<p><strong>Submission Governance and Regulator Readiness</strong></p>
<ul>
<li>Maintain submission procedures, access controls, and audit trails for regulatory transmissions and supervisory requests.</li>
<li>Support responses to CSSF questions, reporting-related inspections, and remediation plans where issues are identified.</li>
</ul>
<p><strong>Automation and Scalability</strong></p>
<ul>
<li>Drive automation of reporting workflows (e.g. SQL/ETL/BI) to reduce manual effort and operational risk.</li>
<li>Maintain clear documentation, runbooks, and operational KPIs for reporting processes.</li>
</ul>
<p><strong>Key Requirements</strong></p>
<ul>
<li>Bachelor&#39;s or Master&#39;s degree in Finance, Data/Analytics, Statistics, Engineering, or a related field.</li>
<li>3-5 years in regulatory reporting and/or regulatory data management within regulated financial services (payments/EMI/PI, fintech, bank, or advisory).</li>
<li>Proven experience implementing reconciliations, data controls, and reporting governance with strong audit discipline.</li>
</ul>
<p><strong>Skills</strong></p>
<ul>
<li>Strong SQL and data transformation skills; ability to translate regulatory requirements into data specifications. Live SQL assessment will be part of the interview.</li>
<li>Strong controls mindset, attention to detail, and capability to manage deadlines and stakeholders.</li>
</ul>
<p><strong>Languages</strong></p>
<ul>
<li>Fluent English required</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SQL, data transformation, regulatory reporting, data governance, data quality</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Bridge Building S.A.</Employername>
      <Employerlogo>https://logos.yubhub.co/bridgebuildingsa.com.png</Employerlogo>
      <Employerdescription>Bridge Building S.A. is a Luxembourg-based company building a regulated EMI and CASP.</Employerdescription>
      <Employerwebsite>https://www.bridgebuildingsa.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/stripe/jobs/7587256</Applyto>
      <Location>Luxembourg</Location>
      <Country></Country>
      <Postedate>2026-03-31</Postedate>
    </job>
    <job>
      <externalid>d85d5c0a-c7f</externalid>
      <Title>Project Manager - Data Risk &amp; Control (9 months FTC)</Title>
      <Description><![CDATA[<p>We are seeking a hands-on, delivery-focused Project Manager to lead two critical initiatives: the completion of the organisation&#39;s Data Control Library and the delivery of a comprehensive Data Retention Programme.</p>
<p>This role is accountable for translating data strategy and regulatory expectations into practical, executable outcomes across the business. You will work closely with senior stakeholders across technology, risk, compliance, and operations to ensure data is retained, managed, and securely deleted in line with legal, regulatory, and business requirements.</p>
<p>Success in this role requires strong project delivery capability, deep understanding of the data lifecycle, and the ability to influence and mobilise teams without direct authority.</p>
<p>Key responsibilities include:</p>
<ul>
<li>Leading and delivering a structured, end-to-end Data Retention Programme, including the identification and remediation of over-retained data.</li>
<li>Translating high-level data strategy and regulatory requirements into clear, actionable plans for operational teams.</li>
<li>Working closely with the Data Protection Officer (DPO), Deputy CIO, and other senior stakeholders to clarify data ownership, accountability, and governance.</li>
<li>Identifying and supporting Data Owners across the organisation, ensuring they understand their GDPR and data retention obligations.</li>
<li>Owning the delivery of the Data Control Library, ensuring controls are validated, risk-mapped, clearly documented, and embedded into business-as-usual processes.</li>
<li>Identifying control gaps or failures, and partnering with technology and business owners to develop Remediation Plans with clear timelines.</li>
<li>Quantifying and reporting on the Residual Risk remaining when controls are found to be ineffective or missing.</li>
<li>Partnering with Second Line of Defence (2LoD) teams to align on expectations, controls, and training requirements.</li>
<li>Establishing and maintaining effective project governance.</li>
<li>Driving delivery momentum, proactively resolving any blockers to ensure timelines are met.</li>
</ul>
<p>Requirements include:</p>
<ul>
<li>Strong influencing skills, with the ability to secure engagement and commitment without formal authority.</li>
<li>Outcome-focused, with the ability to see problems through to resolution with minimal supervision.</li>
<li>Comfortable navigating ambiguity and maintaining momentum through complex challenges.</li>
<li>Able to translate between regulatory, technical, and operational perspectives.</li>
<li>Confident presenting to senior leaders while also working “in the weeds” with data owners and delivery teams.</li>
<li>Anticipates risks and issues early and acts decisively to address them.</li>
</ul>
<p>The ideal candidate will have experience implementing or coordinating data retention schedules and data cleanup initiatives, proven experience delivering complex, cross-functional programmes end-to-end, and a strong command of structured project management methodologies.</p>
<p>Benefits include:</p>
<ul>
<li>Make an Impact: Work on projects that directly shape the future of banking and improve the financial lives of our customers.</li>
<li>Culture of Excellence: Be part of a collaborative, empowered, and forward-thinking team.</li>
<li>Growth and Development: We are committed to your professional growth, offering opportunities to learn new technologies, take on new challenges, and own interesting things from day one.</li>
<li>A Bank That Cares: We&#39;re a Living Wage employer, committed to flexible working, and dedicated to creating a fair, open, and safe working environment with compassion and inclusion at its core.</li>
<li>Comprehensive Benefits: We offer a competitive salary and a comprehensive benefits package, including company-enhanced salary sacrifice pension scheme, private medical insurance, 25 days holiday, and life insurance.</li>
</ul>
]]></Description>
      <Jobtype>contract</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>project management, data risk and control, data retention, data protection, regulatory compliance, structured project management methodologies, data governance, data ownership, GDPR, data retention obligations</Skills>
      <Category>Finance</Category>
      <Industry>Finance</Industry>
      <Employername>Starling</Employername>
      <Employerlogo>https://logos.yubhub.co/starlingbank.com.png</Employerlogo>
      <Employerdescription>Starling is a digital bank providing a fairer, smarter, and more human alternative to traditional banks.</Employerdescription>
      <Employerwebsite>https://www.starlingbank.com/</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://apply.workable.com/j/54D5531B73</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-20</Postedate>
    </job>
    <job>
      <externalid>290c3d28-4b2</externalid>
      <Title>Partner Solution Architect - ASEAN</Title>
<Description><![CDATA[<p><strong>About Mistral AI</strong></p>
<p>At Mistral AI, we believe in the power of AI to simplify tasks, save time, and enhance learning and creativity. Our technology is designed to integrate seamlessly into daily working life.</p>
<p>We are a global company with teams distributed between France, USA, UK, Germany and Singapore. We are a diverse workforce that thrives in competitive environments and is committed to driving innovation.</p>
<p><strong>Why This Role Matters</strong></p>
<p>You will be the technical linchpin between Mistral and our strategic partners in ASEAN (Nvidia, Dell, Hyperscalers, Global System Integrators), translating our open-weight models and sovereign AI architecture into deployable, scalable solutions.</p>
<p>By designing joint architectures, influencing partner GTM motions, and earning a seat at the CIO/CTO table, you will accelerate Mistral’s technical credibility and deployment velocity across Asia Pacific.</p>
<p>This is a foundational role where you will define how open-weight AI is operationalized at scale in the region.</p>
<p><strong>What You Will Do</strong></p>
<p><strong>Partner Technical Leadership &amp; Architecture Design</strong></p>
<ul>
<li>Lead the technical design, deployment, and enablement of Mistral’s partner solutions, bridging our AI models with partner infrastructure (Nvidia, Dell, Hyperscalers, GSIs) to deliver scalable AI Labs, AI Factories, and sovereign AI architectures.</li>
<li>Serve as the trusted technical advisor to partner CTOs, CIOs, and engineering leaders—shaping joint architectures, guiding GPU/model deployment strategies, and accelerating GTM execution.</li>
<li>Design reference architectures and deployment patterns for partner-led implementations (e.g., multi-GPU inference clusters, AI Lab topologies, private AI clouds).</li>
<li>Innovate the Executive Briefing Center (EBC) function for technical leaders (CIOs, CTOs, CDOs), positioning Mistral as the default choice for enterprise AI.</li>
<li>Co-design sovereign AI reference architectures with Nvidia and Dell (H100, H200, GB200 platforms).</li>
</ul>
<p><strong>Co-Sell &amp; Revenue Enablement</strong></p>
<ul>
<li>Collaborate with Mistral’s partner and sales teams to progress deals, providing technical expertise to penetrate accounts and influence GTM pipeline.</li>
<li>Support partners in qualifying/disqualifying opportunities, ensuring Mistral solutions unlock maximum value for customers.</li>
<li>Deploy Mistral’s enterprise AI suite (models, fine-tuning, use-case building) in partner-led environments, tailoring solutions to customer requirements.</li>
</ul>
<p><strong>Trusted Advisor &amp; Lighthouse Implementations</strong></p>
<ul>
<li>Drive strategic partner-led opportunities through technical discovery, architecture design, and POC execution.</li>
<li>Lead lighthouse deployments that become referenceable case studies (e.g., Singtel AI Grid, Accenture AI Lab).</li>
<li>Establish a scalable partner enablement framework, training 100+ partner engineers across ASEAN.</li>
</ul>
<p><strong>Product Feedback &amp; Internal Collaboration</strong></p>
<ul>
<li>Coordinate with Mistral’s product and engineering teams to relay partner-specific requirements and feedback.</li>
<li>Align joint GTM and technical execution between Mistral Science, Partner Engineering, and partner field teams.</li>
</ul>
<p><strong>About You</strong></p>
<p><strong>Must-Have</strong></p>
<ul>
<li>10–15 years’ experience in partner-facing technical sales or solution architecture (e.g., Partner SA, Alliance Architect, Partner Technology Strategist).</li>
<li>Proven ability to engage C-suite and senior technical stakeholders (CTO, CIO, Chief Architect) in strategic architecture discussions.</li>
<li>Deep GenAI/LLM expertise: RAG, fine-tuning, prompt engineering, model evaluation, and deployment patterns.</li>
<li>Technical mastery of AI/ML infrastructure (GPU clusters, cloud platforms, model deployment frameworks).</li>
<li>Track record of co-designing/deploying joint solutions with ecosystem partners (Nvidia, Dell, AWS, Accenture, etc.).</li>
<li>Executive communication: ability to articulate science-driven value propositions to technical and business audiences.</li>
<li>Entrepreneurial mindset: operates autonomously in high-growth environments and creates playbooks rather than following them.</li>
<li>Fluent in English; confident working across diverse, cross-cultural teams in Asia.</li>
</ul>
<p><strong>Nice-to-Have</strong></p>
<ul>
<li>Experience with open-weight LLMs or open-source AI stacks (Mistral, Hugging Face, LangChain, vLLM, RAG frameworks).</li>
<li>Prior involvement in AI Lab, AI Factory, or Sovereign Cloud deployments.</li>
<li>Familiarity with data governance, model evaluation, and GPU sizing for large-scale inference.</li>
<li>Network across GSIs and infrastructure partners in Asia.</li>
<li>Exposure to multi-region partner programs or joint GTM initiatives in APJ.</li>
<li>Bonus languages: Korean, Japanese, or Mandarin for regional partner engagement.</li>
</ul>
<p><strong>What we offer</strong></p>
<ul>
<li>💰 Competitive cash salary and equity</li>
<li>🚑 Health insurance: best in class</li>
<li>🥎 Sport: $90 gym membership allowance</li>
<li>🥕 Food: $200 monthly allowance for meals (solution might evolve as we grow bigger)</li>
<li>🚴 Transportation: $120/month for public transport, or parking charges reimbursed</li>
<li>🏝️ PTO: 18 days per year</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>GenAI/LLM expertise, RAG, fine-tuning, prompt engineering, model evaluation, deployment patterns, AI/ML infrastructure, GPU clusters, cloud platforms, model deployment frameworks, co-designing/deploying joint solutions, ecosystem partners, Nvidia, Dell, AWS, Accenture, open-weight LLMs, open-source AI stacks, Mistral, Hugging Face, LangChain, vLLM, RAG frameworks, data governance, model evaluation, GPU sizing, large-scale inference, GSIs, infrastructure partners, multi-region partner programs, joint GTM initiatives, APJ, Korean, Japanese, Mandarin</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Mistral AI</Employername>
      <Employerlogo></Employerlogo>
      <Employerdescription>Mistral AI is an AI technology company that provides high-performance, optimized, open-source and cutting-edge models, products and solutions.</Employerdescription>
      <Employerwebsite>https://mistral.ai/careers</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.lever.co/mistral/fe3542b5-4f99-4d62-af6a-fbdfd13bf0e4</Applyto>
      <Location>Singapore</Location>
      <Country></Country>
      <Postedate>2026-03-10</Postedate>
    </job>
    <job>
      <externalid>c7557416-e8a</externalid>
      <Title>FBS Data Engineer (Javascript, SQL experience)</Title>
<Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills.</p>
<p>As a Data Engineer, you will be responsible for designing, developing, and implementing data products of intermediate complexity. Key responsibilities include:</p>
<ul>
<li>Working on data projects, producing data building blocks, data models, and data flows for client requests.</li>
<li>Creating business user access methods to structured and unstructured data, using techniques such as mapping data to a common data model, natural language processing, and transforming data as necessary to satisfy business rules.</li>
<li>Translating business data stories into a technical story breakdown structure and work estimates for a schedule or planned agile sprint.</li>
<li>Developing and maintaining moderately complex, scalable data pipelines for both streaming and batch requirements, and building out new API integrations to support increasing data volume and complexity.</li>
<li>Prepping and cleansing data to optimize it for downstream reporting via Farmers standard visualization or AI/ML tools.</li>
<li>Applying sound knowledge of data governance concepts and executing mid-size projects and large tasks within timelines.</li>
<li>Developing data project plans and executing them with some supervision.</li>
<li>Performing other duties as assigned.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, javascript, sql, data governance, data pipelines, API integrations, data visualization, AI/ML</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across more than 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/6ncgFxZs5XYbfCSCkC1yMD/remote-fbs-data-engineer-(javascript%2C-sql-experience)-in-mexico-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7af16166-8fd</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p><strong>What to expect on your journey with us:</strong></p>
<ul>
<li>A solid and innovative company with a strong market presence</li>
<li>A dynamic, diverse, and multicultural work environment</li>
<li>Leaders with deep market knowledge and strategic vision</li>
<li>Continuous learning and development</li>
</ul>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>
<li>Works independently, receives minimal guidance and direction to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>
<li>Reviews high level design to ensure alignment to Solution Architecture.</li>
<li>May lead projects or project steps within a broader project or may have accountability for on-going activities or objectives.</li>
<li>Mentors developers and creates reference implementations/frameworks.</li>
<li>Partners with System Architects to elaborate capabilities and features.</li>
<li>Delivers single domain architecture solutions and executes continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
<p><strong>Requirements:</strong></p>
<ul>
<li>Over 6 years of experience as a senior domain architect for Data domains</li>
<li>Advanced English Level</li>
<li>Master&#39;s degree (PLUS)</li>
<li>Insurance experience (PLUS); Financial Services experience (PLUS)</li>
</ul>
<p><strong>Technical &amp; Business Skills:</strong></p>
<ul>
<li>ETL/ELT Tools (Informatica, DBT) - Advanced (7+ Years)</li>
<li>Data Architecture / Data Modeling – Advanced (MUST)</li>
<li>Data Warehouse – Advanced (MUST)</li>
<li>Cloud Data Platforms - Advanced</li>
<li>Data Integration Tools – Advanced</li>
<li>Snowflake or Databricks - Intermediate (4-6 Years) (MUST)</li>
<li>Any Cloud - Intermediate (4-6 Years)</li>
<li>Power BI or Tableau - Intermediate (4-6 Years)</li>
<li>Data Science tools (Sagemaker, Databricks) - Intermediate (4-6 Years)</li>
<li>Data Lakehouse – Intermediate (MUST)</li>
<li>Data Governance - Intermediate</li>
<li>AI/ML - Entry Level (PLUS)</li>
<li>Master Data Management - Intermediate</li>
<li>Operational Data Management - Intermediate</li>
</ul>
<p><strong>Benefits:</strong></p>
<p>This position comes with a competitive compensation and benefits package.</p>
<ul>
<li>A competitive salary and performance-based bonuses.</li>
<li>Comprehensive benefits package.</li>
<li>Flexible work arrangements (remote and/or office-based).</li>
<li>You will also enjoy a dynamic and inclusive work culture within a globally renowned group.</li>
<li>Private Health Insurance.</li>
<li>Paid Time Off.</li>
<li>Training &amp; Development opportunities in partnership with renowned companies.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, AI/ML, Master Data Management, Operational Data Management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global consulting and technology services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/jdUFHSPZZjHsgd3TR4R3BS/remote-fbs-senior-data-domain-architect-in-colombia-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7b03b30a-b20</externalid>
      <Title>FBS Senior Data Domain Architect</Title>
      <Description><![CDATA[<p>FBS – Farmer Business Services is part of Farmers operations with the purpose of building a global approach to identifying, recruiting, hiring, and retaining top talent. By combining international reach with US expertise, we build diverse and high-performing teams that are equipped to thrive in today’s competitive marketplace.</p>
<p>We believe that the foundation of every successful business lies in having the right people with the right skills. That is where we come in—helping Farmers build a winning team that delivers consistent and sustainable results.</p>
<p>Since we don’t have a local legal entity, we’ve partnered with Capgemini, which acts as the Employer of Record. Capgemini is responsible for managing local payroll and benefits.</p>
<p><strong>Objective:</strong> Designs and develops Data/Domain IT architecture (integrated process, applications, data and technology) solutions to business problems in alignment with the Enterprise Architecture direction and standards.</p>
<p><strong>Key Responsibilities:</strong></p>
<ul>
<li>Utilizes in-depth conceptual and practical knowledge in Domain Architecture and basic knowledge of related job disciplines to perform complex technical planning, architecture development and modification of specifications for Domain solution delivery.</li>
<li>Solves complex problems and partners effectively to execute broad, continuous Domain level architecture improvement roadmaps that impact the organization.</li>
<li>Works independently, receiving minimal guidance and direction, to solve for and influence Enterprise and System architecture through Domain level knowledge.</li>
<li>Reviews high level design to ensure alignment to Solution Architecture.</li>
<li>May lead projects or project steps within a broader project, or may have accountability for ongoing activities or objectives.</li>
<li>Mentors developers and creates reference implementations/frameworks.</li>
<li>Partners with System Architects to elaborate capabilities and features.</li>
<li>Delivers single domain architecture solutions and executes a continuous domain level architecture improvement roadmap. Actively supports design and steering of a continuous delivery pipeline.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>ETL/ELT Tools (Informatica, DBT), Data Architecture / Data Modeling, Data Warehouse, Cloud Data Platforms, Data Integration Tools, Snowflake or Databricks, Any Cloud, Power BI or Tableau, Data Science tools (Sagemaker, Databricks), Data Lakehouse, Data Governance, Master Data Management, Operational Data Management, AI/ML</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across over 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/1U952YA2QBa8zK7Tm5d3Lm/remote-fbs-senior-data-domain-architect-in-mexico-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>227da67a-cfc</externalid>
      <Title>Senior Principal, Data &amp; AI</Title>
      <Description><![CDATA[<p>Do you want to boost your career and collaborate with expert, talented colleagues to solve and deliver against our clients&#39; most important challenges? We are growing and are looking for people to join our team. You&#39;ll be part of an entrepreneurial, high-growth environment of over 320,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset. Are you ready?</p>
<p>We are looking for experienced leaders and senior managers to join our growing team. You will bring together AI &amp; Data advisory expertise, commercial acumen and delivery excellence to shape, sell and deliver programmes that create value for our clients using AI &amp; data.</p>
<p>The ideal candidate will understand how ML, GenAI, Agentic AI and data can create value for clients across a range of industries. Ideally, you have depth of knowledge in one industry or functional domain, with the versatility to translate it to other domains and industries.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Take a leading role in end-to-end deal pursuits and during the sales cycle to understand clients’ needs and shape solutions that will deliver value for them</li>
<li>Build relationships of trust with client stakeholders to identify opportunities for AI-enabled business transformation</li>
<li>Collaborate within the Infosys organisation to bring the best of Infosys together for the client</li>
<li>Contribute to thought leadership and be an external advocate on behalf of Infosys Consulting</li>
<li>Full responsibility for programme delivery, including leading distributed teams and offshore delivery</li>
<li>Build and nurture partnerships across the Data &amp; AI technology ecosystem</li>
<li>Share expertise and learning generously within the team to upskill colleagues</li>
<li>Continually strengthen and grow the team through hiring and development</li>
<li>Act as a role model, authentic leader and collaborative team player</li>
</ul>
<p><strong>Requirements</strong></p>
<p><strong>Essential Skills:</strong></p>
<ul>
<li>10+ years of experience in relevant data, analytics &amp; AI fields</li>
<li>High level of personal credibility with clients and colleagues in the business application of AI and data</li>
<li>A hands-on leader, ready to lead by example with a focus on execution and progress</li>
<li>Proven experience in leading AI and data projects</li>
<li>Experience leading distributed technical teams</li>
<li>Thorough understanding of how AI, machine learning, deep learning can leverage enterprise data to deliver value</li>
<li>Strong executive presence and influencing skills with the ability to engage credibly with C-suite stakeholders and shape the conversation</li>
<li>Broad leadership skills including commercial acumen, business case development and personal resilience.</li>
<li>A relentless drive for quality and attention to detail</li>
<li>Excellent interpersonal skills and strong written and verbal communication skills</li>
<li>Degree in a quantitative field</li>
</ul>
<p><strong>Preferred Skills:</strong></p>
<ul>
<li>Expertise in an industry: especially financial services, manufacturing or telecommunications</li>
<li>Expertise in a functional domain: especially customer service, marketing, sales or IT operations</li>
<li>A second major European language at C2 level is an advantage</li>
<li>Domain expertise within Data &amp; AI: data science, GenAI engineering, data strategy, data governance</li>
<li>Knowledge of a partner platform: agentic workflows, hyperscaler AI platform, data platform</li>
<li>MSc with PhD a plus</li>
</ul>
<p><strong>Personal attributes</strong></p>
<ul>
<li>Analytical, pragmatic problem-solver; outcome-oriented.</li>
<li>Self-directed, able to prioritise and juggle multiple workstreams.</li>
<li>Clear communicator who can simplify complexity.</li>
<li>Collaborative, curious, continuous learner.</li>
</ul>
<p>Given that this is just a short snapshot of the role, we encourage you to apply even if you don&#39;t meet all the requirements listed above. We are looking for individuals who strive to make an impact and are eager to learn. If this sounds like you and you feel you have the skills and experience required, then please apply now.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data &amp; AI advisory, data science, GenAI engineering, data strategy, data governance, machine learning, deep learning, programme delivery, leadership of distributed teams, commercial acumen, business case development, C-suite stakeholder engagement, communication skills</Skills>
      <Category>Consulting</Category>
      <Industry>Consulting</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that supports the largest global organisations to find and deliver business value from data &amp; AI. The company is a mid-size player with a supportive, entrepreneurial spirit that works with a market-leading brand in every sector.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/eng6dnDMaPQcwxQ8T3NByE/hybrid-senior-principal%2C-data-%26-ai-in-london-at-infosys-consulting---europe</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>dda066c5-7ef</externalid>
      <Title>Senior Principal - Data &amp; AI</Title>
      <Description><![CDATA[<p>Do you want to boost your career and collaborate with expert, talented colleagues to solve and deliver against our clients&#39; most important challenges? We are growing and are looking for people to join our team. You&#39;ll be part of an entrepreneurial, high-growth environment of over 320,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset. Are you ready?</p>
<p>We are looking for experienced leaders and senior managers who can join our growing team. You will bring together AI &amp; Data advisory expertise, commercial acumen and delivery excellence to shape, sell and deliver programmes that create value for our clients using AI &amp; data.</p>
<p>The ideal candidate will understand how ML, GenAI, Agentic AI and data can create value for clients across a range of industries. Ideally, you have depth of knowledge in one industry or functional domain, but the versatility to translate it to other domains and industries.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Take a leading role in end-to-end deal pursuits and during the sales cycle to understand clients’ needs and shape solutions that will deliver value for them</li>
<li>Build relationships of trust with client stakeholders to identify opportunities for AI-enabled business transformation</li>
<li>Collaborate within the Infosys organisation to bring the best of Infosys together for the client</li>
<li>Contribute to thought leadership and be an external advocate on behalf of Infosys Consulting</li>
<li>Full responsibility for programme delivery including leading distributed teams and offshore delivery</li>
<li>Build and nurture partnerships across the Data &amp; AI technology ecosystem</li>
<li>Share expertise and learning generously within the team to upskill colleagues</li>
<li>Continually strengthen and grow the team through hiring and development</li>
<li>Act as a role model, authentic leader and collaborative team player</li>
</ul>
<p><strong>Skills and Qualifications:</strong></p>
<p><strong>Essential Skills:</strong></p>
<ul>
<li>10+ years of experience in relevant data, analytics &amp; AI fields</li>
<li>High level of personal credibility with clients and colleagues in the business application of AI and data</li>
<li>A hands-on leader, ready to lead by example with a focus on execution and progress</li>
<li>Proven experience in leading AI and data projects</li>
<li>Experience leading distributed technical teams</li>
<li>Thorough understanding of how AI, machine learning, and deep learning can leverage enterprise data to deliver value</li>
<li>Strong executive presence and influencing skills with the ability to engage credibly with C-suite stakeholders and shape the conversation</li>
<li>Broad leadership skills including commercial acumen, business case development and personal resilience</li>
<li>A relentless drive for quality and attention to detail</li>
<li>Excellent interpersonal skills and strong written and verbal communication skills (English &amp; a second major European language at C2 level)</li>
<li>Degree in a quantitative field</li>
</ul>
<p><strong>Preferred Skills:</strong></p>
<ul>
<li>Expertise in an industry: especially financial services, manufacturing or telecommunications</li>
<li>Expertise in a functional domain: especially customer service, marketing, sales or IT operations</li>
<li>Domain expertise within Data &amp; AI: data science, GenAI engineering, data strategy, data governance</li>
<li>Knowledge of a partner platform: agentic workflows, hyperscaler AI platform, data platform</li>
<li>MSc with PhD a plus</li>
</ul>
<p><strong>Personal attributes</strong></p>
<ul>
<li>Analytical, pragmatic problem-solver; outcome-oriented.</li>
<li>Self-directed, able to prioritise and juggle multiple workstreams.</li>
<li>Clear communicator who can simplify complexity.</li>
<li>Collaborative, curious, continuous learner.</li>
</ul>
<p>Given that this is just a short snapshot of the role, we encourage you to apply even if you don&#39;t meet all the requirements listed above. We are looking for individuals who strive to make an impact and are eager to learn. If this sounds like you and you feel you have the skills and experience required, then please apply now.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>10+ years of experience in relevant data, analytics &amp; AI fields, High level of personal credibility with clients and colleagues in the business application of AI and data, A hands-on leader, ready to lead by example with a focus on execution and progress, Proven experience in leading AI and data projects, Experience leading distributed technical teams, Thorough understanding of how AI, machine learning, deep learning can leverage enterprise data to deliver value, Strong executive presence and influencing skills with the ability to engage credibly with C-suite stakeholders and shape the conversation, Broad leadership skills including commercial acumen, business case development and personal resilience, A relentless drive for quality and attention to detail, Excellent interpersonal skills and strong written and verbal communication skills (English &amp; a second major European language at C2 level), Degree in a quantitative field, Expertise in an industry: especially financial services, manufacturing or telecommunications, Expertise in a functional domain: especially customer service, marketing, sales or IT operations, Domain expertise within Data &amp; AI: data science, GenAI engineering, data strategy, data governance, Knowledge of a partner platform: agentic workflows, hyperscaler AI platform, data platform, MSc with PhD a plus</Skills>
      <Category>Consulting</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. It is a mid-size consultancy within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/4kUKNtbLYqYjchvkRA2t8P/hybrid-senior-principal---data-%26-ai--deutschlandweit-in-frankfurt-am-main-at-infosys-consulting---europe</Applyto>
      <Location>Frankfurt am Main, Hessen, Germany</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>ed3b95c5-f87</externalid>
      <Title>FBS SR Analytics Engineer</Title>
      <Description><![CDATA[<p>Our client is a leading US insurer, providing a wide range of insurance and financial services products, with gross written premiums of over US$25 billion. They serve over 10 million U.S. households with more than 19 million individual policies across all 50 states.</p>
<p><strong>Key Responsibilities</strong></p>
<ul>
<li>Create and iterate on data products and develop pipelines to provide data on an ongoing basis.</li>
<li>Assist in enhancing data delivery across PL and Distribution.</li>
<li>Assist with pivoting from antiquated technologies to enterprise standards.</li>
<li>Understand, analyze, and translate business data stories into technical story breakdown structures.</li>
<li>Design, build, test, and implement data products of varying complexity.</li>
<li>Design, build, and maintain ETL/ELT pipelines using Python and SQL.</li>
<li>Develop data validation scripts in Python.</li>
<li>Write SQL queries to detect anomalies, duplicates, and missing values.</li>
<li>Work closely with data analysts, scientists, and business stakeholders.</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>3 to 5 years of experience as a Data Engineer.</li>
<li>Full English fluency.</li>
<li>BS in Computer Engineering, Information Systems, Data Science, Advanced Analytics, Data Engineering, ML Ops, or similar.</li>
<li>Insurance Background - Desirable.</li>
</ul>
<p><strong>Benefits</strong></p>
<p>This position comes with a competitive compensation and benefits package, including:</p>
<ul>
<li>Competitive salary and performance-based bonuses.</li>
<li>Comprehensive benefits package.</li>
<li>Home Office model.</li>
<li>Career development and training opportunities.</li>
<li>Flexible work arrangements (remote and/or office-based).</li>
<li>Dynamic and inclusive work culture within a globally known group.</li>
<li>Private Health Insurance.</li>
<li>Pension Plan.</li>
<li>Paid Time Off.</li>
<li>Training &amp; Development.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, SQL, ETL Pipeline Building, Devops, MLOPS, AWS Cloud Experience, Data Governance and Management, Data Mining and Engineering</Skills>
      <Category>Engineering</Category>
      <Industry>Finance</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is one of the world&apos;s largest consulting, technology, and outsourcing companies, with over 340,000 employees in more than 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/7vBWYchan8XNrZmR744q9d/remote-fbs-sr-analytics-engineer-in-mexico-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>ee2fcbdc-fc4</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will act as a senior technical leader in complex data and analytics engagements, shaping and governing end-to-end enterprise data architectures, leading technical teams, and serving as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will be responsible for ensuring that enterprise data and analytics solutions are scalable, secure, and production-ready, while translating business requirements into robust technical designs and delivery roadmaps.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:</li>
<li>Data Mesh / Data Fabric / data lake / data warehouse architectures</li>
<li>Modern Data Architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data Platform, DevOps, deployment and security architectures</li>
<li>Analytics and AI enablement architectures</li>
<li>Hands-on experience with cloud data platforms, e.g.:</li>
<li>Azure, AWS or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:</li>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data Management software and tools</li>
<li>Security, access control, and compliance considerations</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>Join our growing Data &amp; Analytics practice and make a difference. In this practice, you will use the most innovative technological solutions in the modern data ecosystem. In this role, you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us to its list of Best Firms to Work For. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, Azure, AWS or GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, Postgres, SQL Server, Oracle, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, Docker / Kubernetes, Advanced analytics, AI / ML or GenAI, Streaming platforms (e.g. Kafka, Azure Event Hubs), Data governance or metadata tools, Cloud, data, or architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. The company is a mid-size player within the scale of Infosys, a top-5 powerhouse IT brand.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/uuSzzCt8qNbo6UpEFkSyjY/hybrid-principal-consultant---data-architecture-in-london-at-infosys-consulting---europe</Applyto>
      <Location>London</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>56dc9a51-e66</externalid>
      <Title>Principal Consultant - Data Architecture</Title>
      <Description><![CDATA[<p><strong>Principal Consultant - Data Architecture</strong></p>
<p>You will be part of an entrepreneurial, high-growth environment of 300,000 employees. Our dynamic organization allows you to work across functional business pillars, contributing your ideas, experiences, diverse thinking, and a strong mindset.</p>
<p><strong>About Your Role</strong></p>
<p>As a Principal Data Architecture Consultant, you will act as a senior technical leader in complex data and analytics engagements. You will shape and govern end-to-end enterprise data architectures, lead technical teams, and serve as a trusted technical advisor for clients and internal stakeholders.</p>
<p><strong>Your Role Will Include:</strong></p>
<ul>
<li>Define and govern target enterprise data, integration and analytics architectures across cloud and hybrid environments</li>
<li>Translate business objectives into scalable, secure, and compliant data solutions</li>
<li>Lead the design of end-to-end data solutions (ingestion, integration, storage, security, processing, analytics, AI enablement)</li>
<li>Guide delivery teams through implementation, rollout, and production readiness</li>
<li>Function as senior technical counterpart for client architects, IT leads, and engineering teams</li>
<li>Mentor data architects, system architects and engineers and contribute to best practices and reference architectures</li>
<li>Support pre-sales and solution design activities from a technical perspective</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>5–8+ years of experience in enterprise data architecture, system data integration, data engineering, or analytics</li>
<li>Proven experience leading enterprise data architecture workstreams or technical teams</li>
<li>Strong client-facing experience in complex enterprise environments</li>
</ul>
<p><strong>Core Data &amp; Analytics Technology Skills</strong></p>
<ul>
<li>Strong expertise in modern data architectures, including:</li>
<li>Data Mesh / Data Fabric / data lake / data warehouse architectures</li>
<li>Modern Data Architecture design principles</li>
<li>Batch and streaming data integration patterns</li>
<li>Data Platform, DevOps, deployment and security architectures</li>
<li>Analytics and AI enablement architectures</li>
<li>Hands-on experience with cloud data platforms, e.g.:</li>
<li>Azure, AWS or GCP</li>
<li>Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric</li>
<li>Strong SQL skills and experience with relational databases (e.g. Postgres, SQL Server, Oracle)</li>
<li>Experience with NoSQL databases (e.g. Cosmos DB, MongoDB, InfluxDB)</li>
<li>Solid understanding of API-based and event-driven architectures</li>
<li>Experience designing and governing enterprise data migration programmes, including mapping, transformation rules, data quality remediation etc.</li>
</ul>
<p><strong>Engineering &amp; Platform Foundations</strong></p>
<ul>
<li>Experience with data pipelines, orchestration, and automation</li>
<li>Familiarity with CI/CD concepts and production-grade deployments</li>
<li>Understanding of distributed systems; Docker / Kubernetes is a plus</li>
</ul>
<p><strong>Data Management &amp; Governance</strong></p>
<ul>
<li>Strong understanding of data management and governance principles, including:</li>
<li>Data quality, metadata, lineage, master data management</li>
<li>Data Management software and tools</li>
<li>Security, access control, and compliance considerations</li>
<li>Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Exposure to advanced analytics, AI / ML or GenAI from an architectural perspective</li>
<li>Experience with streaming platforms (e.g. Kafka, Azure Event Hubs)</li>
<li>Hands-on experience with data governance or metadata tools</li>
<li>Cloud, data, or architecture certifications</li>
</ul>
<p><strong>Language &amp; Mobility</strong></p>
<ul>
<li>Very good English skills</li>
<li>Willingness to travel for project-related work</li>
</ul>
<p><strong>Benefits</strong></p>
<p>You will use the most innovative technological solutions in the modern data ecosystem. In this role, you’ll be able to see your own ideas transform into breakthrough results in the areas of Data &amp; Analytics Strategy, Data Management &amp; Governance, Data Platforms &amp; Engineering, Analytics &amp; Data Science.</p>
<p><strong>About Infosys Consulting</strong></p>
<p>Be part of a globally renowned management consulting firm on the front-line of industry disruption and at the cutting edge of technology. We work with market leading brands across sectors. Our culture is inclusive and entrepreneurial. Being a mid-size consultancy within the scale of Infosys gives us the global reach to partner with our clients throughout their transformation journey.</p>
<p>Our core values, IC-LIFE, form a common code that helps us move forward. IC-LIFE stands for Inclusion, Equity and Diversity, Client, Leadership, Integrity, Fairness, and Excellence. To learn more about Infosys Consulting and our values, please visit our careers page.</p>
<p>Within Europe, we are recognized as one of the UK’s top firms by the Financial Times and Forbes due to our client innovations, our cultural diversity and dedicated training and career paths. Infosys is on Germany’s top employers list for 2023. Management Consulting Magazine named us to its list of Best Firms to Work For. Furthermore, Infosys has been recognized by the Top Employers Institute, a global certification company, for its exceptional standards in employee conditions across Europe for five years in a row.</p>
<p>We offer industry-leading compensation and benefits, along with top training and development opportunities so that you can grow your career and achieve your personal ambitions. Curious to learn more? We’d love to hear from you. Apply today!</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>enterprise data architecture, system data integration, data engineering, analytics, modern data architectures, Data Mesh/ Data Fabric/ Data lake / data warehouse architectures, Modern Data Architecture design principles, Batch and streaming data integration patterns, Data Platform, DevOps, deployment and security architectures, Analytics and AI enablement architectures, cloud data platforms, Azure, AWS, GCP, Databricks, Snowflake, BigQuery, Azure Synapse / Microsoft Fabric, SQL, relational databases, Postgres, SQL Server, Oracle, NoSQL databases, Cosmos DB, MongoDB, InfluxDB, API-based and event-driven architectures, data migration programmes, data pipelines, orchestration, automation, CI/CD concepts, production-grade deployments, distributed systems, Docker, Kubernetes, data management and governance principles, data quality, metadata, lineage, master data management, data management software and tools, security, access control, compliance considerations, Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field or equivalent practical experience, advanced analytics, AI / ML or GenAI, streaming platforms, Kafka, Azure Event Hubs, data governance or metadata tools, cloud, data, architecture certifications</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Infosys Consulting - Europe</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Infosys Consulting - Europe is a globally renowned management consulting firm that works with market leading brands across sectors. It is a mid-size player with a supportive, entrepreneurial spirit.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/hpBWjvvy8D6B1f818cHxZR/remote-principal-consultant---data-architecture-in-poland-at-infosys-consulting---europe</Applyto>
      <Location>Poland</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>73808b15-6f5</externalid>
      <Title>FBS Data Engineer (Javascript, SQL experience)</Title>
      <Description><![CDATA[<p>You will work as a Data Engineer on the FBS team, helping to build a global approach to identifying, recruiting, hiring, and retaining top talent. Your role will involve consulting on data projects of intermediate complexity, designing, developing, and implementing data products, and creating business user access methods to structured and unstructured data. You will also develop and maintain moderately complex scalable data pipelines for both streaming and batch requirements and build out new API integrations to support increased demands of data volume and complexity.</p>
<p>As a Data Engineer, you will work on a variety of projects, including dimensional data, standard and ad hoc reporting, data feeds, dashboard reporting, and data science research and exploration. You will utilize techniques such as mapping data to a common data model, natural language processing, transforming data as necessary to satisfy business rules, AI, statistical computations, and validation of data content.</p>
<p>You will work with a dynamic, diverse, and multicultural team, and will have the opportunity to develop your skills and knowledge in a fast-paced and innovative environment.</p>
<p>This position comes with a competitive compensation and benefits package, including a competitive salary and performance-based bonuses, comprehensive benefits package, flexible work arrangements, private health insurance, paid time off, and training &amp; development opportunities in partnership with renowned companies.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, javascript, sql, data governance, data pipelines, API integrations, data science, machine learning, natural language processing</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Capgemini</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Capgemini is a global technology consulting and professional services company with nearly 350,000 employees across more than 50 countries.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/wf2SbffHSzUXbwQgaX2jzr/remote-fbs-data-engineer-(javascript%2C-sql-experience)-in-colombia-at-capgemini</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>1afd04e5-198</externalid>
      <Title>Data &amp; Cloud Technical Project Manager (H/F)</Title>
      <Description><![CDATA[<p>We are a data company that helps brands improve their marketing, media and customer experience through a combination of consulting and technology services. Our team of over 320 experts includes digital consultants, data scientists, engineers and media specialists who work together to provide high-level marketing advice and technical assistance to brands across various industries.</p>
<p>Our services include data platform implementation, data governance, data analytics, and more. We work with brands to help them become omnichannel organisations that can effectively manage their digital ecosystem and its synergies with the physical world.</p>
<p>We are based in Paris and operate across three time zones from our 10 offices in Paris, London, Geneva, Milan, Shanghai, Hong Kong, Shenzhen, Taipei, Singapore and New York. We prioritise the well-being of our employees, which has enabled us to be ranked as one of the best workplaces in France in 2018.</p>
<p>We are looking for a Data &amp; Cloud Technical Project Manager to join our team. The successful candidate will be responsible for managing technical projects and teams, ensuring the successful delivery of projects within agreed timelines and budgets.</p>
<p>Key responsibilities:</p>
<ul>
<li>Manage technical projects and teams, ensuring the successful delivery of projects within agreed timelines and budgets</li>
<li>Lead technical teams and ensure the development of technical skills and expertise</li>
<li>Collaborate with data science teams to develop advanced use cases for clients</li>
<li>Develop and implement data governance frameworks for clients</li>
<li>Ensure the quality and accuracy of data and analytics</li>
<li>Collaborate with clients to understand their needs and develop solutions</li>
<li>Manage client relationships and ensure client satisfaction</li>
</ul>
<p>Requirements:</p>
<ul>
<li>6-8 years of experience in a technical consulting role</li>
<li>Experience in managing technical projects and teams</li>
<li>Strong technical skills, including data analytics, data governance and cloud computing</li>
<li>Excellent communication and presentation skills</li>
<li>Ability to work in a fast-paced environment and manage multiple projects simultaneously</li>
<li>Strong problem-solving skills and ability to think critically</li>
<li>Experience working with clients and developing solutions to meet their needs</li>
</ul>
<p>Preferred qualifications:</p>
<ul>
<li>Experience working with data platforms and data governance frameworks</li>
<li>Experience working with cloud computing platforms, including Google Cloud, Amazon Web Services and Microsoft Azure</li>
<li>Experience working with data analytics tools, including Google Data Studio and Tableau</li>
<li>Experience working with data science tools, including Python and R</li>
<li>Experience working with machine learning algorithms and models</li>
</ul>
<p>We offer a competitive salary and benefits package, including a comprehensive health insurance plan, a 401(k) matching program and a generous paid time off policy. We also offer a dynamic and supportive work environment, with opportunities for professional growth and development.</p>
<p>If you are a motivated and experienced technical professional looking for a new challenge, we encourage you to apply for this exciting opportunity.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>Competitive salary and benefits package</Salaryrange>
      <Skills>data analytics, data governance, data governance frameworks, cloud computing, project management, technical leadership, data science, machine learning, data platforms, Google Cloud, Amazon Web Services, Microsoft Azure, Google Data Studio, Tableau, Python, R</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Fifty-Five</Employername>
      <Employerlogo>https://logos.yubhub.co/view.com.png</Employerlogo>
      <Employerdescription>Fifty-Five is a global data company that helps brands collect, analyse and activate their data across paid, earned and owned channels. The company has over 320 employees worldwide.</Employerdescription>
      <Employerwebsite>https://jobs.workable.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.workable.com/view/3bHkbDwesiUkgLvBrfxgUo/hybrid-data-%26-cloud-technical-project-manager-(h%2Ff)-in-paris-at-fifty-five</Applyto>
      <Location>Paris</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>ecdc5591-27d</externalid>
      <Title>Data Engineer</Title>
      <Description><![CDATA[<p>We are seeking a highly skilled Data Engineer to join our team. As a Data Engineer, you will play a key role in the development and maintenance of our data infrastructure, ensuring that our data is accurate, reliable, and secure.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Design, develop, and maintain data pipelines and architectures to support our data-driven decision-making processes</li>
<li>Collaborate with our data scientists and analysts to understand their data requirements and develop solutions to meet those needs</li>
<li>Work closely with our IT team to ensure that our data systems are integrated with our existing infrastructure</li>
<li>Develop and maintain data quality and governance processes to ensure that our data is accurate and reliable</li>
<li>Participate in the development and maintenance of our data architecture roadmap</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Bachelor&#39;s degree in Computer Science, Mathematics, or a related field</li>
<li>2+ years of experience in data engineering or a related field</li>
<li>Strong understanding of data engineering principles and practices</li>
<li>Experience with data warehousing and business intelligence tools</li>
<li>Strong programming skills in languages such as Python, Java, or C++</li>
<li>Experience with cloud-based data platforms such as AWS or GCP</li>
<li>Strong communication and collaboration skills</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunity to work with a leading Formula One racing team</li>
<li>Collaborative and dynamic work environment</li>
<li>Professional development and growth opportunities</li>
<li>Access to state-of-the-art technology and tools</li>
<li>Flexible working hours and remote work options</li>
</ul>
<p>Note: The salary range for this position is competitive and will be discussed during the interview process.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>Competitive and will be discussed during the interview process</Salaryrange>
      <Skills>data engineering, data warehousing, business intelligence, Python, Java, C++, AWS, GCP, cloud computing, data architecture, data governance</Skills>
      <Category>Engineering</Category>
      <Industry>Motorsport</Industry>
      <Employername>Williams Racing</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.williamsf1.com.png</Employerlogo>
      <Employerdescription>Williams Racing is a British Formula One racing team that has been in operation since 1977. The team is based in Grove, Oxfordshire.</Employerdescription>
      <Employerwebsite>https://careers.williamsf1.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.williamsf1.com/job/trackside-operations-lead-hospitality-in-london-jid-487</Applyto>
      <Location>Grove</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>056148f9-afd</externalid>
      <Title>AI Analyst Intern</Title>
      <Description><![CDATA[<p>We are seeking a dynamic AI Analyst to help drive AI-powered quality initiatives, establish robust data governance frameworks, and develop innovative processes that bring efficiency and increase overall data quality.</p>
<p>Your Contribution:</p>
<ul>
<li>Work with subject matter experts to drive AI technology into business processes</li>
<li>Help establish and maintain data governance programs across enterprise applications</li>
<li>Lay the foundation for data-driven decision-making utilizing AI technologies</li>
<li>Work with a team of highly talented individuals to understand and support the data needs of our business</li>
</ul>
<p>Requirements:</p>
<ul>
<li>Experience building predictive models, especially classification</li>
<li>Excellent understanding of machine learning techniques and AI</li>
<li>Expertise in SQL and Python, experience with NoSQL is a plus</li>
<li>A self-driven ownership mindset with a natural curiosity and excellence in finding solutions to ambiguous problems</li>
<li>Strong analytic skills related to working with unstructured datasets</li>
<li>Experience with EDWs or data lakes a plus</li>
<li>Experience with AWS cloud services</li>
<li>Junior/Senior pursuing a degree in Data Science/Analytics, Computer Science (focus on AI or Machine Learning), Information Systems/AI or related fields</li>
</ul>
<p>Benefits:</p>
<ul>
<li>Flexible work arrangements</li>
<li>Opportunities for professional growth and development</li>
<li>Collaborative and dynamic work environment</li>
<li>Recognition and rewards for outstanding performance</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>entry</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Predictive models, Machine learning techniques, SQL, Python, NoSQL, Data governance, Data lakes, AWS cloud services, EDWs, Data analytics, Computer science</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Logitech</Employername>
      <Employerlogo>https://logos.yubhub.co/logitech.com.png</Employerlogo>
      <Employerdescription>Logitech is a multinational company that designs and manufactures computer peripherals, software, and mobile communication products.</Employerdescription>
      <Employerwebsite>https://logitech.wd5.myworkdayjobs.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://logitech.wd5.myworkdayjobs.com/en-US/Logitech/job/Camas-Washington---USA/AI-Analyst-Intern_145578</Applyto>
      <Location>Camas, Washington</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>7697b617-216</externalid>
      <Title>Partner Development Manager, Provider Ecosystem</Title>
      <Description><![CDATA[<p><strong>Overview</strong></p>
<p>As the Manager of Ecosystem Partnerships (Provider Supply), you&#39;ll own and execute OpenRouter’s strategy for growing and managing our global network of LLM and inference providers. You will be the company’s first supply-side partnerships hire and will build the foundational programs that ensure OpenRouter has the industry&#39;s most comprehensive, reliable, and compliant model ecosystem.</p>
<p>You&#39;ll manage relationships with our highest-impact model and infrastructure partners, negotiate commercial and technical agreements, ensure billing and SLA accuracy, enforce data policy compliance, and drive the expansion of new providers and model families. This role is deeply cross-functional—working closely with Product, Engineering, Security, Legal, and Finance to make OpenRouter the platform of choice for providers looking to scale distribution and revenue.</p>
<p>You will shape the future of OpenRouter’s supply ecosystem, directly influencing product quality, reliability, economics, and overall platform competitiveness.</p>
<p><strong>What You&#39;ll Do</strong></p>
<p><strong>Grow and Prioritize the Provider Ecosystem</strong></p>
<ul>
<li>Develop and own the strategy for sourcing, prioritizing, and onboarding new model providers, inference services, and accelerator-based platforms.</li>
<li>Evaluate provider technical capabilities, performance characteristics, pricing, and strategic relevance to determine partnership fit.</li>
<li>Build a high-quality pipeline of emerging model developers, cloud inference platforms, and GPU/ASIC providers to expand OpenRouter’s supply diversity.</li>
</ul>
<p><strong>Lead Provider Relationship Management</strong></p>
<ul>
<li>Manage end-to-end relationships with key providers, including roadmap collaboration, integration planning, pricing reviews, and quarterly business reviews.</li>
<li>Partner with Engineering and Product to ensure smooth technical integration, metadata accuracy, reliability standards, and go-to-market alignment.</li>
<li>Advocate for provider needs while balancing platform-wide performance, economics, and compliance considerations.</li>
</ul>
<p><strong>Drive Provider Data Policy &amp; Security Compliance</strong></p>
<ul>
<li>Build scalable compliance programs that ensure providers meet OpenRouter’s data handling standards (no-train/no-retain, ZDR, SOC2/ISO alignment, GDPR, TPRM requirements).</li>
<li>Establish recurring compliance reviews, attestations, breach escalation pathways, and risk classifications.</li>
<li>Collaborate with Legal and Security to maintain enforceable data governance across all provider agreements.</li>
</ul>
<p><strong>Protect Revenue Through Billing Accuracy &amp; SLA Enforcement</strong></p>
<ul>
<li>Develop systems to identify billing discrepancies, token miscounts, rate-card inconsistencies, and revenue leakage.</li>
<li>Partner with Finance to maintain accurate provider rate cards, rev-share structures, and payout workflows.</li>
<li>Lead enforcement of provider SLAs, including uptime, latency, throughput, and error rate standards—ensuring credits or remedies are applied as required.</li>
<li>Build repeatable audit processes and dashboards to maintain billing integrity across the provider ecosystem.</li>
</ul>
<p><strong>Negotiate Strategic Commercial &amp; Technical Agreements</strong></p>
<ul>
<li>Lead negotiation of complex provider partnerships involving pricing, consumption economics, data privacy, reliability commitments, and integration requirements.</li>
<li>Structure agreements that balance mutual value creation while protecting OpenRouter’s reliability, margin, and compliance obligations.</li>
<li>Build scalable onboarding documentation and processes to reduce friction for future provider integrations.</li>
</ul>
<p><strong>Cross-Functional Leadership &amp; Ecosystem Strategy</strong></p>
<ul>
<li>Collaborate with Product and Engineering to influence provider-facing roadmap decisions, model routing logic, and supply expansion.</li>
<li>Work with Growth, Support, and Marketing to communicate provider changes, outages, launches, and performance improvements.</li>
<li>Maintain deep visibility into the AI/ML and inference ecosystem—emerging models, hardware accelerators, competitive pricing dynamics, and market shifts.</li>
</ul>
<p><strong>Qualifications</strong></p>
<ul>
<li>8+ years in partnerships, business development, or strategic alliances, with 3–5+ years leading supply-side or ecosystem partnerships programs.</li>
<li>Strong understanding of technical integrations, including API patterns, authentication, routing, data flows, rate limiting, and reliability metrics.</li>
<li>Expertise in negotiating agreements involving data governance, SLAs, token or usage-based pricing, compliance requirements, and integration commitments.</li>
<li>Highly analytical—comfortable digging into billing logs, token counts, performance metrics, and error patterns to identify discrepancies or opportunities.</li>
<li>Strategic thinker with the ability to operate from 0 to 1 while simultaneously managing day-to-day partner execution.</li>
<li>Excellent communicator capable of influencing senior technical and commercial stakeholders.</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Partnerships, Business Development, Strategic Alliances, Technical Integrations, API Patterns, Authentication, Routing, Data Flows, Rate Limiting, Reliability Metrics, Data Governance, SLAs, Token or Usage-Based Pricing, Compliance Requirements, Integration Commitments, Analytical Skills, Strategic Thinking, Communication Skills, Influencing Senior Technical and Commercial Stakeholders</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>OpenRouter</Employername>
      <Employerlogo>https://logos.yubhub.co/openrouter.com.png</Employerlogo>
      <Employerdescription>OpenRouter is a platform that enables the distribution and revenue growth of artificial intelligence and machine learning models. The company is focused on building a comprehensive and reliable model ecosystem.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openrouter/3da7bf92-e45a-4957-9942-9142c89265ce</Applyto>
      <Location>Remote (US)</Location>
      <Country></Country>
      <Postedate>2026-03-09</Postedate>
    </job>
    <job>
      <externalid>015e5c6d-a31</externalid>
      <Title>Senior Data Engineer</Title>
      <Description><![CDATA[<p><strong>Why Valvoline Global Operations?</strong></p>
<p>At Valvoline Global Operations, we&#39;re proud to be The Original Motor Oil, but we&#39;ve never rested on being first. Founded in 1866, we introduced the world&#39;s first branded motor oil, staking our claim as a pioneer in the automotive and industrial solutions industry.</p>
<p><strong>Job Purpose</strong></p>
<p>We are seeking a highly skilled and motivated Data Engineer to join our growing data and analytics team. The ideal candidate will have strong experience designing and developing scalable data pipelines, integrating complex systems, and optimizing data workflows. Proficiency in Databricks and SAP Datasphere is preferred, as these platforms are central to our data ecosystem.</p>
<p><strong>How You Make an Impact (Job Accountabilities)</strong></p>
<ul>
<li>Design, build, and maintain robust, scalable, and high-performance data pipelines using Databricks and SAP Datasphere.</li>
<li>Collaborate with data architects, analysts, data scientists, and business stakeholders to gather requirements and deliver data solutions aligned with stakeholders&#39; goals.</li>
<li>Integrate diverse data sources (e.g., SAP, APIs, flat files, cloud storage) into the enterprise data platforms</li>
<li>Ensure high standards of data quality and implement data governance practices</li>
<li>Stay current with emerging trends and technologies in cloud computing, big data, and data engineering</li>
<li>Provide ongoing support for the platform, troubleshoot any issues that arise, and ensure high availability and reliability of data infrastructure.</li>
<li>Create documentation for the platform infrastructure and processes, and train other team members and users to use the platform effectively.</li>
</ul>
<p><strong>What You Bring to the Role (Job Qualifications / Education / Skills / Requirements / Capabilities)</strong></p>
<ul>
<li>Bachelor&#39;s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.</li>
<li>5-7+ years of experience in a data engineering or related role.</li>
<li>Strong knowledge of data engineering principles, data warehousing concepts, and modern data architecture.</li>
<li>Proficiency in SQL and at least one programming language (e.g., Python, Scala).</li>
<li>Experience with cloud platforms (e.g., Azure, AWS, or GCP), particularly in data services.</li>
<li>Familiarity with data orchestration tools (e.g., PySpark, Airflow, Azure Data Factory) and CI/CD pipelines.</li>
</ul>
<p><strong>Competencies Desired</strong></p>
<ul>
<li>Hands-on experience with Databricks (including Spark/PySpark, Delta Lake, MLflow, Unity Catalog, etc.).</li>
<li>Practical experience working with SAP Datasphere (or SAP Data Warehouse Cloud) in data modeling and data integration scenarios.</li>
<li>SAP BW or SAP HANA experience is a plus.</li>
<li>Experience with BI tools like Power BI or Tableau.</li>
<li>Understanding of data governance frameworks and data security best practices.</li>
<li>Exposure to data lakehouse architecture and real-time streaming data pipelines.</li>
<li>Certifications in Databricks, SAP, or cloud platforms are advantageous.</li>
</ul>
<p><strong>Working Conditions / Physical Requirements / Travel Requirements</strong></p>
<ul>
<li>Normal office environment</li>
<li>Prolonged periods of computer use and frequent participation in meetings</li>
<li>Occasional walking, standing, and light lifting (up to 10 lbs)</li>
<li>Minimal travel required</li>
</ul>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>remote</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>data engineering, Databricks, SAP Datasphere, SQL, Python, Scala, cloud platforms, data orchestration tools, CI/CD pipelines, SAP BW, SAP HANA, Power BI, Tableau, data governance frameworks, data security best practices, data lakehouse architecture, real-time streaming data pipelines</Skills>
      <Category>Engineering</Category>
      <Industry>Automotive</Industry>
      <Employername>Valvoline Global Operations</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.valvolineglobal.com.png</Employerlogo>
      <Employerdescription>Valvoline Global Operations is a global company that develops future-ready products and provides best-in-class services for the automotive and industrial solutions industry.</Employerdescription>
      <Employerwebsite>https://jobs.valvolineglobal.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.valvolineglobal.com/job/Senior-Data-Engineer/1316654400/</Applyto>
      <Location></Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>1e66d068-858</externalid>
      <Title>Strategic Finance, Hardware R&amp;D Finance Manager</Title>
      <Description><![CDATA[<p><strong>Compensation</strong></p>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p><strong>About the Team</strong></p>
<p>The Strategic Finance team provides financial insights and guidance to support the organization&#39;s long-term goals and strategies. We partner across the business to allocate and deploy our resources for the highest impact outcomes.</p>
<p><strong>About the Role</strong></p>
<p>We are hiring a Hardware Finance Manager, R&amp;D to own the financial strategy, investment modeling, and long-range planning for our hardware R&amp;D programs. This role sits at the intersection of Hardware Engineering, Product, and Finance, and is responsible for ensuring that R&amp;D roadmap decisions—spanning technical scope, sequencing, resourcing, and timelines—are grounded in rigorous financial analysis and disciplined capital allocation.</p>
<p>This role will serve as the embedded finance partner to Hardware Engineering and Product teams, with end-to-end ownership of R&amp;D investment models, program-level financials, and long-range planning across the hardware development lifecycle. You will help leadership evaluate tradeoffs across technical ambition, speed, risk, and capital efficiency as hardware programs scale in scope and complexity.</p>
<p>We have a strong preference for candidates who can be based in our San Francisco HQ. We use a hybrid work model of 3 days in the office per week and offer relocation assistance to new employees.</p>
<p><strong>In this role, you will oversee multiple areas of responsibility:</strong></p>
<ul>
<li>Own end-to-end hardware R&amp;D financial models across the development lifecycle, from early concept and prototyping through development, validation, and transition to production.</li>
<li>Serve as a key finance partner to Hardware Engineering and Product leadership, supporting decisions on roadmap sequencing, technical scope, resourcing levels, and investment pacing through clear financial frameworks and scenario analysis.</li>
<li>Translate technical roadmaps into financially grounded R&amp;D execution plans, integrating assumptions around headcount ramps, staffing mix, tooling, lab and test infrastructure, and development timelines.</li>
<li>Drive capital discipline and investment efficiency, proactively identifying scope changes, resourcing inefficiencies, and investment risks before spend becomes structurally locked in.</li>
<li>Own R&amp;D forecasting, budgeting, and variance analysis, providing clear visibility into spend vs. plan, key drivers of change, and implications for broader hardware investment priorities.</li>
<li>Frame program-level tradeoffs and decision scenarios for leadership, clearly quantifying implications across cost, schedule, technical risk, and long-term platform value.</li>
<li>Build and maintain standardized R&amp;D investment dashboards and program views to provide consistent, executive-ready visibility into burn rates, milestone progress, and capital allocation.</li>
<li>Support portfolio-level decision-making by comparing investment profiles across hardware programs and generations, informing prioritization, sequencing, and long-term R&amp;D strategy.</li>
<li>Contribute to the development and scaling of the hardware R&amp;D finance foundation, improving modeling rigor, governance, and decision support as the hardware portfolio grows.</li>
</ul>
<p><strong>You might thrive in this role if you have:</strong></p>
<ul>
<li>8+ years of progressive finance experience with significant exposure to hardware, manufacturing, or complex supply chain businesses.</li>
<li>A passion for helping build world-class finance teams and driving business and financial outcomes, as measured by margin improvement, working capital efficiency, forecast accuracy, and execution of cost-reduction initiatives.</li>
<li>A strong ability to critically evaluate opportunities and risks.</li>
<li>Expert modeling skills with best-in-class attention to detail and unwavering commitment to accuracy.</li>
<li>Exemplary ability to distill complex financial information into actionable insights.</li>
<li>Excellent communication skills and storytelling ability when presenting data insights.</li>
<li>Strong enthusiasm for building the human-computer interface for the AI era.</li>
</ul>
<p><strong>About OpenAI</strong></p>
<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full spectrum of humanity.</p>
<p>We are an equal opportunity employer, and we do not discriminate on the basis of race, religion, color, national origin, sex, sexual orientation, age, veteran status, disability, genetic information, or other applicable legally protected characteristic.</p>
<p>For additional information, please see <a href="https://cdn.openai.com/policies/eeo-policy-statement.pdf">OpenAI’s Affirmative Action and Equal Employment Opportunity Policy Statement</a>.</p>
<p>Background checks for applicants will be administered in accordance with applicable law, and qualified applicants with arrest or conviction records will be considered for employment consistent with those laws, including the San Francisco Fair Chance Ordinance, the Los Angeles County Fair Chance Ordinance for Employers, and the California Fair Chance Act, for US-based candidates. For unincorporated Los Angeles County workers: we reasonably believe that criminal history may have a direct, adverse and negative relationship with the following job duties, potentially resulting in the withdrawal of a conditional offer of employment: protect computer hardware entrusted to you from theft, loss or damage; return all computer hardware in your possession (including the data contained therein) upon termination of employment or end of assignment; and maintain the confidentiality of proprietary, confidential, and non-public information. In addition, job duties require access to secure and protected information.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$234K – $325K</Salaryrange>
      <Skills>financial modeling, investment analysis, capital allocation, financial planning, forecasting, budgeting, variance analysis, financial reporting, financial analysis, data analysis, data visualization, financial software, Microsoft Excel, financial planning and analysis, financial modeling and analysis, financial reporting and analysis, data science, machine learning, artificial intelligence, cloud computing, data engineering, data architecture, data governance, data quality, data security, data analytics</Skills>
      <Category>Finance</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/9548776c-d623-4a63-af2c-c3a4a1d9685f</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>c29dbc40-9e1</externalid>
      <Title>Advertising Marketing Science Lead</Title>
      <Description><![CDATA[<p><strong>Compensation</strong></p>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p><strong>About the team</strong></p>
<p>OpenAI’s mission is to ensure the responsible and widespread adoption of artificial intelligence. In support of that mission, the Marketing team helps deeply understand customer audiences and market dynamics, influence the development of the right products, build sustainable and customer-aligned monetization models, and drive awareness, adoption, and usage across OpenAI’s products and platform.</p>
<p><strong>About the role</strong></p>
<p>We’re looking for an <strong>Advertising Marketing Science</strong> leader to establish and scale OpenAI’s advertiser-facing reporting, measurement, and attribution credibility. You’ll combine deep measurement expertise with strong judgment and cross-functional leadership to define how advertisers understand performance on OpenAI and how our reporting aligns with their existing measurement frameworks (MTA, incrementality/lift testing, MMM/geo experimentation).</p>
<p>This role will start as a hands-on individual contributor responsible for building the methodological foundations of OpenAI’s advertising measurement system. Over time, you will define the strategy, operating model, and team needed to scale this function globally as advertiser adoption grows.</p>
<p>This role is ideal for someone who enjoys building new capabilities from first principles, can translate complex causal measurement approaches into trusted industry narratives, and is energized by partnering across Product, Engineering, Sales, Partnerships and Legal to build a privacy-first measurement ecosystem.</p>
<p><strong>In this role, you will:</strong></p>
<ul>
<li><strong>Define OpenAI’s advertiser measurement strategy</strong>, establishing how our reporting aligns with attribution (MTA), incrementality/lift testing, MMM, geo experimentation, and partner measurement frameworks.</li>
<li><strong>Build the foundation of OpenAI’s Marketing Science function</strong>, initially leading work as an individual contributor while designing the long-term team structure, operating model, and measurement programs.</li>
<li><strong>Lead advertiser-facing measurement discussions</strong>, representing OpenAI in executive briefings, measurement escalations, and industry conversations while building trust in our methodologies and reporting.</li>
<li><strong>Develop clear advertiser narratives</strong> that translate causal inference, attribution models, and statistical methodologies into understandable guidance for campaign optimization and investment decisions.</li>
<li><strong>Design and govern OpenAI’s advertising measurement program</strong>, including standardized experiment patterns (A/B, geo, quasi-experimental), power calculators, diagnostics, and experiment-quality guardrails.</li>
<li><strong>Build scalable measurement frameworks</strong> that reconcile results across MMM, MTA, and lift testing, helping advertisers triangulate OpenAI performance within their broader marketing measurement systems.</li>
<li><strong>Establish privacy-centric measurement approaches as needed</strong>, including aggregated measurement and conversion modeling in partnership with Legal and Privacy teams.</li>
<li><strong>Translate measurement strategy into product capabilities</strong>, partnering with Product and Engineering to operationalize methodologies into durable measurement tools and reporting infrastructure.</li>
<li><strong>Shape OpenAI’s external measurement ecosystem</strong>, working with third-party measurement partners, clean-room providers, and industry groups to align standards and reduce friction for advertisers.</li>
</ul>
<p><strong>You might thrive in this role if you:</strong></p>
<ul>
<li>Have <strong>deep expertise in advertising measurement</strong> including experimentation, incrementality testing, attribution modeling, and econometric approaches such as MMM.</li>
<li>Have <strong>experience designing and scaling lift or incrementality programs</strong>, including governance, experimentation frameworks, and statistical quality standards.</li>
<li>Are comfortable acting as a <strong>senior external measurement authority</strong>, confidently leading advertiser conversations, navigating discrepancies, and building trust with sophisticated marketing organizations.</li>
<li>Can <strong>translate complex statistical concepts into practical decision frameworks</strong> for both technical and non-technical audiences.</li>
<li>Can <strong>build functions from the ground up</strong>, setting strategy while also executing hands-on during early stages of team development.</li>
<li>Have successfully partnered with <strong>Product and Engineering teams to translate measurement science into scalable product capabilities</strong>.</li>
<li>Have experience working with <strong>third-party measurement providers or industry standards organizations</strong>.</li>
<li>Thrive in <strong>fast-paced, high-ambiguity environments</strong> and are comfortable leading cross-company initiatives without direct authority.</li>
<li>Care deeply about building <strong>trusted, privacy-forward measurement systems</strong> that enable long-term advertiser confidence.</li>
</ul>
<p><strong>About OpenAI</strong></p>
<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full spectrum of humanity.</p>
<p>We are an equal opportunity employer, and we do not discriminate on the basis of race, religion, color, national origin, sex, sexual orientation, age, veteran status, disability, genetic information, or other applicable legally protected characteristic.</p>
<p>For additional information, please see <a href="https://cdn.openai.com/policies/eeo-policy-statement.pdf">OpenAI’s Affirmative Action and Equal Employment Opportunity Policy Statement</a>.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$284K – $415K</Salaryrange>
      <Skills>Advertising measurement, Experimentation, Incrementality testing, Attribution modeling, Econometric approaches, Statistical quality standards, Measurement frameworks, Data analysis, Data visualization, Communication skills, Leadership skills, Collaboration skills, Data science, Machine learning, Statistics, Mathematics, Computer programming, Data engineering, Cloud computing, Big data, Data governance, Data security</Skills>
      <Category>Marketing</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. The company pushes the boundaries of the capabilities of AI systems and seeks to safely deploy them to the world through its products.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/5547d275-f123-46e1-8695-71fd79a05724</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>a412ed16-e60</externalid>
      <Title>Head of Partnerships</Title>
      <Description><![CDATA[<p><strong>Compensation</strong></p>
<p>$160K – $200K • 0.01% – 0.1% • Offers Commission</p>
<p><strong>Head of Partnerships</strong></p>
<p>You&#39;ll own the strategic partnerships that unlock access to the world&#39;s most valuable data for Firecrawl — negotiating licensing agreements, building relationships with major content and platform companies, and opening doors that don&#39;t have a &quot;Contact Sales&quot; button. This is a business development role at its core, but the deals you close will shape what Firecrawl can offer and how the product evolves.</p>
<p><strong>Salary Range:</strong> $180,000–$220,000/year OTE (Range shown is for U.S.-based employees in San Francisco, CA. Compensation outside the U.S. is adjusted fairly based on your country&#39;s cost of living. You can explore how we calculate this here: <a href="https://www.firecrawl.dev/careers/compensation">https://www.firecrawl.dev/careers/compensation</a>.)</p>
<p><strong>Equity Range:</strong> Up to 0.10%</p>
<p><strong>Location:</strong> San Francisco, CA (Hybrid; remote-friendly when needed)</p>
<p><strong>Job Type:</strong> Full-Time (SF)</p>
<p><strong>Experience:</strong> 5+ years in partnerships, BD, or licensing — ideally at a major tech or platform company</p>
<p><strong>Visa:</strong> US Citizenship/Visa required for SF</p>
<p><strong>About Firecrawl</strong></p>
<p>Firecrawl is the easiest way to extract data from the web. Developers use us to reliably convert URLs into LLM-ready markdown or structured data with a single API call.</p>
<p><strong>What You&#39;ll Do</strong></p>
<ul>
<li>Identify, pursue, and close strategic partnerships with major content owners, media companies, and platform businesses</li>
<li>Negotiate data licensing and access agreements that expand what Firecrawl can offer its customers</li>
<li>Build and maintain relationships with senior decision-makers at large organizations — legal, BD, product, and policy teams</li>
<li>Develop the partnership strategy: who to go after, in what order, and why — then execute it</li>
<li>Navigate complex deal structures involving legal, compliance, and data governance stakeholders</li>
<li>Work closely with product and engineering to understand what data access unlocks and how partnerships translate into product value</li>
<li>Represent Firecrawl in rooms where AI infrastructure and data access are being shaped</li>
</ul>
<p><strong>What We&#39;re Looking For</strong></p>
<p><strong>A proven dealmaker in tech.</strong> You&#39;ve closed complex, high-stakes partnerships — not just warm-intro handshakes, but real agreements with legal review, commercial terms, and multiple stakeholders. You know how big organizations actually make decisions and you have the patience and persistence to navigate that.</p>
<p><strong>Deep relationships in the right rooms.</strong> You have an existing network across major tech companies, media organizations, or platform businesses. You don&#39;t need to cold-start every conversation — you know people, and people take your calls.</p>
<p><strong>Fluent in the AI landscape.</strong> You understand why data access matters in the age of LLMs. You can speak credibly about how AI companies use web data, what licensing means in this context, and why Firecrawl&#39;s position is unique. You don&#39;t need to write code, but you need to hold your own in technical conversations.</p>
<p><strong>Comfortable selling a fast-moving startup to slow-moving enterprises.</strong> You&#39;ve bridged the gap between startup speed and enterprise process before. You know how to create urgency without burning relationships and how to keep momentum through long deal cycles.</p>
<p><strong>High agency with executive presence.</strong> You represent Firecrawl in rooms with senior leaders at large companies. You&#39;re polished enough to earn trust and scrappy enough to follow up relentlessly until the deal is done.</p>
<p><strong>Backgrounds that often do well:</strong> BD or partnerships leads at major platform, social media, or tech companies. People who&#39;ve negotiated content or data licensing deals. Senior BD at AI companies who&#39;ve worked the data access side. Ex-founders who&#39;ve closed enterprise partnerships through sheer force of will.</p>
<p><strong>What We&#39;re NOT Looking For</strong></p>
<p><strong>Partnership managers who manage, not close.</strong> If your experience is maintaining existing partner relationships and running QBRs, this isn&#39;t the role. You&#39;ll be sourcing and closing from scratch.</p>
<p><strong>AI tourists.</strong> If you can&#39;t explain why data licensing matters for LLMs or how web data powers AI applications, you&#39;ll struggle to have credible conversations with partners. You need genuine understanding, not talking points.</p>
<p><strong>People who need a big brand behind them to open doors.</strong> You&#39;ll be representing a startup, not a household name. If your dealmaking depends on the logo on your business card rather than your own credibility and hustle, this won&#39;t be a fit.</p>
<p><strong>A Note On Pace</strong></p>
<p>We operate at an absurd level of urgency because the window for what we&#39;re building won&#39;t stay open forever. If that excites you, keep reading. If it doesn&#39;t, no hard feelings — but this role probably isn&#39;t for you.</p>
<p><strong>Benefits &amp; Perks</strong></p>
<p><strong>Available to all employees</strong></p>
<ul>
<li><strong>Salary that makes sense</strong> — $180,000–220,000/year OTE (U.S.-based), based on impact, not tenure</li>
<li><strong>Own a piece</strong> — Up to 0.10% equity in what you&#39;re helping build</li>
<li><strong>Generous PTO</strong> — 15 days mandatory; anything after 24 days, just ask (holidays excluded); take the time you need to recharge</li>
<li><strong>Parental leave</strong> — 12 weeks fully paid, for moms and dads</li>
<li><strong>Wellness stipend</strong> — $100/month for the gym, therapy, massages, or whatever keeps you human</li>
<li><strong>Learning &amp; Development</strong> — Expense up to $1000/year toward anything that helps you grow professionally</li>
<li><strong>Team offsites</strong> — A change of scenery, minus the trust falls</li>
<li><strong>Sabbatical</strong> — 3 paid months off after 4 years, do something fun and new</li>
</ul>
<p><strong>Available to US-based full-time employees</strong></p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$180,000–$220,000/year OTE</Salaryrange>
      <Skills>Partnerships, Business Development, Licensing, Data Access, AI Infrastructure, Data Governance, Legal, Compliance, Product Management, Engineering, Fluent in the AI landscape, Deep relationships in the right rooms, Proven dealmaker in tech, High agency with executive presence</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Firecrawl</Employername>
      <Employerlogo>https://logos.yubhub.co/firecrawl.dev.png</Employerlogo>
      <Employerdescription>Firecrawl is a small, fast-moving, technical team building essential infrastructure that super-intelligence will use to gather data on the web. They&apos;ve hit 8 figures in ARR and 80k+ GitHub stars in just a year.</Employerdescription>
      <Employerwebsite>https://www.firecrawl.dev/careers/compensation</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/firecrawl/8064002a-3398-4eb3-a7b8-f5ebf5beebf8</Applyto>
      <Location>San Francisco, CA (Hybrid)</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>b2545e93-ab4</externalid>
      <Title>Enterprise Account Executive, Financial Services &amp; Insurance</Title>
      <Description><![CDATA[<p>As an Enterprise Account Executive, Financial Services &amp; Insurance based in Tokyo, you&#39;ll drive the adoption of safe, frontier AI technology across Japan&#39;s financial services and insurance organisations. You&#39;ll own the full sales cycle from prospecting to closing, working with senior leaders to help them modernise operations, strengthen risk management, and enhance customer experiences with Anthropic&#39;s AI solutions.</p>
<p>Responsibilities:</p>
<ul>
<li>Own and exceed revenue targets by winning new major FSI accounts and expanding existing relationships across Japan&#39;s banks, insurers, asset managers, and fintech companies</li>
<li>Build and execute territory plans, identifying high-value opportunities across megabanks, regional banks, life and non-life insurers, and diversified financial groups</li>
<li>Lead consultative sales processes with senior executives — including CIOs, CDOs, Chief Risk Officers, and compliance leaders — positioning AI as a driver for digital transformation, regulatory compliance, and improved customer experience</li>
<li>Orchestrate internal teams (Product, Engineering, Applied AI, Partnerships) to deliver solutions that meet the stringent regulatory, security, and data governance requirements of the FSI sector</li>
</ul>
<p>You may be a good fit if you have:</p>
<ul>
<li>8+ years of enterprise sales experience in Japan with significant exposure to financial services and insurance accounts, driving adoption of emerging technologies with a consultative, solutions-oriented approach</li>
<li>A track record of managing complex, long-cycle sales processes and securing strategic deals with FSI organisations by understanding sector-specific regulatory and compliance frameworks and crafting compliant solutions</li>
<li>Demonstrated ability to navigate FSI stakeholder ecosystems, building consensus across IT, risk, compliance, and business operations teams with multiple budget authorities and approval layers</li>
<li>Proven experience exceeding revenue targets by effectively managing pipeline across the varying budget cycles and fiscal timelines of large financial institutions</li>
<li>A knack for bringing order to chaos and an enthusiastic &#39;roll up your sleeves&#39; mentality — you are a true team player who thrives in ambiguous, startup-like environments</li>
<li>A strategic, analytical approach to identifying opportunities within Japan&#39;s FSI sector combined with patient, relationship-focused execution</li>
<li>A passion for and/or experience with advanced AI systems. You feel strongly about ensuring frontier AI systems are developed safely and responsibly for broad benefit</li>
<li>Excellent communication skills in both Japanese and English</li>
</ul>
<p>What We Offer:</p>
<ul>
<li>Competitive base salary and commission structure commensurate with experience</li>
<li>Equity participation</li>
<li>Comprehensive benefits package</li>
<li>Hybrid work model with flexibility</li>
<li>Access to cutting-edge AI technology and world-class research team</li>
</ul>
<p>Logistics:</p>
<p>Education requirements: We require at least a Bachelor&#39;s degree in a related field or equivalent experience.</p>
<p>Location-based hybrid policy: Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</p>
<p>Visa sponsorship: We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</p>
<p>We encourage you to apply even if you do not believe you meet every single qualification. Not all strong candidates will meet every single qualification as listed. Research shows that people who identify as being from underrepresented groups are more prone to experiencing imposter syndrome and doubting the strength of their candidacy, so we urge you not to exclude yourself prematurely and to submit an application if you&#39;re interested in this work.</p>
<p>Your safety matters to us. To protect yourself from potential scams, remember that Anthropic recruiters only contact you from @anthropic.com email addresses. In some cases, we may partner with vetted recruiting agencies who will identify themselves as working on behalf of Anthropic. Be cautious of emails from other domains. Legitimate Anthropic recruiters will never ask for money, fees, or banking information before your first day. If you&#39;re ever unsure about a communication, don&#39;t click any links—visit anthropic.com/careers directly for confirmed position openings.</p>
<p>How we&#39;re different:</p>
<p>We believe that the highest-impact AI research will be big science. At Anthropic we work as a single cohesive team on just a few large-scale research efforts. And we value impact — advancing our long-term goals of steerable, trustworthy AI — rather than work on smaller and more specific puzzles. We view AI research as an empirical science, which has as much in common with physics and biology as with traditional efforts in computer science. We&#39;re an extremely collaborative group, and we host frequent research discussions to ensure that we are pursuing the highest-impact work at any given time. As such, we greatly value communication skills.</p>
<p>The easiest way to understand our research directions is to read our recent research.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>executive</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Enterprise sales experience, Financial services and insurance accounts, Emerging technologies, Consultative, solutions-oriented approach, Complex, long-cycle sales processes, Sector-specific regulatory and compliance frameworks, Compliant solutions, Strategic, analytical approach, Patient, relationship-focused execution, Advanced AI systems, Excellent communication skills in Japanese and English, Digital transformation, Regulatory compliance, Improved customer experience, Risk management, Data governance, Pipeline management, Budget cycles, Fiscal timelines, Team player, Ambiguous, startup-like environments</Skills>
      <Category>Sales</Category>
      <Industry>Finance</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic&apos;s mission is to create reliable, interpretable, and steerable AI systems. The company is a quickly growing group of committed researchers, engineers, policy experts, and business leaders working together to build beneficial AI systems.</Employerdescription>
      <Employerwebsite>https://job-boards.greenhouse.io</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5104753008</Applyto>
      <Location>Tokyo, Japan</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>1ace7478-7a2</externalid>
      <Title>Staff+ Software Engineer, Data Infrastructure</Title>
      <Description><![CDATA[<p><strong>About the role</strong></p>
<p>Data Infrastructure designs, operates, and scales secure, privacy-respecting systems that power data-driven decisions across Anthropic. Our mission is to provide data processing, storage, and access that are trusted, fast, and easy to use.</p>
<p>We&#39;re looking for infrastructure engineers who thrive working at the intersection of data systems, security, and scalability. You&#39;ll tackle diverse challenges ranging from building financial reporting pipelines to architecting access control systems to ensuring cloud storage reliability. This role offers the opportunity to work directly with data scientists, analysts, and business stakeholders while diving deep into cloud infrastructure primitives.</p>
<p><strong>Responsibilities:</strong></p>
<p>Within Data Infra, you may be matched to critical business areas including:</p>
<ul>
<li><strong>Data Governance &amp; Access Control:</strong> Design and implement robust access control systems ensuring only authorized users can access sensitive data. Build infrastructure for permission management, audit logging, and compliance requirements. Work on IAM policies, ACLs, and security controls that scale across thousands of users and systems.</li>
<li><strong>Financial Data Infrastructure:</strong> Build and maintain data pipelines and warehouses powering business-critical reporting. Ensure data integrity, accuracy, and availability for complex financial systems, including third party revenue ingestion pipelines; manage the external relationships as needed to drive upstream dependencies. Own the reliability of systems processing revenue, usage, and business metrics.</li>
<li><strong>Cloud Storage &amp; Reliability:</strong> Architect disaster recovery, backup, and replication systems for petabyte-scale data. Ensure high availability and durability of data stored in cloud object storage (GCS, S3). Build systems that protect against data loss and enable rapid recovery.</li>
<li><strong>Data Platform &amp; Tooling:</strong> Scale data processing infrastructure using technologies like BigQuery, BigTable, Airflow, dbt, and Spark. Optimize query performance, manage costs, and enable self-service analytics across the organization.</li>
</ul>
<p><strong>You might be a good fit if you:</strong></p>
<ul>
<li>Have 10+ years (not including internships or co-ops) of experience in a Software Engineer role, building data infrastructure, storage systems, or related distributed systems</li>
<li>Have 3+ years (not including internships or co-ops) of experience leading large scale, complex projects or teams as an engineer or tech lead</li>
<li>Can set technical direction for a team, not just execute within it</li>
<li>Have deep experience with at least one of:
<ul>
<li>Strong proficiency in programming languages like Python, Go, Java, or similar</li>
<li>Experience with infrastructure-as-code (Terraform, Pulumi) and cloud platforms (GCP, AWS)</li>
</ul>
</li>
</ul>
<p><strong>Strong candidates may also have:</strong></p>
<ul>
<li>Background in data warehousing, ETL/ELT pipelines, or analytics infrastructure</li>
<li>Experience with Kubernetes, containerization, and cloud-native architectures</li>
<li>Track record of improving data reliability, availability, or cost efficiency at scale</li>
<li>Knowledge of column-oriented databases, OLAP systems, or big data processing frameworks</li>
<li>Experience working in fintech, financial services, or highly regulated environments</li>
<li>Security engineering background with focus on data protection and access controls</li>
</ul>
<p><strong>Technologies We Use:</strong></p>
<ul>
<li>Data: BigQuery, BigTable, Airflow, Cloud Composer, dbt, Spark, Segment, Fivetran</li>
<li>Storage: GCS, S3</li>
<li>Infrastructure: Terraform, Kubernetes, GCP, AWS</li>
<li>Languages: Python, Go, SQL</li>
</ul>
<p><strong>Logistics</strong></p>
<p><strong>Education requirements:</strong> We require at least a Bachelor&#39;s degree in a related field or equivalent experience.</p>
<p><strong>Location-based hybrid policy:</strong> Currently, we expect all staff to be in one of our offices at least 25% of the time. However, some roles may require more time in our offices.</p>
<p><strong>Visa sponsorship:</strong> We do sponsor visas! However, we aren&#39;t able to successfully sponsor visas for every role and every candidate. But if we make you an offer, we will make every reasonable effort to get you a visa, and we retain an immigration lawyer to help with this.</p>
<p><strong>We encourage you to apply even if you do not believe you meet every single qualification.</strong> Not all strong candidates will meet every single qualification as listed. Research shows that people who identify as being from underrepresented groups are more prone to experiencing imposter syndrome and doubting the strength of their candidacy, so we urge you not to exclude yourself prematurely and to submit an application if you&#39;re interested in this work.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$405,000 - $485,000 USD</Salaryrange>
      <Skills>Python, Go, Java, Terraform, Pulumi, GCP, AWS, BigQuery, BigTable, Airflow, dbt, Spark, Segment, Fivetran, GCS, S3, Kubernetes, containerization, cloud-native architectures, data warehousing, ETL/ELT pipelines, analytics infrastructure, column-oriented databases, OLAP systems, big data processing frameworks, fintech, financial services, highly regulated environments, security engineering, data protection, access controls, data governance, cloud storage, reliability, data platform, tooling, self-service analytics, data processing infrastructure, query performance, cost management</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Anthropic</Employername>
      <Employerlogo>https://logos.yubhub.co/anthropic.com.png</Employerlogo>
      <Employerdescription>Anthropic&apos;s mission is to create reliable, interpretable, and steerable AI systems. The company is a quickly growing group of committed researchers, engineers, policy experts, and business leaders working together to build beneficial AI systems.</Employerdescription>
      <Employerwebsite>https://job-boards.greenhouse.io</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://job-boards.greenhouse.io/anthropic/jobs/5114768008</Applyto>
      <Location>San Francisco, CA | Seattle, WA</Location>
      <Country></Country>
      <Postedate>2026-03-08</Postedate>
    </job>
    <job>
      <externalid>cbf4a173-e70</externalid>
      <Title>Data Scientist</Title>
      <Description><![CDATA[<p>We are seeking a highly skilled Data Scientist to join our team. As a Data Scientist, you will play a key role in analysing and interpreting complex data sets to inform our racing strategy and improve our performance on the track.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Work closely with our racing team to understand their needs and develop data-driven solutions to improve their performance</li>
<li>Develop and maintain complex data models and algorithms to analyse and interpret large data sets</li>
<li>Collaborate with our data engineering team to design and implement data pipelines and architectures</li>
<li>Communicate complex technical information to non-technical stakeholders, including our racing team and senior management</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>A strong background in data science, including experience with machine learning and statistical modelling</li>
<li>Proficiency in programming languages such as Python and R</li>
<li>Experience with data visualisation tools such as Tableau and Power BI</li>
<li>Strong communication and interpersonal skills</li>
</ul>
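<p>To give a flavour of the statistical modelling involved, an ordinary least squares fit can be written in a few lines of plain Python. The fuel-load figures below are invented for illustration, not real telemetry:</p>

```python
def fit_line(xs, ys):
    """Ordinary least squares fit of y = a*x + b (closed form)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Sums of squares and cross-products about the means.
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

# Hypothetical data: fuel load (kg) vs. lap-time penalty (s).
fuel = [10, 20, 30, 40, 50]
penalty = [0.31, 0.62, 0.89, 1.22, 1.48]
slope, intercept = fit_line(fuel, penalty)
```

<p>In practice this would be done with libraries such as NumPy or statsmodels, but the closed form makes the underlying model explicit.</p>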
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunity to work with a highly skilled and experienced team</li>
<li>Access to cutting-edge technology and resources</li>
<li>Flexible working hours and remote working options</li>
<li>Regular social events and team-building activities</li>
</ul>
<p><strong>Preferred Qualifications</strong></p>
<ul>
<li>Experience working in a Formula One team or a similar high-performance environment</li>
<li>Knowledge of racing strategy and tactics</li>
<li>Experience with data visualisation tools such as Matplotlib and Seaborn</li>
<li>Strong understanding of data governance and data quality principles</li>
</ul>
<p>If you are a highly motivated and skilled Data Scientist looking for a new challenge, please apply now.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>Competitive salary and benefits package</Salaryrange>
      <Skills>Python, R, Machine learning, Statistical modelling, Data visualisation, Tableau, Power BI, Matplotlib, Seaborn, Data governance, Data quality</Skills>
      <Category>Engineering</Category>
      <Industry>Motorsport</Industry>
      <Employername>Williams Racing</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.williamsf1.com.png</Employerlogo>
      <Employerdescription>Williams Racing is a British Formula One racing team that has been competing in the sport since 1977. The team is based in Grove, Oxfordshire, and has a strong reputation for innovation and technical excellence.</Employerdescription>
      <Employerwebsite>https://careers.williamsf1.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.williamsf1.com/job/ai-delivery-manager-in-grove-wantage-jid-514</Applyto>
      <Location>Grove, Oxfordshire</Location>
      <Country></Country>
      <Postedate>2026-03-07</Postedate>
    </job>
    <job>
      <externalid>6e6d3e44-db8</externalid>
      <Title>Data Scientist</Title>
      <Description><![CDATA[<p>We are seeking a highly skilled Data Scientist to join our team. As a Data Scientist, you will play a key role in analysing and interpreting complex data to inform business decisions and drive performance improvement.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Work closely with the data engineering team to design and implement data pipelines and data warehousing solutions</li>
<li>Develop and maintain data visualisation tools and reports to support business decision-making</li>
<li>Collaborate with cross-functional teams to identify business opportunities and develop data-driven solutions</li>
<li>Conduct statistical analysis and machine learning modelling to inform business decisions</li>
<li>Develop and maintain data quality and governance processes to ensure data accuracy and integrity</li>
</ul>
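<p>The data-quality processes mentioned above can start as simply as validating required fields and plausible value ranges. A minimal sketch, using hypothetical field names rather than the team's actual tooling:</p>

```python
def check_quality(records, required, ranges):
    """Validate records; return a list of (row_index, field, problem)."""
    issues = []
    for i, rec in enumerate(records):
        # Required fields must be present and non-empty.
        for field in required:
            if rec.get(field) in (None, ""):
                issues.append((i, field, "missing"))
        # Numeric fields must fall inside a plausible range.
        for field, (lo, hi) in ranges.items():
            val = rec.get(field)
            if val is not None and not (lo <= val <= hi):
                issues.append((i, field, "out of range"))
    return issues

rows = [
    {"driver": "A", "lap_time_s": 91.2},
    {"driver": "", "lap_time_s": 12.0},   # missing name, implausible lap time
]
problems = check_quality(rows, required=["driver"], ranges={"lap_time_s": (60, 200)})
```

<p>Production frameworks (e.g. Great Expectations or dbt tests) generalise exactly this pattern into declarative, versioned checks.</p>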
<p><strong>Requirements</strong></p>
<ul>
<li>Bachelor&#39;s degree in a quantitative field such as mathematics, statistics, or computer science</li>
<li>Proven experience in data analysis and machine learning</li>
<li>Strong programming skills in languages such as Python or R</li>
<li>Experience with data visualisation tools such as Tableau or Power BI</li>
<li>Excellent communication and interpersonal skills</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary and benefits package</li>
<li>Opportunity to work with a leading Formula One team</li>
<li>Collaborative and dynamic work environment</li>
<li>Professional development and training opportunities</li>
<li>Access to state-of-the-art technology and tools</li>
<li>Flexible working hours and remote working options</li>
<li>Annual bonus scheme</li>
<li>25 days&#39; annual leave</li>
<li>Pension scheme</li>
<li>Free on-site parking and meals</li>
<li>Access to on-site gym and fitness classes</li>
<li>Discounts on team merchandise and hospitality events</li>
</ul>
<p><strong>Preferred Qualifications</strong></p>
<ul>
<li>Master&#39;s degree in a quantitative field</li>
<li>Experience with cloud-based data platforms such as AWS or GCP</li>
<li>Experience with big data technologies such as Hadoop or Spark</li>
<li>Certification in data science or machine learning</li>
<li>Experience with data governance and quality processes</li>
</ul>
<p>If you are a motivated and talented Data Scientist looking for a new challenge, please submit your application. We look forward to hearing from you.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>Competitive salary and benefits package</Salaryrange>
      <Skills>Python, R, Tableau, Power BI, AWS, GCP, Hadoop, Spark, Data visualisation, Machine learning, Data governance, Data quality, Cloud-based data platforms, Big data technologies, Certification in data science or machine learning</Skills>
      <Category>Engineering</Category>
      <Industry>Motorsport</Industry>
      <Employername>Williams Racing</Employername>
      <Employerlogo>https://logos.yubhub.co/careers.williamsf1.com.png</Employerlogo>
      <Employerdescription>Williams Racing is a British Formula One racing team with a rich history in the sport. The team has been competing in F1 since 1977 and has won nine constructors&apos; championships.</Employerdescription>
      <Employerwebsite>https://careers.williamsf1.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://careers.williamsf1.com/job/test-and-validation-senior-test-engineer-in-grove-wantage-jid-395</Applyto>
      <Location>Grove, Oxfordshire</Location>
      <Country></Country>
      <Postedate>2026-03-07</Postedate>
    </job>
    <job>
      <externalid>aa015612-5ff</externalid>
      <Title>Product &amp; Solutions Lead, Safety and Security</Title>
      <Description><![CDATA[<p><strong>Job Posting</strong></p>
<p><strong>Product &amp; Solutions Lead, Safety and Security</strong></p>
<p><strong>Location</strong></p>
<p>San Francisco</p>
<p><strong>Employment Type</strong></p>
<p>Full time</p>
<p><strong>Department</strong></p>
<p>Intelligence &amp; Investigations</p>
<p><strong>Compensation</strong></p>
<ul>
<li>$288K – $425K • Offers Equity</li>
</ul>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p>More details about our benefits are available to candidates during the hiring process.</p>
<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components at any time based on individual performance, team or company results, or market conditions.</p>
<p><strong>About the Team</strong></p>
<p>The Intelligence &amp; Investigations (I2) team detects and disrupts abuse and strategic risks so people can use AI safely. We translate real-world signals, investigations, and external threat intelligence into practical mitigations, operating guidance, and partner-ready support that improves safety outcomes across the AI ecosystem.</p>
<p><strong>About the Role</strong></p>
<p>As a Product &amp; Solutions Lead focused on safety and security, you will build and operate 0–1 products, services, and technical solution packages that help developers and public institutions move from experimentation to durable, trusted outcomes—while maintaining public safety, transparency, and respect for privacy and rights.</p>
<p>This role balances two modes of delivery:</p>
<ol>
<li>Bespoke products and technical solutions for strategic internal and external partners, and</li>
<li>Scalable product and solution packages that can be reused broadly across partners and deployments.</li>
</ol>
<p>Training is a component of scale, but not the center of gravity. You will also ship reference implementations, playbooks, evaluation kits, and repeatable operating models that partners can adopt and operate.</p>
<p>You will work directly with engineers and a multidisciplinary group of safety and geopolitical analysts, and data and quantitative scientists to convert complex, evolving challenges into solutions that teams can adopt in high-stakes environments.</p>
<p>This role is based in San Francisco, CA (hybrid, 3 days/week). Relocation support is available.</p>
<p><strong>In this role, you will:</strong></p>
<ul>
<li>Own the 0–1 roadmap for safety and security solution offerings: define the target users, problem statements, tools, operating models, success metrics, and the set of reusable deliverables we ship.</li>
<li>Design and ship bespoke technical solutions for priority partners (internal and external), then abstract what works into reusable patterns and toolkits.</li>
<li>Build partner-ready technical artifacts: solution blueprints, reference architectures, evaluation and monitoring guidance, incident/response playbooks, and deployment checklists.</li>
<li>Package open-source and proprietary capabilities into adoption-ready solutions (e.g., reference implementations, configuration patterns, validated workflows).</li>
<li>Maintain a consistent delivery model across engagements: intake, scoping, governance alignment, execution cadence, and retrospectives that improve the offering over time.</li>
<li>Translate evolving threats into actionable guidance and updates for solution packages (e.g., scams/fraud patterns, cyber-enabled threats, ecosystem abuse trends).</li>
<li>Develop lightweight enablement components as needed: targeted technical modules, hands-on labs, and readiness assessments that accelerate adoption of the solutions.</li>
<li>Define and instrument impact measurement: adoption milestones, readiness indicators, reliability and safety posture improvements, and partner satisfaction with outputs.</li>
<li>Partner closely across engineering, safety, geopolitical analysis, and quantitative teams to ensure solutions are technically credible, threat-informed, and measurable.</li>
<li>Communicate crisply and in decision-ready form to internal and external stakeholders: progress, trade-offs, risks, and recommendations.</li>
</ul>
<p><strong>You might thrive in this role if you:</strong></p>
<ul>
<li>Have 6+ years in product, technical program leadership, solutions, or platform operations, especially in safety, security, risk, integrity, or enterprise/public-sector contexts.</li>
<li>Have built 0–1 solution offerings (product plus services or productized services): taking ambiguous needs, shipping something concrete, then scaling it into a repeatable model.</li>
<li>Have a builder’s mindset: comfortable incubating early-stage ideas, testing them with partners, and evolving them into durable, repeatable safety and security solutions.</li>
<li>Can go deep with engineers and still produce partner-ready artifacts that are clear</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$288K – $425K</Salaryrange>
      <Skills>product leadership, technical program leadership, solutions, platform operations, safety, security, risk, integrity, enterprise/public-sector contexts, product development, solution development, technical writing, communication, project management, team leadership, collaboration, problem-solving, analytical skills, data analysis, data visualization, machine learning, artificial intelligence, cybersecurity, threat intelligence, incident response, compliance, regulatory affairs, cloud computing, containerization, DevOps, agile development, scrum, kanban, continuous integration, continuous deployment, continuous testing, test automation, security testing, penetration testing, vulnerability assessment, compliance testing, regulatory testing, data protection, information security, cybersecurity frameworks, risk management, compliance management, regulatory compliance, data governance, information governance, data quality, data integrity, data validation, data verification, data certification, data assurance, data security, data encryption, data masking, data tokenization, data anonymization, data pseudonymization, data aggregation, data fusion, data integration, data warehousing, data mart, data lake, data catalog</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is a technology company that focuses on developing and applying artificial intelligence in a way that benefits humanity. It was founded in 2015 and has since grown to become one of the leading AI research and development companies in the world.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/c664cc09-d996-450c-8683-ad591ac27c11</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>f7c3a63e-aaa</externalid>
      <Title>Product Marketing Lead, Privacy</Title>
      <Description><![CDATA[<p><strong>Job Posting</strong></p>
<p><strong>Product Marketing Lead, Privacy</strong></p>
<p><strong>Location</strong></p>
<p>San Francisco</p>
<p><strong>Employment Type</strong></p>
<p>Full time</p>
<p><strong>Department</strong></p>
<p>Marketing</p>
<p><strong>Compensation</strong></p>
<p>$284K – $415K</p>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<p><strong>Benefits</strong></p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p>More details about our benefits are available to candidates during the hiring process.</p>
<p><strong>About the team</strong></p>
<p>OpenAI’s mission is to ensure the responsible and widespread adoption of artificial intelligence. In support of that mission, the Marketing team helps deeply understand customer audiences and market dynamics, influence the development of the right products, build sustainable and customer-aligned monetization models, and drive awareness, adoption, and usage across OpenAI’s products and platform.</p>
<p>We take a data-driven approach to understand markets, develop monetization strategies, and uncover customer needs that shape product strategy and messaging. We partner closely with Sales, Partnerships, Product, Engineering, Research, Comms, and Design to deliver a cohesive end-to-end customer experience and lead go-to-market efforts for new product launches across channels.</p>
<p><strong>About the role</strong></p>
<p>We’re looking for a Privacy Product Marketer to drive company-wide privacy narratives, strategy, and execution across our entire product portfolio. You’ll combine strong judgment with customer empathy and cross-functional leadership to translate privacy principles, user needs, and regulatory expectations into clear product requirements, crisp messaging, and scalable go-to-market readiness. This role is ideal for someone who thrives in ambiguity, enjoys influencing and aligning organizations and is energized by partnering deeply across teams to deliver trustworthy products.</p>
<p><strong>Responsibilities</strong></p>
<ul>
<li>Lead discovery to understand user expectations, enterprise requirements, and market dynamics related to privacy, trust, and data stewardship</li>
<li>Translate privacy insights into clear product requirements and programs in close partnership with Product, Engineering, Legal, and Privacy/Policy stakeholders</li>
<li>Partner with Design, Data Science, and Analytics to define success metrics for privacy experiences (e.g., comprehension, control adoption, incident reduction, support burden) and create measurement plans</li>
<li>Identify privacy risk areas across the product lifecycle (data collection, retention, training use, access controls, sharing, third parties) and drive cross-functional mitigation plans</li>
<li>Build clear, durable narratives and materials that explain privacy choices simply (e.g., product messaging, help center content, FAQs, internal enablement, exec-ready updates)</li>
<li>Coordinate readiness across teams (Product, Legal, Comms, Support, Sales) so launches are accurate, consistent, and scalable</li>
<li>Develop and maintain a privacy positioning framework across products—what we do, why it matters, and how we differentiate—while staying grounded in real product capabilities</li>
<li>Continuously evaluate and improve existing privacy experiences to increase clarity, trust, and user adoption of controls</li>
</ul>
<p><strong>Requirements</strong></p>
<ul>
<li>Have strong structured problem-solving skills—taking ambiguous privacy/trust problems, breaking them into answerable parts, and driving execution through to implementation</li>
<li>Can connect user expectations and regulatory/partner requirements to product strategy, influencing prioritization across Product/Engineering/Design</li>
<li>Have experience with privacy-adjacent domains (privacy UX, trust &amp; safety, security, data governance, compliance, or responsible AI) and enjoy navigating complexity</li>
<li>Communicate clearly and persuasively—turning nuanced policy/technical concepts into crisp user value and straightforward guidance</li>
<li>Are fluent in qualitative and quantitative approaches (user research, surveys, funnel analysis, experimentation, support signals) and can turn insights into decisions</li>
<li>Thrive in fast-paced, high-ambiguity environments and can manage multiple cross-company workstreams with strong judgment</li>
<li>Are highly collaborative, proactive, and comfortable leading cross-functional efforts without direct authority</li>
<li>Care deeply about building products that users can understand, control, and trust</li>
</ul>
<p><strong>Required Skills</strong></p>
<ul>
<li>Privacy UX</li>
<li>Trust &amp; safety</li>
<li>Security</li>
<li>Data governance</li>
<li>Compliance</li>
<li>Responsible AI</li>
<li>User research</li>
<li>Surveys</li>
<li>Funnel analysis</li>
<li>Experimentation</li>
<li>Support signals</li>
<li>Cross-functional leadership</li>
<li>Strong judgment</li>
<li>Customer empathy</li>
<li>Data-driven approach</li>
<li>Product strategy</li>
<li>Messaging</li>
<li>Scalable go-to-market readiness</li>
</ul>
<p><strong>Preferred Skills</strong></p>
<ul>
<li>Product management</li>
<li>Engineering</li>
<li>Design</li>
<li>Data science</li>
<li>Analytics</li>
<li>Marketing</li>
<li>Sales</li>
<li>Partnerships</li>
<li>Research</li>
<li>Comms</li>
<li>Legal</li>
<li>Policy</li>
<li>Privacy</li>
<li>Trust</li>
<li>Safety</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$284K – $415K</Salaryrange>
      <Skills>privacy UX, trust &amp; safety, security, data governance, compliance, responsible AI, user research, surveys, funnel analysis, experimentation, support signals, cross-functional leadership, strong judgment, customer empathy, data-driven approach, product strategy, messaging, scalable go-to-market readiness, product management, engineering, design, data science, analytics, marketing, sales, partnerships, research, comms, legal, policy, privacy, trust, safety</Skills>
      <Category>marketing</Category>
      <Industry>technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is a technology company that focuses on developing and applying artificial intelligence in a responsible and beneficial way. It is a privately held company with a large team of engineers and researchers.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/1e87291e-5322-461d-ae29-5a4945066315</Applyto>
      <Location>San Francisco</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>d92c7cad-e5d</externalid>
      <Title>Software Engineer, Codex for Teams</Title>
      <Description><![CDATA[<p><strong>Software Engineer, Codex for Teams</strong></p>
<p><strong>About the Team</strong></p>
<p>With Codex we’re building an AI software engineer. One that you can pair with, delegate to, or even ask to take on future tasks proactively. Our team is a fast-moving group within OpenAI, bringing together research, engineering, design, and product. We iteratively build the Codex agent harness and product to get the most out of the model, and we iteratively train the model to be great at complex software engineering tasks.</p>
<p><strong>About the Role</strong></p>
<p>This role is for a software engineer working on Codex, with a specific focus on enabling team-scale adoption across a wide spectrum of environments, from internal teams at OpenAI to external customers ranging from startups to large enterprises. You’ll work directly with customers, Go To Market (GTM) teams, and other engineers and researchers across Codex and OpenAI. You will turn diverse team requirements into products that scale across organizations. The role bridges what teams need with Codex’s capabilities, ensuring that deployments are robust, repeatable, and deeply aligned with how developers work in real-world environments. You’ll own systems end-to-end (architecture, implementation, production operations) with a strong bias for both quality and velocity.</p>
<p><strong>In this role, you will:</strong></p>
<ul>
<li>Ship fundamental capabilities including analytics dashboards and APIs, compliance and audit surfaces, workspace RBAC and admin controls, managed configuration and constraints, and rate limits, usage, and pricing primitives for teams.</li>
<li>Design and build robust, full-stack services and APIs that power Codex across surfaces (web/app, CLI/local, IDEs, CI/CD) with strong observability, reliability, and security.</li>
<li>Enable standardized team deployments by building team configuration packaging and distribution patterns that make it easy to roll out consistent experiences across workspaces.</li>
<li>Integrate with enterprise identity and governance systems (e.g., SSO/SAML/OIDC, SCIM, RBAC, policy enforcement), and build data-access patterns that are secure, performant, and compliant for large customers.</li>
<li>Partner with GTM to work hands-on with teams through deep engagements to accelerate adoption, iterate rapidly, and translate real-world feedback into scalable product and platform improvements.</li>
</ul>
<p><strong>You might thrive in this role if you:</strong></p>
<ul>
<li>Have strong software engineering fundamentals and experience turning ideas into productionized systems, thinking holistically about speed, performance, and user experience.</li>
<li>Are proficient in one or more backend languages (e.g., Python, Go, Rust) and distributed systems concepts, with a focus on reliability, observability, and security.</li>
<li>Enjoy building cross-cutting platform capabilities that unlock product velocity, and you’re comfortable working across services, APIs, and end-user product surfaces.</li>
<li>Have experience with team/enterprise foundations such as identity and access (SAML/OIDC), SCIM, RBAC, audit/compliance logging, policy enforcement, and data governance controls.</li>
<li>Have built developer tools and workflows (CLI/IDE/SDK), automation systems (triggers/scheduling), or integration platforms that connect products to a broader ecosystem of tools.</li>
<li>Like working directly with users/customers (or alongside GTM/solutions teams), and can translate messy, diverse requirements into opinionated implementations that scale across many teams.</li>
<li>Enjoy 0 -&gt; 1 environments, can navigate ambiguity, and bring crisp product thinking to technical trade-offs.</li>
</ul>
<p><strong>About OpenAI</strong></p>
<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full spectrum of humanity.</p>
<p style="margin-top:24px;font-size:13px;color:#666;">XML job scraping automation by <a href="https://yubhub.co">YubHub</a></p>]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>mid</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>Python, Go, Rust, Distributed systems concepts, Reliability, Observability, Security, Backend languages, APIs, Services, End-user product surfaces, Team/enterprise foundations, Identity and access, SCIM, RBAC, Audit/compliance logging, Policy enforcement, Data governance controls, Developer tools and workflows, Automation systems, Integration platforms, Cross-cutting platform capabilities, Product velocity</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/c1c8b058-2f0d-4192-8a9e-c21d0f24952c</Applyto>
      <Location>London, UK</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>ec06a431-7fa</externalid>
      <Title>Software Engineer - Privacy &amp; Compliance</Title>
      <Description><![CDATA[<p><strong>Software Engineer - Privacy &amp; Compliance</strong></p>
<p><strong>Location</strong></p>
<p>San Francisco; Seattle</p>
<p><strong>Employment Type</strong></p>
<p>Full time</p>
<p><strong>Department</strong></p>
<p>Applied AI</p>
<p><strong>Compensation</strong></p>
<ul>
<li>$230K – $385K • Offers Equity</li>
</ul>
<p>The base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. If the role is non-exempt, overtime pay will be provided consistent with applicable laws. In addition to the salary range listed above, total compensation also includes generous equity, performance-related bonus(es) for eligible employees, and the following benefits.</p>
<ul>
<li>Medical, dental, and vision insurance for you and your family, with employer contributions to Health Savings Accounts</li>
<li>Pre-tax accounts for Health FSA, Dependent Care FSA, and commuter expenses (parking and transit)</li>
<li>401(k) retirement plan with employer match</li>
<li>Paid parental leave (up to 24 weeks for birth parents and 20 weeks for non-birthing parents), plus paid medical and caregiver leave (up to 8 weeks)</li>
<li>Paid time off: flexible PTO for exempt employees and up to 15 days annually for non-exempt employees</li>
<li>13+ paid company holidays, and multiple paid coordinated company office closures throughout the year for focus and recharge, plus paid sick or safe time (1 hour per 30 hours worked, or more, as required by applicable state or local law)</li>
<li>Mental health and wellness support</li>
<li>Employer-paid basic life and disability coverage</li>
<li>Annual learning and development stipend to fuel your professional growth</li>
<li>Daily meals in our offices, and meal delivery credits as eligible</li>
<li>Relocation support for eligible employees</li>
<li>Additional taxable fringe benefits, such as charitable donation matching and wellness stipends, may also be provided.</li>
</ul>
<p>More details about our benefits are available to candidates during the hiring process.</p>
<p>This role is at-will and OpenAI reserves the right to modify base pay and other compensation components at any time based on individual performance, team or company results, or market conditions.</p>
<p>We’re looking for a <strong>Software Engineer</strong> to architect and build backend systems that enforce data privacy and automate compliance at scale. You’ll work closely with product, infrastructure, security, and legal teams to embed privacy-by-design into our data and access layers.</p>
<p>This is a hands-on, high-impact role for an experienced engineer who is passionate about protecting user data while enabling innovation.</p>
<p><strong>What You’ll Do</strong></p>
<ul>
<li>Design, build, and operate backend services that enforce policy-driven data access, lifecycle controls, and privacy protections.</li>
<li>Develop distributed authorization and identity-aware enforcement mechanisms integrated directly into data services and control planes.</li>
<li>Implement auditability, policy hooks, and enforcement observability to ensure compliance is continuously verifiable.</li>
<li>Partner with Security, Legal, and Compliance to convert privacy requirements into scalable technical designs and developer-friendly APIs.</li>
<li>Harden data platforms and backend services through schema-level controls and data handling constraints by default.</li>
<li>Collaborate with infrastructure teams to ensure consistent enforcement across systems while minimizing duplicated implementations.</li>
<li>Contribute patterns, libraries, and education that elevate trustworthy data access patterns across the organization.</li>
</ul>
<p><strong>You Might Thrive in This Role If You Have</strong></p>
<ul>
<li><strong>5+ years of industry experience</strong> building and operating backend or infrastructure systems in production.</li>
<li><strong>Strong software engineering fundamentals</strong>, with fluency in at least one major programming language (e.g., Python, Go, Rust, C++, Java).</li>
<li>Experience with distributed authorization, RBAC/ACL systems, encryption-based access, or policy engines.</li>
<li><strong>Familiarity with global privacy regulations</strong> and their architectural implications.</li>
<li><strong>Ability to influence and collaborate</strong> with teams across legal, compliance, product, and engineering.</li>
<li>A <strong>bias toward practical, impactful solutions</strong> that balance privacy protections with product needs.</li>
</ul>
<p><strong>Nice to Have</strong></p>
<ul>
<li>Experience with cloud platforms (e.g., Azure, AWS, GCP) and large-scale data systems.</li>
<li>Background in security engineering, privacy engineering, or data governance.</li>
<li>Experience with control-plane or metadata-driven enforcement systems.</li>
<li>Exposure to data platforms or ML infrastructure.</li>
<li>Prior experience in a regulated or highly sensitive data environment.</li>
</ul>
<p><strong>About OpenAI</strong></p>
<p>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products. AI is an extremely powerful tool that must be created with safety and human needs at its core, and to achieve our mission, we must encompass and value the many different perspectives, voices, and experiences that form the full spectrum of humanity.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$230K – $385K • Offers Equity</Salaryrange>
      <Skills>Python, Go, Rust, C++, Java, Distributed authorization, RBAC/ACL systems, Encryption-based access, Policy engines, Global privacy regulations, Cloud platforms, Large-scale data systems, Security engineering, Privacy engineering, Data governance, Control-plane or metadata-driven enforcement systems, Data platforms, ML infrastructure</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>OpenAI</Employername>
      <Employerlogo>https://logos.yubhub.co/openai.com.png</Employerlogo>
      <Employerdescription>OpenAI is an AI research and deployment company dedicated to ensuring that general-purpose artificial intelligence benefits all of humanity. We push the boundaries of the capabilities of AI systems and seek to safely deploy them to the world through our products.</Employerdescription>
      <Employerwebsite>https://jobs.ashbyhq.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ashbyhq.com/openai/23b158fe-709e-4bf5-856c-d10953d32f60</Applyto>
      <Location>San Francisco, Seattle</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>e2dd1bdf-0b0</externalid>
      <Title>Member of Technical Staff - Data Infra - MAI Superintelligence Team</Title>
      <Description><![CDATA[<p><strong>Summary</strong></p>
<p>Microsoft AI are looking for a talented Member of Technical Staff - Data Infra - MAI Superintelligence Team at their Redmond office. This role sits at the heart of the team&#39;s data infrastructure, building the pipelines and systems that power large-scale AI model training. You&#39;ll work directly with researchers and engineers to shape how the team ingests and processes multi-modal training data.</p>
<p><strong>About the Role</strong></p>
<p>We are looking for outstanding individuals excited about contributing to the next generation of systems that will transform the field. In particular, we are looking for candidates who:</p>
<ul>
<li>Are passionate about the role of data in large-scale AI model training</li>
<li>Will thrive in a highly collaborative, fast-paced environment</li>
<li>Have a high degree of expertise and pay close attention to details</li>
<li>Demonstrate a proactive attitude and enthusiasm for exploring new methods and technologies</li>
<li>Effectively manage multiple responsibilities and can adjust to shifting priorities</li>
</ul>
<p><strong>Accountabilities</strong></p>
<ul>
<li>Design and develop data pipelines that ingest enormous amounts of multi-modal training data (text, audio, images, video).</li>
<li>Own and maintain critical data infrastructure, including Spark, Ray, vector databases, and others.</li>
</ul>
<p><strong>The Candidate we&#39;re looking for</strong></p>
<p><strong>Experience:</strong></p>
<ul>
<li>Master&#39;s Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 6+ years experience in business analytics, data science, software development, data modeling, or data engineering OR Bachelor&#39;s Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 8+ years experience in business analytics, data science, software development, data modeling, or data engineering OR equivalent experience.</li>
</ul>
<p><strong>Technical skills:</strong></p>
<ul>
<li>4+ years experience with data governance, data compliance and/or data security.</li>
</ul>
<p><strong>Personal attributes:</strong></p>
<ul>
<li>A high degree of expertise and close attention to detail.</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary range: $163,000 - $296,400 per year.</li>
<li>Benefits and other compensation.</li>
<li>Flexible work arrangements, including remote and hybrid options.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange>$163,000 - $296,400 per year</Salaryrange>
      <Skills>data governance, data compliance, data security, data engineering, data science</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Microsoft AI</Employername>
      <Employerlogo>https://logos.yubhub.co/microsoft.ai.png</Employerlogo>
      <Employerdescription>Microsoft AI is a leading technology company that aims to empower every person and every organization on the planet to achieve more. They are on a mission to create the largest and most advanced multimodal dataset in the world, which will power the training of the world&apos;s most capable AI frontier models.</Employerdescription>
      <Employerwebsite>https://microsoft.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://microsoft.ai/job/member-of-technical-staff-data-infra-mai-superintelligence-team-2/</Applyto>
      <Location>Redmond</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>375d722d-93c</externalid>
      <Title>Sr. Sales Operations Business Analyst</Title>
      <Description><![CDATA[<p><strong>Summary</strong></p>
<p>Microsoft are looking for a talented Sr. Sales Operations Business Analyst at their Redmond office. This role sits at the heart of executive decision-making, turning complex executive-level questions into insights that shape business decisions and inform what the organization does next.</p>
<p><strong>About the Role</strong></p>
<p>As a Senior Sales Operations Business Analyst in Microsoft Advertising, you will partner closely with managers and executive leaders to analyze complex business questions, deliver insights, and inform critical decisions. You will shape analyses end to end, synthesizing data, developing clear recommendations, and helping leaders determine next steps that drive organizational outcomes.</p>
<p><strong>Accountabilities</strong></p>
<ul>
<li>Build a deep understanding of the business across multiple domains to anticipate business problems and identify analytical solutions.</li>
<li>Translate evolving business needs into clear data requirements; determine new data sources needed and proactively address data gaps or quality issues.</li>
</ul>
<p><strong>The Candidate we&#39;re looking for</strong></p>
<p><strong>Experience:</strong></p>
<ul>
<li>Master’s Degree in Mathematics, Analytics, Engineering, Computer Science, Marketing, Business, Economics or related field AND 3+ years’ experience in data analysis and reporting, business intelligence, or business and financial analysis OR Bachelor’s Degree in Statistics, Finance, Mathematics, Analytics, Engineering, Computer Science, Marketing, Business, Economics or related field AND 4+ years’ experience in data analysis and reporting, business intelligence, or business and financial analysis OR equivalent experience</li>
</ul>
<p><strong>Technical skills:</strong></p>
<ul>
<li>Strong analytical and problem-solving skills</li>
<li>Proficiency in data analysis and reporting tools such as Excel, SQL, and data visualization tools</li>
<li>Experience with data management and data governance principles</li>
</ul>
<p><strong>Personal attributes:</strong></p>
<ul>
<li>Strong communication and presentation skills</li>
<li>Ability to work effectively in a team environment</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Competitive salary</li>
<li>Comprehensive benefits package</li>
<li>Opportunities for professional growth and development</li>
<li>Collaborative and dynamic work environment</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>USD $106,400 – $203,600 per year</Salaryrange>
      <Skills>data analysis, business intelligence, data visualization, data management, data governance, SQL, Excel</Skills>
      <Category>Sales</Category>
      <Industry>Technology</Industry>
      <Employername>Microsoft</Employername>
      <Employerlogo>https://logos.yubhub.co/microsoft.ai.png</Employerlogo>
      <Employerdescription>Microsoft is a multinational technology company that develops, manufactures, licenses, and supports a wide range of software products, services, and devices. The company is headquartered in Redmond, Washington, and is one of the largest and most successful technology companies in the world.</Employerdescription>
      <Employerwebsite>https://microsoft.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://microsoft.ai/job/sr-sales-operations-business-analyst/</Applyto>
      <Location>Redmond</Location>
      <Country></Country>
      <Postedate>2026-03-06</Postedate>
    </job>
    <job>
      <externalid>18d14369-6b5</externalid>
      <Title>Member of Technical Staff - Data Infra - MAI Superintelligence Team</Title>
      <Description><![CDATA[<p><strong>Summary</strong></p>
<p>Microsoft AI are looking for a talented Member of Technical Staff - Data Infra - MAI Superintelligence Team at their Mountain View office. This role sits at the heart of the team&#39;s data infrastructure, building the pipelines and systems that power large-scale AI model training. You&#39;ll work directly with researchers and engineers to shape how the team ingests and processes multi-modal training data.</p>
<p><strong>About the Role</strong></p>
<p>We are looking for outstanding individuals excited about contributing to the next generation of systems that will transform the field. In particular, we are looking for candidates who:</p>
<ul>
<li>Are passionate about the role of data in large-scale AI model training</li>
<li>Will thrive in a highly collaborative, fast-paced environment</li>
<li>Have a high degree of expertise and pay close attention to details</li>
<li>Demonstrate a proactive attitude and enthusiasm for exploring new methods and technologies</li>
<li>Effectively manage multiple responsibilities and can adjust to shifting priorities</li>
</ul>
<p><strong>Accountabilities</strong></p>
<ul>
<li>Design and develop data pipelines that ingest enormous amounts of multi-modal training data (text, audio, images, video).</li>
<li>Own and maintain critical data infrastructure, including Spark, Ray, vector databases, and others.</li>
<li>Build and maintain cutting-edge infrastructure that can store and process the petabytes of data needed to power models.</li>
</ul>
<p><strong>The Candidate we&#39;re looking for</strong></p>
<p><strong>Experience:</strong></p>
<ul>
<li>Master&#39;s Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 6+ years experience in business analytics, data science, software development, data modeling, or data engineering OR Bachelor&#39;s Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 8+ years experience in business analytics, data science, software development, data modeling, or data engineering OR equivalent experience.</li>
</ul>
<p><strong>Technical skills:</strong></p>
<ul>
<li>4+ years experience with data governance, data compliance and/or data security.</li>
</ul>
<p><strong>Personal attributes:</strong></p>
<ul>
<li>Embody our culture and values.</li>
</ul>
<p><strong>Benefits</strong></p>
<ul>
<li>Data Engineering IC6 – The typical base pay range for this role across the U.S. is USD $163,000 – $296,400 per year.</li>
<li>There is a different range applicable to specific work locations, within the San Francisco Bay area and New York City metropolitan area, and the base pay range for this role in those locations is USD $220,800 – $331,200 per year.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>staff</Experiencelevel>
      <Workarrangement>onsite</Workarrangement>
      <Salaryrange>USD $163,000 – $296,400 per year</Salaryrange>
      <Skills>data governance, data compliance, data security, data engineering, data science</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Microsoft AI</Employername>
      <Employerlogo>https://logos.yubhub.co/microsoft.ai.png</Employerlogo>
      <Employerdescription>Microsoft AI is a leading technology company that specializes in artificial intelligence, machine learning, and data science. They are on a mission to create the largest and most advanced multimodal dataset in the world, which will power the training of the world&apos;s most capable AI frontier models. Microsoft AI is a startup-like team inside Microsoft, created to push the boundaries of AI toward Humanist Superintelligence—ultra-capable systems that remain controllable, safety-aligned, and anchored to human values.</Employerdescription>
      <Employerwebsite>https://microsoft.ai</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://microsoft.ai/job/member-of-technical-staff-data-infra-mai-superintelligence-team/</Applyto>
      <Location>Mountain View</Location>
      <Country></Country>
      <Postedate>2026-03-05</Postedate>
    </job>
    <job>
      <externalid>1a13623f-086</externalid>
      <Title>Manager / Senior Manager SAP Data Management &amp; Governance</Title>
      <Description><![CDATA[<p>As Manager / Senior Manager SAP Data Management &amp; Governance, you will play a central role in the professional, organizational, and sales development of our SAP Data Management &amp; Governance Practice. You will be responsible for complex customer projects, design SAP data strategies, and drive the expansion of our service portfolio, particularly in the areas of SAP data archiving and master data management (MDM).</p>
<p><strong>What you&#39;ll do</strong></p>
<p>As Manager / Senior Manager SAP Data Management &amp; Governance, you will be responsible for the following tasks:</p>
<ul>
<li>Project management of SAP data projects and workstreams, as well as coordination with overarching S/4HANA transformation programs</li>
<li>Implementation of complex SAP Data Management &amp; Governance initiatives at customers, including SAP Master Data Management (MDM), SAP data archiving, and optionally SAP MDG (design, implementation, rollout, optimization)</li>
<li>Consulting on best-practice data processes, governance methods, and their strategic relevance in SAP landscapes (ECC &amp; S/4HANA)</li>
<li>Development of solution concepts and creation of high-quality data strategy and governance roadmaps, including data archiving concepts</li>
<li>Conducting customer meetings, workshops, and presentations, translating complex technical topics into clear, actionable recommendations</li>
<li>Participation in business development and presales, particularly in the preparation of offers</li>
<li>Active role in the strategic development of our SAP Data Management &amp; Governance practice</li>
</ul>
<p><strong>What you need</strong></p>
<p>To be successful as (Senior) Manager SAP Data Management &amp; Governance (all genders), you will need the following qualifications:</p>
<ul>
<li>Completed studies in business administration, IT, engineering, or a related field</li>
<li>Passion for SAP consulting with a focus on data management and governance</li>
<li>Expertise in SAP Master Data Management, SAP data archiving (ILM knowledge is an asset), and SAP MDG (desirable)</li>
<li>Your work style is analytical, structured, and conceptually strong, combined with excellent communication skills</li>
</ul>
<p><strong>Why this matters</strong></p>
<p>As Manager / Senior Manager SAP Data Management &amp; Governance, you will play a key role in shaping the future of our SAP Data Management &amp; Governance practice and contributing to the success of our customers.</p>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>SAP Master Data Management, SAP data archiving, SAP MDG, ILM knowledge, SAP data governance</Skills>
      <Category>IT</Category>
      <Industry>Technology</Industry>
      <Employername>MHP - A Porsche Company</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.porsche.com.png</Employerlogo>
      <Employerdescription>MHP is a technology and business partner that digitalizes processes and products for its customers and accompanies them in their IT transformations along the entire value chain. As a digitalization pioneer in the sectors of mobility and manufacturing, MHP transfers its expertise to various industries and is the premium partner for thought leaders on the way to a better tomorrow.</Employerdescription>
      <Employerwebsite>https://jobs.porsche.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.porsche.com/index.php?ac=jobad&amp;id=13376</Applyto>
      <Location>Germany-wide</Location>
      <Country></Country>
      <Postedate>2026-02-18</Postedate>
    </job>
    <job>
      <externalid>92026dc5-6a0</externalid>
      <Title>Senior Director, Generative AI Product Management</Title>
      <Description><![CDATA[<p>Join us at Electronic Arts (EA), where our mission is to inspire the world to play. As Senior Director of Product Management AI Technologies, you will guide our global AI product strategy and collaborate with partners across engineering, data science, and operations to shape the future of interactive entertainment.</p>
<p><strong>What you&#39;ll do</strong></p>
<p>You will define and evolve enterprise-wide AI product strategies, aligning with EA&#39;s business priorities and guiding a portfolio of AI use cases such as copilots, automation, and content generation.</p>
<ul>
<li>You will establish and enforce product-level governance, experimentation guardrails, and vendor standards for security, IP, and bias mitigation in partnership with Legal, Privacy, Security, and HR.</li>
<li>You will build, mentor, and scale a senior product management team, standardizing product operating models and metrics across regions to support our diverse, global workforce.</li>
<li>You will translate business outcomes into technically actionable requirements, collaborating with engineering leadership to balance speed, reliability, cost, and technical debt for scalable AI platforms.</li>
</ul>
<p><strong>What you need</strong></p>
<ul>
<li>10+ years of product management experience, including leadership of senior PMs and management of complex, multi-product portfolios in enterprise environments.</li>
<li>Experience guiding AI/ML product development, data governance, and roadmap ownership, with measurable success in launching AI solutions for large, multinational organizations.</li>
<li>Experience partnering with Legal, Privacy, Security, and Compliance to manage AI risk and regulatory compliance, including GDPR and AI Act requirements.</li>
<li>Bachelor&#39;s degree in Computer Science, Engineering, Business, Data Science, or a related field; advanced degree welcomed.</li>
</ul>
]]></Description>
      <Jobtype>full-time</Jobtype>
      <Experiencelevel>senior</Experiencelevel>
      <Workarrangement>hybrid</Workarrangement>
      <Salaryrange></Salaryrange>
      <Skills>product management, AI/ML product development, data governance, roadmap ownership, leadership, collaboration, communication</Skills>
      <Category>Engineering</Category>
      <Industry>Technology</Industry>
      <Employername>Electronic Arts</Employername>
      <Employerlogo>https://logos.yubhub.co/jobs.ea.com.png</Employerlogo>
      <Employerdescription>Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter.</Employerdescription>
      <Employerwebsite>https://jobs.ea.com</Employerwebsite>
      <Compensationcurrency></Compensationcurrency>
      <Compensationmin></Compensationmin>
      <Compensationmax></Compensationmax>
      <Applyto>https://jobs.ea.com/en_US/careers/JobDetail/Senior-Director-Generative-AI-Product-Management/212362</Applyto>
      <Location>Austin, Texas</Location>
      <Country></Country>
      <Postedate>2026-02-05</Postedate>
    </job>
  </jobs>
</source>